WorldWideScience

Sample records for tool utilising predicted

  1. Metabolite signal identification in accurate mass metabolomics data with MZedDB, an interactive m/z annotation tool utilising predicted ionisation behaviour 'rules'

    Directory of Open Access Journals (Sweden)

    Snowdon Stuart

    2009-07-01

    Background: Metabolomics experiments using Mass Spectrometry (MS) technology measure the mass-to-charge ratio (m/z) and intensity of ionised molecules in crude extracts of complex biological samples to generate high-dimensional metabolite 'fingerprint' or metabolite 'profile' data. High resolution MS instruments perform routinely with a mass accuracy of … Results: Metabolite 'structures' harvested from publicly accessible databases were converted into a common format to generate a comprehensive archive in MZedDB. 'Rules' were derived from chemical information that allowed MZedDB to generate a list of adducts and neutral-loss fragments putatively able to form for each structure and to calculate, on the fly, the exact molecular weight of every potential ionisation product, providing targets for annotation searches based on accurate mass. We demonstrate that data matrices representing populations of ionisation products generated from different biological matrices contain a large proportion (sometimes > 50%) of molecular isotopes, salt adducts and neutral-loss fragments. Correlation analysis of ESI-MS data features confirmed the predicted relationships of m/z signals. An integrated isotope enumerator in MZedDB allowed verification of exact isotopic pattern distributions to corroborate experimental data. Conclusion: We conclude that although ultra-high accurate mass instruments provide major insight into the chemical diversity of biological extracts, facile annotation of a large proportion of signals is not possible by simple, automated query of current databases using computed molecular formulae. Parameterising MZedDB to take into account predicted ionisation behaviour and the biological source of any sample greatly improves both the frequency and accuracy of potential annotation 'hits' in ESI-MS data.
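
The adduct and neutral-loss 'rules' described above can be sketched as follows. The mass shifts are standard ESI values, but the rule list and the glucose example are small illustrations, not MZedDB's actual rule archive.

```python
# Sketch: enumerate candidate ionisation-product m/z values for a neutral
# monoisotopic mass, in the spirit of MZedDB's adduct/neutral-loss "rules".
# Only a few positive-mode rules are shown.

PROTON = 1.007276  # mass of a proton, Da

# (adduct name, charge z, mass shift applied to the neutral mass M)
POSITIVE_RULES = [
    ("[M+H]+",     1, PROTON),
    ("[M+Na]+",    1, 22.989218),            # m(Na) minus one electron
    ("[M+K]+",     1, 38.963158),            # m(K) minus one electron
    ("[M+H-H2O]+", 1, PROTON - 18.010565),   # protonation plus water loss
    ("[M+2H]2+",   2, 2 * PROTON),
]

def candidate_mz(neutral_mass, rules=POSITIVE_RULES):
    """Return {adduct_name: m/z} for each ionisation rule."""
    return {name: (neutral_mass + shift) / z for name, z, shift in rules}

mz = candidate_mz(180.063388)  # glucose, C6H12O6
```

Matching an observed accurate mass against such a table is then a simple tolerance search over every candidate m/z of every archived structure.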

  2. Assessing the utilisation of a child health monitoring tool

    African Journals Online (AJOL)

    2017-12-06

    Dec 6, 2017 ... preventive or promotive tool for monitoring child health as neither ... attitudes and practices of both CGs and HCWs relating to these components; and (iii) identify HCWs' perceptions of the barriers ... In possession of old RtHC (n=54) ... number of CGs (16.4%; 409/1646) knew that a young child should.

  3. New Tool to Predict Glaucoma

    Science.gov (United States)

    … Glaucoma can be difficult to detect and diagnose. Measurement …

  4. Impediments to Effective Utilisation of Information and Communication Technology Tools in Selected Universities in the North-Eastern Nigeria

    Science.gov (United States)

    Momoh, Mustapha

    2010-01-01

    This study examined the impediments to effective use of Information and Communication Technology (ICT) tools in Nigerian universities. A series of studies on the factors militating against computerisation indicated that there were impediments to effective utilisation of ICT tools in most developing countries. In the light of this, the…

  5. Explaining low uptake for Down syndrome screening in the Netherlands : (and predicting utilisation of other programmes)

    NARCIS (Netherlands)

    Crombag, NMTH

    2016-01-01

    In the Netherlands, only a quarter of all pregnant women take part in the current Down syndrome screening (DSS) programme. Compared to other Northern European countries, Dutch uptake rates are very low. This thesis concentrates on the test-utilisation of DSS, in particular the factors impeding or

  6. Behavior Prediction Tools Strengthen Nanoelectronics

    Science.gov (United States)

    2013-01-01

    Several years ago, NASA started making plans to send robots to explore the deep, dark craters on the Moon. As part of these plans, NASA needed modeling tools to help engineer unique electronics to withstand extremely cold temperatures. According to Jonathan Pellish, a flight systems test engineer at Goddard Space Flight Center, "An instrument sitting in a shadowed crater on one of the Moon's poles would hover around 43 K", that is, 43 kelvin, equivalent to -382 °F. Such frigid temperatures are one of the main factors that make the extreme space environments encountered on the Moon and elsewhere so challenging. Radiation is another main concern. "Radiation is always present in the space environment," says Pellish. "Small to moderate solar energetic particle events happen regularly and extreme events happen less than a handful of times throughout the 7 active years of the 11-year solar cycle." Radiation can corrupt data, propagate to other systems, require component power cycling, and cause a host of other harmful effects. In order to explore places like the Moon, Jupiter, Saturn, Venus, and Mars, NASA must use electronic communication devices like transmitters and receivers and data collection devices like infrared cameras that can resist the effects of extreme temperature and radiation; otherwise, the electronics would not be reliable for the duration of the mission.

  7. Predictive Data Tools Find Uses in Schools

    Science.gov (United States)

    Sparks, Sarah D.

    2011-01-01

    The use of analytic tools to predict student performance is exploding in higher education, and experts say the tools show even more promise for K-12 schools, in everything from teacher placement to dropout prevention. Use of such statistical techniques is hindered in precollegiate schools, however, by a lack of researchers trained to help…

  8. Facial expression: An under-utilised tool for the assessment of welfare in mammals.

    Science.gov (United States)

    Descovich, Kris A; Wathan, Jennifer; Leach, Matthew C; Buchanan-Smith, Hannah M; Flecknell, Paul; Farningham, David; Vick, Sarah-Jane

    2017-01-01

    Animal welfare is a key issue for industries that use or impact upon animals. The accurate identification of welfare states is particularly relevant to the field of bioscience, where the 3Rs framework encourages refinement of experimental procedures involving animal models. The assessment and improvement of welfare states in animals depends on reliable and valid measurement tools. Behavioral measures (activity, attention, posture and vocalization) are frequently used because they are immediate and non-invasive; however, no single indicator can yield a complete picture of the internal state of an animal. Facial expressions are extensively studied in humans as a measure of psychological and emotional experiences but are infrequently used in animal studies, with the exception of emerging research on pain behavior. In this review, we discuss current evidence for facial representations of underlying affective states, and how communicative or functional expressions can be useful within welfare assessments. Validated tools for measuring facial movement are outlined, and the potential of expressions as honest signals is discussed, alongside other challenges and limitations to facial expression measurement within the context of animal welfare. We conclude that facial expression determination in animals is a useful but underutilized measure that complements existing tools in the assessment of welfare.

  9. PISCES: A Tool for Predicting Software Testability

    Science.gov (United States)

    Voas, Jeffrey M.; Miller, Keith W.; Payne, Jeffery E.

    1991-01-01

    Before a program can fail, a software fault must be executed, that execution must alter the data state, and the incorrect data state must propagate to a state that results directly in an incorrect output. This paper describes a tool called PISCES (developed by Reliable Software Technologies Corporation) for predicting the probability that faults in a particular program location will accomplish all three of these steps causing program failure. PISCES is a tool that is used during software verification and validation to predict a program's testability.
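
The three-step fault/failure chain described in the abstract (a fault must be executed, must infect the data state, and the infected state must propagate to output) can be sampled empirically. The toy program and perturbation below are illustrative stand-ins, not PISCES itself.

```python
import random

# Sketch of the execute/infect/propagate chain behind PISCES: estimate, by
# random sampling of inputs, the probability that an altered data state at
# one program location changes the program's output.

def program(x, state_perturbation=None):
    y = x * 2                       # location of interest
    if state_perturbation is not None:
        y = state_perturbation(y)   # inject an altered data state here
    return max(y - 3, 0)            # later computation may mask the change

def propagation_estimate(trials=10000, seed=1):
    rng = random.Random(seed)
    changed = 0
    for _ in range(trials):
        x = rng.uniform(-5, 5)
        if program(x) != program(x, lambda y: y + 1.0):
            changed += 1
    return changed / trials

p = propagation_estimate()  # fraction of inputs where the fault propagates
```

A low estimate flags a location where faults are likely to hide from testing, which is exactly the sense in which such a tool predicts testability.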

  10. Daylight prediction techniques in energy design tools

    Energy Technology Data Exchange (ETDEWEB)

    Milne, M.; Zurick, J. [California Univ., Los Angeles, Dept. of Architecture, CA (United States)

    1998-09-01

    Four different whole-building energy design tool systems that calculate energy savings from daylighting and that display annual performance on an hour-by-hour basis have been tested. The nature of design tools, the sources of hourly outdoor illuminance data, the ways of predicting indoor illumination, the assumptions of each tool, and the resulting energy savings of the design tools tested are discussed. The tests were carried out with the essential criteria for evaluating whole-building daylighting and energy design tools in mind. These have been identified as user confidence, accuracy, response time, and the amount of detail. Results of the tests, all four of them run on a single elementary school classroom for the sake of comparability, are provided. 9 refs., 2 figs.

  11. A Business Analytics Software Tool for Monitoring and Predicting Radiology Throughput Performance.

    Science.gov (United States)

    Jones, Stephen; Cournane, Seán; Sheehy, Niall; Hederman, Lucy

    2016-12-01

    Business analytics (BA) is increasingly being utilised by radiology departments to analyse and present data. It encompasses statistical analysis, forecasting and predictive modelling and is used as an umbrella term for decision support and business intelligence systems. The primary aim of this study was to determine whether utilising BA technologies could contribute towards improved decision support and resource management within radiology departments. A set of information technology requirements were identified with key stakeholders, and a prototype BA software tool was designed, developed and implemented. A qualitative evaluation of the tool was carried out through a series of semi-structured interviews with key stakeholders. Feedback was collated, and emergent themes were identified. The results indicated that BA software applications can provide visibility of radiology performance data across all time horizons. The study demonstrated that the tool could potentially assist with improving operational efficiencies and management of radiology resources.

  12. Gaussian process regression for tool wear prediction

    Science.gov (United States)

    Kong, Dongdong; Chen, Yongjie; Li, Ning

    2018-05-01

    To realize and accelerate the pace of intelligent manufacturing, this paper presents a novel tool wear assessment technique based on integrated radial basis function based kernel principal component analysis (KPCA_IRBF) and Gaussian process regression (GPR) for accurately monitoring the in-process tool wear parameter (flank wear width) in real time. KPCA_IRBF is a new nonlinear dimension-increment technique, proposed here for the first time for feature fusion. The tool wear predictive value and the corresponding confidence interval are both provided by the GPR model. Moreover, GPR outperforms artificial neural networks (ANN) and support vector machines (SVM) in prediction accuracy, since Gaussian noise can be modeled quantitatively within the GPR model. However, the presence of noise seriously affects the stability of the confidence interval. In this work, the proposed KPCA_IRBF technique helps to remove noise and weaken its negative effects, so that the confidence interval is greatly compressed and smoothed, which is conducive to monitoring tool wear accurately. Moreover, the kernel parameter in KPCA_IRBF can be selected from a much larger region than in the conventional KPCA_RBF technique, which helps to improve the efficiency of model construction. Ten sets of cutting tests were conducted to validate the effectiveness of the presented tool wear assessment technique. The experimental results show that the in-process flank wear width of tool inserts can be monitored accurately by the presented technique, which is robust under a variety of cutting conditions. This study lays the foundation for tool wear monitoring in real industrial settings.
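
A minimal GPR sketch shows how such a model yields both a wear prediction and a confidence interval. The RBF kernel, fixed hyperparameters, and toy wear curve below are illustrative; the paper tunes hyperparameters and adds the KPCA_IRBF feature-fusion step, omitted here.

```python
import numpy as np

# Minimal Gaussian process regression (RBF kernel, Gaussian noise):
# posterior mean = prediction, posterior std -> confidence interval.

def rbf_kernel(A, B, length_scale=1.0, variance=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / length_scale**2)

def gpr_predict(X, y, X_star, noise=1e-2, **kw):
    """Posterior mean and standard deviation at test inputs X_star."""
    K = rbf_kernel(X, X, **kw) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, X_star, **kw)
    Kss = rbf_kernel(X_star, X_star, **kw)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(Kss) - (v ** 2).sum(0)
    return mean, np.sqrt(np.maximum(var, 0.0))

# Toy flank-wear curve: wear grows with (normalised) cutting time.
t = np.linspace(0, 1, 8)[:, None]
w = 0.3 * t.ravel() + 0.02 * np.sin(8 * t.ravel())
mean, sd = gpr_predict(t, w, np.array([[0.5]]))
lo, hi = mean - 1.96 * sd, mean + 1.96 * sd  # 95% confidence interval
```

The quantitative noise model is what separates GPR from point predictors such as ANN or SVM here: the interval width falls out of the same posterior that gives the mean.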

  13. GAPIT: genome association and prediction integrated tool.

    Science.gov (United States)

    Lipka, Alexander E; Tian, Feng; Wang, Qishan; Peiffer, Jason; Li, Meng; Bradbury, Peter J; Gore, Michael A; Buckler, Edward S; Zhang, Zhiwu

    2012-09-15

    Software programs that conduct genome-wide association studies and genomic prediction and selection need to use methodologies that maximize statistical power, provide high prediction accuracy and run in a computationally efficient manner. We developed an R package called Genome Association and Prediction Integrated Tool (GAPIT) that implements advanced statistical methods including the compressed mixed linear model (CMLM) and CMLM-based genomic prediction and selection. The GAPIT package can handle large datasets in excess of 10 000 individuals and 1 million single-nucleotide polymorphisms with minimal computational time, while providing user-friendly access and concise tables and graphs to interpret results. http://www.maizegenetics.net/GAPIT. zhiwu.zhang@cornell.edu Supplementary data are available at Bioinformatics online.
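
The basic genotype-phenotype test underlying such packages can be sketched as a single-marker regression scan. GAPIT itself fits a compressed mixed linear model that additionally corrects for population structure and kinship; the simple scan and synthetic data below illustrate only the core association step.

```python
import numpy as np

# Illustrative single-marker association scan: an OLS slope z-score per SNP.

def marker_zscores(genotypes, phenotype):
    """genotypes: (n_individuals, n_snps) coded 0/1/2; returns z-scores."""
    n, m = genotypes.shape
    y = phenotype - phenotype.mean()
    zs = np.empty(m)
    for j in range(m):
        g = genotypes[:, j] - genotypes[:, j].mean()
        beta = (g @ y) / (g @ g)                          # OLS slope
        resid = y - beta * g
        se = np.sqrt(resid @ resid / (n - 2) / (g @ g))   # slope std. error
        zs[j] = beta / se
    return zs

rng = np.random.default_rng(0)
G = rng.integers(0, 3, size=(200, 5)).astype(float)
y = 0.8 * G[:, 2] + rng.normal(size=200)   # SNP 2 is causal by construction
z = marker_zscores(G, y)
```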

  14. Web tools for predictive toxicology model building.

    Science.gov (United States)

    Jeliazkova, Nina

    2012-07-01

    The development and use of web tools in chemistry has accumulated more than 15 years of history already. Powered by the advances in the Internet technologies, the current generation of web systems are starting to expand into areas, traditional for desktop applications. The web platforms integrate data storage, cheminformatics and data analysis tools. The ease of use and the collaborative potential of the web is compelling, despite the challenges. The topic of this review is a set of recently published web tools that facilitate predictive toxicology model building. The focus is on software platforms, offering web access to chemical structure-based methods, although some of the frameworks could also provide bioinformatics or hybrid data analysis functionalities. A number of historical and current developments are cited. In order to provide comparable assessment, the following characteristics are considered: support for workflows, descriptor calculations, visualization, modeling algorithms, data management and data sharing capabilities, availability of GUI or programmatic access and implementation details. The success of the Web is largely due to its highly decentralized, yet sufficiently interoperable model for information access. The expected future convergence between cheminformatics and bioinformatics databases provides new challenges toward management and analysis of large data sets. The web tools in predictive toxicology will likely continue to evolve toward the right mix of flexibility, performance, scalability, interoperability, sets of unique features offered, friendly user interfaces, programmatic access for advanced users, platform independence, results reproducibility, curation and crowdsourcing utilities, collaborative sharing and secure access.

  15. A new methodology for predictive tool wear

    Science.gov (United States)

    Kim, Won-Sik

    … turned with various cutting conditions, and the results were compared with the proposed analytical wear models. The crater surfaces after machining were carefully studied to shed light on the physics behind crater wear. In addition, the abrasive wear mechanism plays a major role in the development of crater wear. Laser shock processing (LSP) has been applied to locally relieve the deleterious tensile residual stresses on the crater surface of a coated tool and thus improve the hardness of the coating. This thesis shows that LSP has indeed improved the wear resistance of CVD alumina-coated tool inserts, which carry residual stress due to the high processing temperature. LSP uses a very short laser pulse with high energy density, which induces high-pressure stress wave propagation; the residual stresses are relieved by the incident shock waves on the coating surface. Residual stress levels of LSP-treated CVD alumina-coated carbide inserts were evaluated by X-ray diffractometry. Based on these results, LSP parameters such as the number of laser pulses and the laser energy density can be controlled to reduce residual stress. Crater wear measurements show that wear resistance increases with LSP-treated tool inserts. Because the hardness data are used to predict the wear, the improvement in hardness and wear resistance shows that the mechanism of crater wear also involves abrasive wear.
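
The hardness-wear link invoked above is commonly modelled with the classic Archard abrasive-wear relation, V = K·F·s/H: worn volume grows with normal load F and sliding distance s, and falls as hardness H rises, which is why an LSP-induced hardness increase improves wear resistance. The coefficient K and the numbers below are illustrative, not values from the thesis.

```python
# Archard abrasive-wear law: worn volume is inversely proportional to the
# hardness of the softer surface.

def archard_wear_volume(K, load_N, sliding_m, hardness_Pa):
    """Worn volume (m^3) under the Archard abrasive-wear law."""
    return K * load_N * sliding_m / hardness_Pa

# Doubling coating hardness (e.g. via LSP treatment) halves predicted wear:
v_untreated = archard_wear_volume(1e-4, 200.0, 1000.0, 20e9)
v_lsp       = archard_wear_volume(1e-4, 200.0, 1000.0, 40e9)
```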

  16. Development of an attrition risk prediction tool.

    Science.gov (United States)

    Fowler, John; Norrie, Peter

    To review lecturers' and students' perceptions of the factors that may lead to attrition from pre-registration nursing and midwifery programmes, and to identify ways to reduce the impact of such factors on the student's experience. Comparable attrition rates for nursing and midwifery students across various universities are difficult to monitor accurately; however, estimates of a national attrition rate of approximately 25% are not uncommon. The financial and human implications of this are significant and worthy of investigation. A study was carried out in one medium-sized UK school of nursing and midwifery, aimed at identifying perceived factors associated with attrition and retention. Thirty-five lecturers were interviewed individually; 605 students completed a questionnaire, and of these, 10 were individually interviewed. Attrition data kept by the student service department were reviewed. Data were collected over an 18-month period in 2007-2008. Regression analysis of the student data identified eight significant predictors. Four of these were 'positive' factors in that they aided student retention, and four were 'negative' in that they were associated with students' thoughts of resigning. Student attrition and retention is multifactorial and, as such, needs to be managed holistically. One aspect of this management could be an attrition risk prediction tool.

  17. Determination of the predictive factors of long-lasting insecticide-treated net ownership and utilisation in the Bamenda Health District of Cameroon

    Directory of Open Access Journals (Sweden)

    Eric B. Fokam

    2017-03-01

    Background: Malaria is a serious health concern in Africa. In Cameroon, an endemic country where malaria remains a major public health problem, several control measures have been put in place, among which the use of insecticide-treated bednets (LLINs/ITNs) is considered one of the core vector control strategies. However, the greatest challenges include ownership and utilisation by individuals and households. Factors such as age, marital status, gender, education and occupation of the household head, household size, knowledge of bednets, socioeconomic status, and environmental factors have been suggested to affect bednet ownership and utilisation in different settings. The present study sought to determine bednet ownership and utilisation rates and to assess the impact of predictive factors on bednet ownership and use in the Bamenda Health District (BHD) of Cameroon. Methods: A cross-sectional study involving 384 households was conducted in six health areas in the BHD. A structured and semi-structured questionnaire was used to collect data on demographic and household characteristics as well as bednet ownership and utilisation. Descriptive statistics and bivariate and multivariate logistic regression analyses were performed. Results: The frequency of bednet ownership was relatively high (63.5%), with LLINs being most abundant (91.9%); the majority of households (87.7%) obtained their bednets during the 2011 free distribution campaign. Utilisation was relatively high (69.3%), with negligence (29.3%) and heat discomfort (26.7%) accounting most for non-usage of bednets. Children under 5 years (63%) and pregnant women (60%) most often used the nets. Households headed by a married couple, those with older household heads, households of smaller size (5–12 persons), and good knowledge of bednets had positive impacts on bednet ownership (p < 0.05). The gender of the household head (males), their educational level
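
The bivariate step in such an analysis reduces to an odds ratio from a 2x2 table relating a household factor (say, good bednet knowledge) to bednet ownership. The counts below are invented for illustration; the study fits full multivariate logistic regression models on top of such tables.

```python
import math

# Odds ratio and Woolf (log-scale) confidence interval for a 2x2 table:
# rows = exposed/unexposed, columns = owner/non-owner.

def odds_ratio(a, b, c, d):
    """OR for counts a=exp&owner, b=exp&non, c=unexp&owner, d=unexp&non."""
    return (a * d) / (b * c)

def or_confint(a, b, c, d, z=1.96):
    """Approximate 95% CI for the odds ratio (Woolf's method)."""
    log_or = math.log(odds_ratio(a, b, c, d))
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    return (math.exp(log_or - z * se), math.exp(log_or + z * se))

OR = odds_ratio(120, 40, 80, 80)       # invented counts
ci_lo, ci_hi = or_confint(120, 40, 80, 80)
```

An interval excluding 1 corresponds to the p < 0.05 significance reported for the ownership predictors.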

  18. Common features of microRNA target prediction tools

    Directory of Open Access Journals (Sweden)

    Sarah M. Peterson

    2014-02-01

    The human genome encodes over 1800 microRNAs, which are short noncoding RNA molecules that function to regulate gene expression post-transcriptionally. Due to the potential for one microRNA to target multiple gene transcripts, microRNAs are recognized as a major mechanism for regulating gene expression and mRNA translation. Computational prediction of microRNA targets is a critical initial step in identifying microRNA:mRNA target interactions for experimental validation. The available tools for microRNA target prediction encompass a range of different computational approaches, from the modeling of physical interactions to the incorporation of machine learning. This review provides an overview of the major computational approaches to microRNA target prediction. Our discussion highlights three tools for their ease of use, reliance on relatively updated versions of miRBase, and range of capabilities: DIANA-microT-CDS, miRanda-mirSVR, and TargetScan. In a comparison across all microRNA target prediction tools, four main aspects of the microRNA:mRNA target interaction emerge as common features on which most target prediction is based: seed match, conservation, free energy, and site accessibility. This review explains these features and identifies how they are incorporated into currently available target prediction tools. MicroRNA target prediction is a dynamic field with increasing attention on the development of new analysis tools. This review attempts to provide a comprehensive assessment of these tools in a manner that is accessible across disciplines. Understanding the basis of these prediction methodologies will aid in user selection of the appropriate tools and interpretation of the tool output.
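
The first of the four common features, the seed match, can be sketched directly: scan a 3' UTR for the reverse complement of the miRNA seed (nucleotides 2-8 from the 5' end). Real tools such as TargetScan layer conservation, free energy and site accessibility on top of this; the sequences below are made up for illustration.

```python
# Seed-match scan: the core feature shared by microRNA target predictors.

def revcomp(rna):
    """Reverse complement of an RNA string (A-U, G-C)."""
    return rna.translate(str.maketrans("AUGC", "UACG"))[::-1]

def seed_sites(mirna, utr, seed_len=7):
    """Start positions in `utr` matching the 7mer seed (miRNA nt 2-8)."""
    seed = mirna[1:1 + seed_len]    # nucleotides 2-8 of the mature miRNA
    target = revcomp(seed)          # the motif searched for in the mRNA
    return [i for i in range(len(utr) - seed_len + 1)
            if utr[i:i + seed_len] == target]

# Toy example: a let-7-like miRNA and a UTR with one embedded seed site.
mirna = "UGAGGUAGUAGGUUGUAUAGUU"
utr = "AAACUACCUCAAAA"
sites = seed_sites(mirna, utr)
```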

  19. Water Impact Prediction Tool for Recoverable Rockets

    Science.gov (United States)

    Rooker, William; Glaese, John; Clayton, Joe

    2011-01-01

    Reusing components from a rocket launch can be cost saving. NASA's space shuttle system has reusable components that return to the Earth and impact the ocean. A primary example is the Space Shuttle Solid Rocket Booster (SRB) that descends on parachutes to the Earth after separation and impacts the ocean. Water impact generates significant structural loads that can damage the booster, so it is important to study this event in detail in the design of the recovery system. Some recent examples of damage due to water impact include the Ares I-X First Stage deformation as seen in Figure 1 and the loss of the SpaceX Falcon 9 First Stage.To ensure that a component can be recovered or that the design of the recovery system is adequate, an adequate set of structural loads is necessary for use in failure assessments. However, this task is difficult since there are many conditions that affect how a component impacts the water and the resulting structural loading that a component sees. These conditions include the angle of impact with respect to the water, the horizontal and vertical velocities, the rotation rate, the wave height and speed, and many others. There have been attempts to simulate water impact. One approach is to analyze water impact using explicit finite element techniques such as those employed by the LS-Dyna tool [1]. Though very detailed, this approach is time consuming and would not be suitable for running Monte Carlo or optimization analyses. The purpose of this paper is to describe a multi-body simulation tool that runs quickly and that captures the environments a component might see. The simulation incorporates the air and water interaction with the component, the component dynamics (i.e. modes and mode shapes), any applicable parachutes and lines, the interaction of winds and gusts, and the wave height and speed. It is capable of quickly conducting Monte Carlo studies to better capture the environments and genetic algorithm optimizations to reproduce a

  20. Dissemination of public health information: key tools utilised by the NECOBELAC network in Europe and Latin America.

    Science.gov (United States)

    De Castro, Paola; Marsili, Daniela; Poltronieri, Elisabetta; Calderón, Carlos Agudelo

    2012-06-01

    Open Access (OA) to scientific information is an important step forward in communication patterns, yet OA principles still need reinforcement to promote a cultural change in traditional publishing practices. The advantages of free access to scientific information are even more evident in public health, where knowledge is directly associated with human wellbeing. An OA 'consolidation' initiative in public health is presented to show how the involvement of people and institutions is fundamental to creating awareness of OA and promoting cultural change. This initiative is developed within the project NEtwork of COllaboration Between Europe and Latin American Caribbean countries (NECOBELAC), financed by the European Commission. Three actions are envisaged: capacity building through a flexible and sustainable training programme on scientific writing and OA publishing; creation of training tools based on semantic web technologies; and development of a network of supporting institutions. In 2010-2011, 23 training initiatives were performed involving 856 participants from 15 countries; topic maps on scientific publication and OA were produced; and 195 institutions are included in the network. Cultural change in scientific dissemination practices is a long process requiring a flexible approach and strong commitment by all stakeholders.

  21. Prediction of Machine Tool Condition Using Support Vector Machine

    International Nuclear Information System (INIS)

    Wang Peigong; Meng Qingfeng; Zhao Jian; Li Junjie; Wang Xiufeng

    2011-01-01

    Condition monitoring and prediction for CNC machine tools are investigated in this paper. Because condition data for CNC machine tools are often available only in small numbers of samples, a condition prediction method based on support vector machines (SVMs) is proposed, and one-step and multi-step condition prediction models are constructed. The SVM prediction models are used to predict trends in the working condition of a certain type of CNC worm wheel and gear grinding machine from sequence data of the vibration signal collected during machining. The relationship between different eigenvalues of the CNC vibration signal and machining quality is also discussed. The test results show that the trend of the vibration signal's peak-to-peak value in the surface normal direction is most relevant to the trend of the surface roughness value. In trend prediction of working condition, the support vector machine achieves higher prediction accuracy in both short-term (one-step) and long-term (multi-step) prediction than the autoregressive (AR) model and the RBF neural network. Experimental results show that it is feasible to apply support vector machines to CNC machine tool condition prediction.
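
The one-step versus multi-step scheme can be sketched independently of the learner. The paper trains SVMs; here a least-squares AR(p) model stands in to show how a one-step predictor is applied recursively, feeding each prediction back in as input, to forecast several steps ahead. The vibration series is synthetic.

```python
import numpy as np

# One-step model fit + recursive multi-step forecasting.

def fit_ar(series, p=3):
    """Least-squares AR(p) coefficients for a 1-D series."""
    X = np.column_stack([series[i:len(series) - p + i] for i in range(p)])
    y = series[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def multi_step(series, coef, steps):
    """Recursive multi-step forecast: re-use each one-step prediction."""
    hist = list(series[-len(coef):])
    out = []
    for _ in range(steps):
        nxt = float(np.dot(coef, hist))
        out.append(nxt)
        hist = hist[1:] + [nxt]   # slide the window over the prediction
    return out

t = np.arange(100, dtype=float)
vib = 0.01 * t                    # slowly growing peak-to-peak value
coef = fit_ar(vib)
forecast = multi_step(vib, coef, steps=5)
```

Multi-step forecasts accumulate one-step errors, which is why the paper evaluates short-term and long-term accuracy separately.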

  22. Virtual Beach: Decision Support Tools for Beach Pathogen Prediction

    Science.gov (United States)

    The Virtual Beach Managers Tool (VB) is decision-making software developed to help local beach managers decide when beaches should be closed due to predicted high levels of waterborne pathogens. The tool is being developed under the umbrella of EPA's Advanced Monit...

  23. RNA-SSPT: RNA Secondary Structure Prediction Tools.

    Science.gov (United States)

    Ahmad, Freed; Mahboob, Shahid; Gulzar, Tahsin; Din, Salah U; Hanif, Tanzeela; Ahmad, Hifza; Afzal, Muhammad

    2013-01-01

    The prediction of RNA structure is useful for understanding evolution in both in silico and in vitro studies. Physical methods such as NMR studies to determine RNA secondary structure are expensive and difficult; computational RNA secondary structure prediction is easier. Comparative sequence analysis provides the best solution, but secondary structure prediction from a single RNA sequence is challenging. RNA-SSPT is a tool that computationally predicts the secondary structure of a single RNA sequence. Most RNA secondary structure prediction tools do not allow pseudoknots in the structure or are unable to locate them. The Nussinov dynamic programming algorithm has been implemented in RNA-SSPT. The current study requires only the energetically most favorable secondary structure, and a modification of the algorithm is also available that produces base pairs so as to lower the total free energy of the secondary structure. For visualization of RNA secondary structure, NAVIEW, written in C, is used and was modified in C# for the tool's requirements. RNA-SSPT is built in C# using .NET 2.0 in Microsoft Visual Studio 2005 Professional Edition. The accuracy of RNA-SSPT is tested in terms of sensitivity and positive predictive value. It is a tool that serves both secondary structure prediction and secondary structure visualization purposes.
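
The Nussinov dynamic program named above maximises the number of complementary base pairs in a single sequence. The basic form below ignores pseudoknots and stacking energies; RNA-SSPT's free-energy refinement and NAVIEW-based drawing are omitted.

```python
# Nussinov DP: dp[i][j] = max base pairs in seq[i..j]. Either i is unpaired,
# or i pairs with some k (Watson-Crick or G-U wobble) splitting the problem
# into two independent subintervals. min_loop enforces a minimal hairpin.

PAIRS = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"),
         ("G", "U"), ("U", "G")}

def nussinov_max_pairs(seq, min_loop=3):
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for span in range(min_loop + 1, n):
        for i in range(n - span):
            j = i + span
            best = dp[i + 1][j]                     # i left unpaired
            for k in range(i + min_loop + 1, j + 1):
                if (seq[i], seq[k]) in PAIRS:       # pair (i, k)
                    left = dp[i + 1][k - 1]
                    right = dp[k + 1][j] if k + 1 <= j else 0
                    best = max(best, left + right + 1)
            dp[i][j] = best
    return dp[0][n - 1]

pairs = nussinov_max_pairs("GGGAAAUCCC")  # simple hairpin: 3 G-C pairs
```

A traceback over the same table recovers the pairing itself; energy-based variants replace the pair count with stacking free energies.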

  24. Updating risk prediction tools: a case study in prostate cancer.

    Science.gov (United States)

    Ankerst, Donna P; Koniarski, Tim; Liang, Yuanyuan; Leach, Robin J; Feng, Ziding; Sanda, Martin G; Partin, Alan W; Chan, Daniel W; Kagan, Jacob; Sokoll, Lori; Wei, John T; Thompson, Ian M

    2012-01-01

    Online risk prediction tools for common cancers are now easily accessible and widely used by patients and doctors for informed decision-making concerning screening and diagnosis. A practical problem is that, as cancer research moves forward and new biomarkers and risk factors are discovered, the risk algorithms need to be updated to include them. Typically, the new markers and risk factors cannot be retrospectively measured on the same study participants used to develop the original prediction tool, necessitating the merging of a separate study of different participants, which may be much smaller in sample size and of a different design. Validation of the updated tool on a third independent data set is warranted before the updated tool can go online. This article reports on the application of Bayes' rule for updating risk prediction tools to include a set of biomarkers measured in a study external to the one used to develop the original tool. The procedure is illustrated in the context of updating the online Prostate Cancer Prevention Trial Risk Calculator to incorporate the new markers %freePSA and [-2]proPSA measured in an external case-control study performed in Texas, USA. Recent state-of-the-art methods for validating risk prediction tools and evaluating the improvement of updated over original tools are implemented using an external validation set provided by the U.S. Early Detection Research Network.
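
The Bayes-rule update has a simple skeleton: the existing tool supplies a prior probability, the new marker (measured in the external study) contributes a likelihood ratio, and posterior odds = prior odds × likelihood ratio. The numbers below are invented stand-ins, not the fitted %freePSA / [-2]proPSA models from the study.

```python
# Fold a new marker's likelihood ratio into an existing risk estimate.

def updated_risk(prior_prob, lr_new_marker):
    """Posterior probability after a Bayes-rule odds update."""
    prior_odds = prior_prob / (1.0 - prior_prob)
    post_odds = prior_odds * lr_new_marker
    return post_odds / (1.0 + post_odds)

# A 25% prior risk combined with a marker value 3x as likely in cases
# than in controls:
risk = updated_risk(0.25, 3.0)
```

The appeal of this scheme is exactly the point made in the abstract: the likelihood ratio can be estimated from a separate case-control study, without re-measuring the original cohort.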

  25. Flight Experiment Verification of Shuttle Boundary Layer Transition Prediction Tool

    Science.gov (United States)

    Berry, Scott A.; Berger, Karen T.; Horvath, Thomas J.; Wood, William A.

    2016-01-01

    Boundary layer transition at hypersonic conditions is critical to the design of future high-speed aircraft and spacecraft. Accurate methods to predict transition would directly impact the aerothermodynamic environments used to size a hypersonic vehicle's thermal protection system. A transition prediction tool, based on wind tunnel derived discrete roughness correlations, was developed and implemented for the Space Shuttle return-to-flight program. This tool was also used to design a boundary layer transition flight experiment in order to assess correlation uncertainties, particularly with regard to high Mach-number transition and tunnel-to-flight scaling. A review is provided of the results obtained from the flight experiment in order to evaluate the transition prediction tool implemented for the Shuttle program.

  6. Predictions of titanium alloy properties using thermodynamic modeling tools

    Science.gov (United States)

    Zhang, F.; Xie, F.-Y.; Chen, S.-L.; Chang, Y. A.; Furrer, D.; Venkatesh, V.

    2005-12-01

    Thermodynamic modeling tools have become essential in understanding the effect of alloy chemistry on the final microstructure of a material. Implementation of such tools to improve titanium processing via parameter optimization has resulted in significant cost savings through the elimination of shop/laboratory trials and tests. In this study, a thermodynamic modeling tool developed at CompuTherm, LLC, is being used to predict β transus, phase proportions, phase chemistries, partitioning coefficients, and phase boundaries of multicomponent titanium alloys. This modeling tool includes Pandat, software for multicomponent phase equilibrium calculations, and PanTitanium, a thermodynamic database for titanium alloys. Model predictions are compared with experimental results for one α-β alloy (Ti-64) and two near-β alloys (Ti-17 and Ti-10-2-3). The alloying elements, especially the interstitial elements O, N, H, and C, have been shown to have a significant effect on the β transus temperature, and are discussed in more detail herein.

  7. Empirical comparison of web-based antimicrobial peptide prediction tools.

    Science.gov (United States)

    Gabere, Musa Nur; Noble, William Stafford

    2017-07-01

    Antimicrobial peptides (AMPs) are innate immune molecules that exhibit activities against a range of microbes, including bacteria, fungi, viruses and protozoa. Recent increases in microbial resistance against current drugs have led to a concomitant increase in the need for novel antimicrobial agents. Over the last decade, a number of AMP prediction tools have been designed and made freely available online. These AMP prediction tools show potential to discriminate AMPs from non-AMPs, but the relative quality of the predictions produced by the various tools is difficult to quantify. We compiled two sets of AMP and non-AMP peptides, separated into three categories-antimicrobial, antibacterial and bacteriocins. Using these benchmark data sets, we carried out a systematic evaluation of ten publicly available AMP prediction methods. Among the six general AMP prediction tools-ADAM, CAMPR3(RF), CAMPR3(SVM), MLAMP, DBAASP and MLAMP-we find that CAMPR3(RF) provides a statistically significant improvement in performance, as measured by the area under the receiver operating characteristic (ROC) curve, relative to the other five methods. Surprisingly, for antibacterial prediction, the original AntiBP method significantly outperforms its successor, AntiBP2, on one benchmark dataset. The two bacteriocin prediction tools, BAGEL3 and BACTIBASE, both provide very good performance, and BAGEL3 outperforms its predecessor, BACTIBASE, on the larger of the two benchmarks. gaberemu@ngha.med.sa or william-noble@uw.edu. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
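    The benchmark's comparison metric, area under the ROC curve, equals the probability that a randomly chosen AMP receives a higher score than a randomly chosen non-AMP (the Mann-Whitney statistic). A small self-contained illustration; the scores below are invented, not outputs of the benchmarked tools:

```python
# AUC as a Mann-Whitney statistic: the fraction of (positive, negative)
# score pairs in which the positive scores higher, with ties counted as
# one half. Scores are invented for illustration.

def roc_auc(pos_scores, neg_scores):
    """AUC over all positive/negative score pairs, ties = 1/2."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

amp_scores = [0.9, 0.8, 0.75, 0.6]     # scores assigned to true AMPs
non_amp_scores = [0.7, 0.4, 0.3, 0.2]  # scores assigned to non-AMPs
print(roc_auc(amp_scores, non_amp_scores))  # 15 of 16 pairs -> 0.9375
```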

  8. WPPT, a tool for on-line wind power prediction

    Energy Technology Data Exchange (ETDEWEB)

    Skov Nielsen, T. [Dept. of Mathematical Modelling (IMM-DTU), Kgs. Lyngby (Denmark); Madsen, H. [Dept. of Mathematical Modelling (IMM-DTU) Kgs. Lyngby (Denmark); Toefting, J. [Elsam, Fredericia (Denmark)

    2004-07-01

    This paper describes WPPT (Wind Power Prediction Tool), an application for assessing the available wind power up to 36 hours ahead in time. WPPT has been installed in the Eltra/Elsam central dispatch center since October 1997. The paper describes the prediction model used, the actual implementation of WPPT, as well as the experience gained by the operators in the dispatch center.

  9. Prediction Of Abrasive And Diffusive Tool Wear Mechanisms In Machining

    Science.gov (United States)

    Rizzuti, S.; Umbrello, D.

    2011-01-01

    Tool wear prediction is regarded as a very important task in order to maximize tool performance, minimize cutting costs and improve workpiece quality in cutting. In this research work, an experimental campaign was carried out under varying cutting conditions with the aim of measuring both crater and flank tool wear during machining of AISI 1045 steel with an uncoated P40 carbide tool. In parallel, a FEM-based analysis was developed in order to study the tool wear mechanisms, taking into account the influence of the cutting conditions and the temperatures reached on the tool surfaces. The results show that, when the temperature of the tool rake surface is lower than the activation temperature of the diffusive phenomenon, the wear rate can be estimated by applying an abrasive model. In contrast, in the tool area where the temperature is higher than the diffusive activation temperature, the wear rate can be evaluated by applying a diffusive model. Finally, for temperatures between the above-cited values, an adapted abrasive-diffusive wear model makes it possible to correctly evaluate the tool wear phenomena.
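    The temperature-switched evaluation described above can be sketched as follows. The functional forms (a mechanical abrasive term and an Arrhenius diffusive term) and every constant are illustrative assumptions, not the authors' calibrated models, and the blended intermediate regime is omitted for brevity:

```python
import math

# Schematic temperature-switched tool-wear rate: below an assumed
# diffusive activation temperature the mechanical (abrasive) term is
# used; above it, an Arrhenius diffusive term. All forms and constants
# are illustrative assumptions, not the paper's calibrated models.

T_ACTIVATION = 973.0  # K, assumed diffusive activation temperature

def wear_rate(sigma_n, v_s, T):
    """Wear rate (arbitrary units) from normal stress sigma_n [Pa],
    sliding velocity v_s [m/s] and tool surface temperature T [K]."""
    abrasive = 1e-9 * sigma_n * v_s                  # mechanical term
    diffusive = 1e6 * v_s * math.exp(-12000.0 / T)   # Arrhenius term
    # Below the activation temperature, diffusion is frozen out and the
    # abrasive mechanism dominates; above it, diffusion takes over.
    return abrasive if T < T_ACTIVATION else diffusive

print(wear_rate(500e6, 2.0, 800.0))   # abrasive regime
print(wear_rate(500e6, 2.0, 1100.0))  # diffusive regime
```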

  10. MONRATE, a descriptive tool for calculation and prediction of re ...

    African Journals Online (AJOL)

    The objective of the study was to develop an interactive and systematic descriptive tool, MONRATE for calculating and predicting reinfection rates and time of Ascaris lumbricoides following mass chemotherapy using levamisole. Each pupil previously treated was retreated 6 or 7 months after the initial treatment in Ogun ...

  11. Utilising symptom dimensions with diagnostic categories improves prediction of time to first remission in first-episode psychosis.

    Science.gov (United States)

    Ajnakina, Olesya; Lally, John; Di Forti, Marta; Stilo, Simona A; Kolliakou, Anna; Gardner-Sood, Poonam; Dazzan, Paola; Pariante, Carmine; Reis Marques, Tiago; Mondelli, Valeria; MacCabe, James; Gaughran, Fiona; David, Anthony S; Stamate, Daniel; Murray, Robin M; Fisher, Helen L

    2018-03-01

    There has been much recent debate concerning the relative clinical utility of symptom dimensions versus conventional diagnostic categories in patients with psychosis. We investigated whether symptom dimensions rated at presentation for first-episode psychosis (FEP) better predicted time to first remission than categorical diagnosis over a four-year follow-up. The sample comprised 193 FEP patients aged 18-65 years who presented to psychiatric services in South London, UK, between 2006 and 2010. Psychopathology was assessed at baseline with the Positive and Negative Syndrome Scale and five symptom dimensions were derived using Wallwork/Fortgang's model; baseline diagnoses were grouped using DSM-IV codes. Time to start of first remission was ascertained from clinical records. The Bayesian Information Criterion (BIC) was used to find the best-fitting accelerated failure time model of dimensions, diagnoses and time to first remission. Sixty percent of patients remitted over the four years following first presentation to psychiatric services, and the average time to start of first remission was 18.3 weeks (SD=26.0, median=8). The positive (BIC=166.26), excited (BIC=167.30) and disorganised/concrete (BIC=168.77) symptom dimensions, and a diagnosis of schizophrenia (BIC=166.91), each predicted time to first remission. However, a combination of the DSM-IV diagnosis of schizophrenia with all five symptom dimensions led to the best-fitting model (BIC=164.35). Combining categorical diagnosis with symptom dimension scores in FEP patients improved the accuracy of predicting time to first remission. Thus our data suggest that the decision to consign symptom dimensions to an annexe in DSM-5 should be reconsidered at the earliest opportunity. Copyright © 2017 Elsevier B.V. All rights reserved.
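    The model comparison reported here hinges on the Bayesian Information Criterion, BIC = k ln(n) - 2 ln(L), with lower values indicating a better trade-off of fit against complexity. A toy sketch of the selection step; the log-likelihoods are invented for illustration and are not the study's fitted values:

```python
import math

# Toy illustration of BIC-based model selection between candidate
# predictors of time to remission. BIC = k*ln(n) - 2*ln(L); lower is
# better. The log-likelihoods below are invented, not the study's values.

def bic(log_likelihood, n_params, n_obs):
    return n_params * math.log(n_obs) - 2.0 * log_likelihood

models = {
    "positive dimension only": bic(-78.0, 2, 193),
    "diagnosis only": bic(-77.5, 2, 193),
    "diagnosis + five dimensions": bic(-63.5, 7, 193),
}
best = min(models, key=models.get)  # model with the lowest BIC wins
print(best)
```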

  12. Predicting malicious behavior tools and techniques for ensuring global security

    CERN Document Server

    Jackson, Gary M

    2012-01-01

    A groundbreaking exploration of how to identify and fight security threats at every level This revolutionary book combines real-world security scenarios with actual tools to predict and prevent incidents of terrorism, network hacking, individual criminal behavior, and more. Written by an expert with intelligence officer experience who invented the technology, it explores the keys to understanding the dark side of human nature, various types of security threats (current and potential), and how to construct a methodology to predict and combat malicious behavior. The companion CD demonstrates ava

  13. GOPET: A tool for automated predictions of Gene Ontology terms

    Directory of Open Access Journals (Sweden)

    Glatting Karl-Heinz

    2006-03-01

    Full Text Available Abstract Background Vast progress in sequencing projects has called for annotation on a large scale. A number of methods have been developed to address this challenging task. These methods, however, either apply to specific subsets, or their predictions are not formalised, or they do not provide precise confidence values for their predictions. Description We recently established a learning system for automated annotation, trained with a broad variety of different organisms, to predict the standardised annotation terms from the Gene Ontology (GO). Now, this method has been made available to the public via our web service GOPET (Gene Ontology term Prediction and Evaluation Tool). It supplies annotation for sequences of any organism. For each predicted term an appropriate confidence value is provided. The basic method had been developed for predicting molecular function GO terms. It has now been expanded to predict biological process terms as well. This web service is available via http://genius.embnet.dkfz-heidelberg.de/menu/biounit/open-husar Conclusion Our web service gives experimental researchers as well as the bioinformatics community a valuable sequence annotation device. Additionally, GOPET also provides less significant annotation data which may serve as an extended discovery platform for the user.

  14. Popularity prediction tool for ATLAS distributed data management

    International Nuclear Information System (INIS)

    Beermann, T; Maettig, P; Stewart, G; Lassnig, M; Garonne, V; Barisits, M; Vigne, R; Serfon, C; Goossens, L; Nairz, A; Molfetas, A

    2014-01-01

    This paper describes a popularity prediction tool for data-intensive data management systems, such as ATLAS distributed data management (DDM). It is fed by the DDM popularity system, which produces historical reports about ATLAS data usage, providing information about files, datasets, users and sites where data was accessed. The tool described in this contribution uses this historical information to make a prediction about the future popularity of data. It finds trends in the usage of data using a set of neural networks and a set of input parameters and predicts the number of accesses in the near term future. This information can then be used in a second step to improve the distribution of replicas at sites, taking into account the cost of creating new replicas (bandwidth and load on the storage system) compared to the gain of having new ones (faster access of data for analysis). To evaluate the benefit of the redistribution a grid simulator is introduced that is able to replay real workloads on different data distributions. This article describes the popularity prediction method and the simulator that is used to evaluate the redistribution.
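    The production tool feeds historical access counts to a set of neural networks; as a minimal stand-in, the sketch below fits an ordinary least-squares trend to a dataset's weekly access counts and extrapolates one week ahead. The access numbers are invented for illustration:

```python
# Minimal stand-in for the popularity forecast: fit a least-squares
# trend line to a dataset's weekly access counts and extrapolate one
# week ahead. (The real tool uses neural networks over many inputs;
# the access numbers here are invented.)

def predict_next(accesses):
    """Forecast the next value of a series by linear least squares."""
    n = len(accesses)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(accesses) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, accesses))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var if var else 0.0
    intercept = mean_y - slope * mean_x
    return max(0.0, intercept + slope * n)  # access counts cannot go negative

weekly_accesses = [120.0, 150.0, 180.0, 210.0, 240.0]
print(predict_next(weekly_accesses))  # linear trend continues to 270.0
```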

  15. Popularity Prediction Tool for ATLAS Distributed Data Management

    Science.gov (United States)

    Beermann, T.; Maettig, P.; Stewart, G.; Lassnig, M.; Garonne, V.; Barisits, M.; Vigne, R.; Serfon, C.; Goossens, L.; Nairz, A.; Molfetas, A.; Atlas Collaboration

    2014-06-01

    This paper describes a popularity prediction tool for data-intensive data management systems, such as ATLAS distributed data management (DDM). It is fed by the DDM popularity system, which produces historical reports about ATLAS data usage, providing information about files, datasets, users and sites where data was accessed. The tool described in this contribution uses this historical information to make a prediction about the future popularity of data. It finds trends in the usage of data using a set of neural networks and a set of input parameters and predicts the number of accesses in the near term future. This information can then be used in a second step to improve the distribution of replicas at sites, taking into account the cost of creating new replicas (bandwidth and load on the storage system) compared to the gain of having new ones (faster access of data for analysis). To evaluate the benefit of the redistribution a grid simulator is introduced that is able to replay real workloads on different data distributions. This article describes the popularity prediction method and the simulator that is used to evaluate the redistribution.

  16. SitesIdentify: a protein functional site prediction tool

    Directory of Open Access Journals (Sweden)

    Doig Andrew J

    2009-11-01

    Full Text Available Abstract Background The rate of protein structures being deposited in the Protein Data Bank surpasses the capacity to experimentally characterise them, and therefore computational methods to analyse these structures have become increasingly important. Identifying the region of a protein most likely to be involved in function is useful in order to gain information about its potential role. There are many available approaches to predict functional sites, but many are not made available via a publicly-accessible application. Results Here we present a functional site prediction tool (SitesIdentify), based on combining sequence conservation information with geometry-based cleft identification, that is freely available via a web server. We have shown that SitesIdentify compares favourably to other functional site prediction tools in a comparison of seven methods on a non-redundant set of 237 enzymes with annotated active sites. Conclusion SitesIdentify is able to produce comparable accuracy in predicting functional sites to its closest available counterpart, but in addition achieves improved accuracy for proteins with few characterised homologues. SitesIdentify is available via a webserver at http://www.manchester.ac.uk/bioinformatics/sitesidentify/

  17. Tools for Predicting Cleaning Efficiency in the LHC

    CERN Document Server

    Assmann, R W; Brugger, M; Hayes, M; Jeanneret, J B; Kain, V; Kaltchev, D I; Schmidt, F

    2003-01-01

    The computer codes SIXTRACK and DIMAD have been upgraded to include realistic models of proton scattering in collimator jaws, mechanical aperture restrictions, and time-dependent fields. These new tools complement long-existing simplified linear tracking programs used up to now for tracking with collimators. Scattering routines from STRUCT and K2 have been compared with one another and the results have been cross-checked to the FLUKA Monte Carlo package. A systematic error is assigned to the predictions of cleaning efficiency. Now, predictions of the cleaning efficiency are possible with a full LHC model, including chromatic effects, linear and nonlinear errors, beam-beam kicks and associated diffusion, and time-dependent fields. The beam loss can be predicted around the ring, both for regular and irregular beam losses. Examples are presented.

  18. HostPhinder: A Phage Host Prediction Tool

    Directory of Open Access Journals (Sweden)

    Julia Villarroel

    2016-05-01

    Full Text Available The current dramatic increase of antibiotic-resistant bacteria has revitalised interest in bacteriophages as an alternative antibacterial treatment. Meanwhile, the development of bioinformatics methods for analysing genomic data places high-throughput approaches for phage characterization within reach. Here, we present HostPhinder, a tool aimed at predicting the bacterial host of phages by examining the phage genome sequence. Using a reference database of 2196 phages with known hosts, HostPhinder predicts the host species of a query phage as the host of the most genomically similar reference phages. As a measure of genomic similarity, the number of co-occurring k-mers (DNA sequences of length k) is used. Using an independent evaluation set, HostPhinder was able to correctly predict the host genus and species for 81% and 74% of the phages respectively, giving predictions for more phages than BLAST and significantly outperforming BLAST on phages for which both had predictions. HostPhinder predictions on phage draft genomes from the INTESTI phage cocktail corresponded well with the advertised targets of the cocktail. Our study indicates that for most phages genomic similarity correlates well with related bacterial hosts. HostPhinder is available as an interactive web service [1] and as a standalone download from the Docker registry [2].
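    The co-occurring k-mer measure at the heart of HostPhinder is easy to sketch. The toy below uses short invented sequences, hypothetical phage names, and a small k; the real tool compares complete phage genomes against its 2196 references with a much larger k:

```python
# Sketch of k-mer co-occurrence as a genomic similarity measure: the
# query phage is matched to the reference phage sharing the most
# distinct k-mers. Sequences, names, and k=4 are toy choices.

def kmers(seq, k):
    """The set of distinct substrings of length k in seq."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def shared_kmers(a, b, k=4):
    """Number of distinct k-mers occurring in both sequences."""
    return len(kmers(a, k) & kmers(b, k))

query = "ATGGCGTACGTT"
references = {"phageA": "ATGGCGTACGAA", "phageB": "TTTTCCCCGGGG"}
# Predict the host as that of the most similar reference phage:
best_match = max(references, key=lambda name: shared_kmers(query, references[name]))
print(best_match)  # phageA shares 7 of the query's 9 distinct 4-mers
```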

  19. Under-utilisation of preventive medication in patients with cardiovascular disease is greatest in younger age groups (PREDICT-CVD 15).

    Science.gov (United States)

    Mehta, Suneela; Wells, Sue; Riddell, Tania; Kerr, Andrew; Pylypchuk, Romana; Marshall, Roger; Ameratunga, Shanthi; Chan, Wing Cheuk; Thornley, Simon; Crengle, Sue; Harrison, Jeff; Drury, Paul; Elley, C Raina; Bell, Fionna; Jackson, Rod

    2011-06-01

    Blood pressure-lowering (BPL) and lipid-lowering (LL) medications together reduce estimated absolute five-year cardiovascular disease (CVD) risk by >40%. International studies indicate that the proportion of people with CVD receiving pharmacotherapy increases with advancing age. To compare BPL and LL medications, by sociodemographic characteristics, for patients with known CVD in primary care settings. The study population included patients aged 35-74 with known CVD assessed in primary care from July 2006 to October 2009 using a web-based computerised decision support system (PREDICT) for risk assessment and management. Clinical data were linked anonymously to national sociodemographic and pharmaceutical dispensing databases. Differences in dispensing of BPL and LL medications in the six months before the first PREDICT assessment were analysed according to age, sex, ethnicity and deprivation. Of 7622 people with CVD, 1625 were <55 years old, 2862 were women and 4609 lived in deprived areas (NZDep quintiles 4/5). The study population included 4249 European, 1556 Maori, 1151 Pacific and 329 Indian peoples. BPL medications were dispensed to 81%, LL medications to 73%, both BPL and LL medications to 67%, and 87% received either class of medication. Compared with people aged 65-75, people aged 35-44 were 30-40% less likely and those aged 45-54 were 10-15% less likely to be dispensed BPL, LL medications or both. There were minimal differences in the likelihood of dispensing according to sex, ethnicity or deprivation. BPL and LL medications are under-utilised in patients with known CVD in New Zealand. Only two-thirds of patients in this cohort are on both. Younger patients are considerably less likely to be on recommended medications.

  20. Analysis and Prediction of Micromilling Stability with Variable Tool Geometry

    Directory of Open Access Journals (Sweden)

    Ziyang Cao

    2014-11-01

    Full Text Available Micromilling can fabricate miniaturized components using a micro-end mill at high rotational speeds. The analysis of machining stability in micromilling plays an important role in characterizing the cutting process, estimating the tool life, and optimizing the process. A numerical analysis and experimental method are presented to investigate the chatter stability in the micro-end milling process with variable milling tool geometry. The schematic model of the micromilling process is constructed and the calculation formula to predict cutting forces and displacements is derived. This is followed by a detailed numerical analysis of micromilling forces between helical ball and square end mills through time domain and frequency domain methods, and the results are compared. Furthermore, a detailed time domain simulation for micro-end milling with straight-teeth and helical-teeth end mills is conducted based on the machine-tool system frequency response function obtained through modal experiments. The forces and displacements are predicted and the simulation results for the different cutter geometries are compared in depth. The simulation results have important significance for the actual milling process.

  1. Popularity Prediction Tool for ATLAS Distributed Data Management

    CERN Document Server

    Beermann, T; The ATLAS collaboration; Stewart, G; Lassnig, M; Garonne, V; Barisits, M; Vigne, R; Serfon, C; Goossens, L; Nairz, A; Molfetas, A

    2013-01-01

    This paper describes a popularity prediction tool for data-intensive data management systems, such as ATLAS distributed data management (DDM). It is fed by the DDM popularity system, which produces historical reports about ATLAS data usage, providing information about files, datasets, users and sites where data was accessed. The tool described in this contribution uses this historical information to make a prediction about the future popularity of data. It finds trends in the usage of data using a set of neural networks and a set of input parameters and predicts the number of accesses in the near term future. This information can then be used in a second step to improve the distribution of replicas at sites, taking into account the cost of creating new replicas (bandwidth and load on the storage system) compared to the gain of having new ones (faster access of data for analysis). To evaluate the benefit of the redistribution a grid simulator is introduced that is able to replay real workloads on different data distri...

  2. Popularity Prediction Tool for ATLAS Distributed Data Management

    CERN Document Server

    Beermann, T; The ATLAS collaboration; Stewart, G; Lassnig, M; Garonne, V; Barisits, M; Vigne, R; Serfon, C; Goossens, L; Nairz, A; Molfetas, A

    2014-01-01

    This paper describes a popularity prediction tool for data-intensive data management systems, such as ATLAS distributed data management (DDM). It is fed by the DDM popularity system, which produces historical reports about ATLAS data usage, providing information about files, datasets, users and sites where data was accessed. The tool described in this contribution uses this historical information to make a prediction about the future popularity of data. It finds trends in the usage of data using a set of neural networks and a set of input parameters and predicts the number of accesses in the near term future. This information can then be used in a second step to improve the distribution of replicas at sites, taking into account the cost of creating new replicas (bandwidth and load on the storage system) compared to the gain of having new ones (faster access of data for analysis). To evaluate the benefit of the redistribution a grid simulator is introduced that is able to replay real workloads on different data distri...

  3. PRmePRed: A protein arginine methylation prediction tool.

    Directory of Open Access Journals (Sweden)

    Pawan Kumar

    Full Text Available Protein methylation is an important Post-Translational Modification (PTM) of proteins. Arginine methylation carries out and regulates several important biological functions, including gene regulation and signal transduction. Experimental identification of arginine methylation sites is a daunting task, as it is costly as well as time- and labour-intensive. Hence reliable prediction tools play an important role in the rapid screening and identification of possible methylation sites in proteomes. Our preliminary assessment using the available prediction methods on collected data yielded unimpressive results. This motivated us to perform a comprehensive data analysis and appraisal of features relevant in the context of biological significance, which led to the development of a prediction tool, PRmePRed, with better performance. PRmePRed performs reasonably well, with an accuracy of 84.10%, 82.38% sensitivity, 83.77% specificity, and a Matthews correlation coefficient of 66.20% in 10-fold cross-validation. PRmePRed is freely available at http://bioinfo.icgeb.res.in/PRmePRed/.
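    The performance measures quoted above are all derived from a confusion matrix. A small sketch of how they are computed; the counts below are invented for illustration, not the paper's cross-validation tallies:

```python
import math

# Sensitivity, specificity and the Matthews correlation coefficient
# (MCC), the metrics used to report PRmePRed's performance, computed
# from an invented confusion matrix (tp, tn, fp, fn counts).

def sensitivity(tp, fn):
    return tp / (tp + fn)

def specificity(tn, fp):
    return tn / (tn + fp)

def mcc(tp, tn, fp, fn):
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0

tp, tn, fp, fn = 82, 84, 16, 18  # illustrative counts only
print(round(sensitivity(tp, fn), 3),
      round(specificity(tn, fp), 3),
      round(mcc(tp, tn, fp, fn), 3))
```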

  4. Development of Next Generation Multiphase Pipe Flow Prediction Tools

    Energy Technology Data Exchange (ETDEWEB)

    Cem Sarica; Holden Zhang

    2006-05-31

    The development of oil and gas fields in deep waters (5000 ft and more) will become more common in the future. It is inevitable that production systems will operate under multiphase flow conditions (simultaneous flow of gas, oil and water, possibly along with sand, hydrates, and waxes). Multiphase flow prediction tools are essential for every phase of hydrocarbon recovery from design to operation. Recovery from deep waters poses special challenges and requires accurate multiphase flow predictive tools for several applications, including the design and diagnostics of the production systems, separation of phases in horizontal wells, and multiphase separation (topside, seabed or bottom-hole). It is crucial for any multiphase separation technique, whether at topside, seabed or bottom-hole, to know inlet conditions such as flow rates, flow patterns, and volume fractions of gas, oil and water coming into the separation devices. Therefore, the development of a new generation of multiphase flow predictive tools is needed. The overall objective of the proposed study is to develop a unified model for gas-oil-water three-phase flow in wells, flow lines, and pipelines to predict flow characteristics such as flow patterns, phase distributions, and pressure gradient encountered during petroleum production at different flow conditions (pipe diameter and inclination, fluid properties and flow rates). In the current multiphase modeling approach, flow pattern and flow behavior (pressure gradient and phase fractions) prediction modeling are separated. Thus, different models based on different physics are employed, causing inaccuracies and discontinuities. Moreover, oil and water are treated as a pseudo single phase, ignoring the distinct characteristics of both oil and water, and often resulting in inaccurate designs that lead to operational problems. In this study, a new model is being developed through a theoretical and experimental study employing a revolutionary approach.

  5. An Engineering Tool for the Prediction of Internal Dielectric Charging

    Science.gov (United States)

    Rodgers, D. J.; Ryden, K. A.; Wrenn, G. L.; Latham, P. M.; Sorensen, J.; Levy, L.

    1998-11-01

    A practical internal charging tool has been developed. It provides an easy-to-use means for satellite engineers to predict whether on-board dielectrics are vulnerable to electrostatic discharge in the outer radiation belt. The tool is designed to simulate irradiation of single-dielectric planar or cylindrical structures with or without shielding. Analytical equations are used to describe current deposition in the dielectric. This is fast and gives charging currents to sufficient accuracy given the uncertainties in other aspects of the problem - particularly material characteristics. Time-dependent internal electric fields are calculated, taking into account the effect on conductivity of electric field, dose rate and temperature. A worst-case model of electron fluxes in the outer belt has been created specifically for the internal charging problem and is built into the code. For output, the tool gives a YES or NO decision on the susceptibility of the structure to internal electrostatic breakdown and if necessary, calculates the required changes to bring the system below the breakdown threshold. A complementary programme of laboratory irradiations has been carried out to validate the tool. The results for Epoxy-fibreglass samples show that the code models electric field realistically for a wide variety of shields, dielectric thicknesses and electron spectra. Results for Teflon samples indicate that some further experimentation is required and the radiation-induced conductivity aspects of the code have not been validated.
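    The core of such a tool, the time-dependent internal field calculation, reduces to a charge-balance ODE: eps * dE/dt = J_dep - sigma(E) * E, where J_dep is the deposited current density and the conductivity rises with field. The sketch below integrates it with explicit Euler; the deposited current, base conductivity, and field-enhancement law are invented placeholders, not the tool's worst-case environment model or material data:

```python
# Schematic internal-charging calculation for a planar dielectric:
# eps * dE/dt = J_dep - sigma(E) * E, integrated by explicit Euler.
# All numbers are illustrative placeholders, not the tool's data.

EPS = 2.1 * 8.854e-12  # permittivity of a PTFE-like dielectric, F/m (assumed)
J_DEP = 1e-9           # deposited current density, A/m^2 (assumed)
SIGMA0 = 1e-16         # base conductivity, S/m (assumed)

def conductivity(E):
    """Crude field-enhanced conductivity model (assumed form)."""
    return SIGMA0 * (1.0 + abs(E) / 1e7)

def field_after(t_end, dt=3600.0):
    """Internal field (V/m) after t_end seconds of irradiation."""
    E, t = 0.0, 0.0
    while t < t_end:
        E += dt * (J_DEP - conductivity(E) * E) / EPS  # charge balance
        t += dt
    return E

# The field rises toward the steady state where sigma(E)*E balances
# J_DEP; that steady value is what gets compared against the breakdown
# threshold to produce the YES/NO decision.
print(f"{field_after(30 * 24 * 3600.0):.3e} V/m")
```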

  6. A tool model for predicting atmospheric kinetics with sensitivity analysis

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A package (a tool model) for predicting atmospheric chemical kinetics with sensitivity analysis is presented. A new direct method for calculating the first-order sensitivity coefficients using sparse matrix technology is included in the tool model; it is only necessary to triangularize the matrix related to the Jacobian matrix of the model equation. A Gear-type procedure is used to integrate the model equation and its coupled auxiliary sensitivity coefficient equations. The FORTRAN subroutines for the model equation, the sensitivity coefficient equations, and their analytical Jacobian expressions are generated automatically from a chemical mechanism. The kinetic representation of the model equation, its sensitivity coefficient equations, and their Jacobian matrix is presented. Various FORTRAN packages with which the program runs in conjunction, such as SLODE, the modified MA28, and the Gear package, are recommended. The photo-oxidation of dimethyl disulfide is used for illustration.
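    The direct sensitivity method can be illustrated on the simplest possible mechanism, A -> B with rate constant k: the model equation dc/dt = -k*c is integrated together with its auxiliary sensitivity equation ds/dt = J*s + df/dk = -k*s - c, where s = dc/dk. Forward Euler is used below for clarity; the tool itself uses a Gear (stiff) integrator with sparse-matrix triangularisation for large mechanisms:

```python
# Direct-method sensitivity sketch for A -> B with rate constant k:
# integrate the model equation dc/dt = -k*c together with the coupled
# sensitivity equation ds/dt = -k*s - c, where s = dc/dk. Forward
# Euler is used for clarity only.

def integrate(k, c0, t_end, dt=1e-3):
    c, s, t = c0, 0.0, 0.0
    while t < t_end:
        dc = -k * c        # model equation f(c; k)
        ds = -k * s - c    # coupled auxiliary sensitivity equation
        c += dt * dc
        s += dt * ds
        t += dt
    return c, s

c, s = integrate(k=2.0, c0=1.0, t_end=1.0)
# Analytic solution: c = exp(-k*t) ~ 0.1353, s = -t*exp(-k*t) ~ -0.1353
print(round(c, 4), round(s, 4))
```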

  7. Predicting SPE Fluxes: Coupled Simulations and Analysis Tools

    Science.gov (United States)

    Gorby, M.; Schwadron, N.; Linker, J.; Caplan, R. M.; Wijaya, J.; Downs, C.; Lionello, R.

    2017-12-01

    Presented here is a nuts-and-bolts look at the coupled framework of Predictive Science Inc.'s Magnetohydrodynamics Around a Sphere (MAS) code and the Energetic Particle Radiation Environment Module (EPREM). MAS-simulated coronal mass ejection output from a variety of events can be selected as the MHD input to EPREM, and a variety of parameters can be set for each run: background seed particle spectra, mean free path, perpendicular diffusion efficiency, etc. A standard set of visualizations is produced, as well as a library of analysis tools for deeper inquiries. All steps will be covered end-to-end, as well as the framework's user interface and availability.

  8. Using social media as a tool to predict syphilis.

    Science.gov (United States)

    Young, Sean D; Mercer, Neil; Weiss, Robert E; Torrone, Elizabeth A; Aral, Sevgi O

    2018-04-01

    Syphilis rates have been rapidly rising in the United States. New technologies, such as social media, might be used to anticipate and prevent the spread of disease. Because social media data collection is easy and inexpensive, integration of social media data into syphilis surveillance may be a cost-effective surveillance strategy, especially in low-resource regions. People are increasingly using social media to discuss health-related issues, such as sexual risk behaviors, allowing social media to be a potential tool for public health and medical research. This study mined Twitter data to assess whether social media could be used to predict syphilis cases in 2013 based on 2012 data. We collected 2012 and 2013 county-level primary and secondary (P&S) and early latent syphilis cases reported to the Center for Disease Control and Prevention, along with >8500 geolocated tweets in the United States that were filtered to include sexual risk-related keywords, including colloquial terms for intercourse. We assessed the relationship between syphilis-related tweets and actual case reports by county, controlling for socioeconomic indicators and prior year syphilis cases. We found a significant positive relationship between tweets and cases of P&S and early latent syphilis. This study shows that social media may be an additional tool to enhance syphilis prediction and surveillance. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. Predictive tools for designing new insulins and treatment regimens

    DEFF Research Database (Denmark)

    Klim, Søren

    The thesis deals with the development of "Predictive tools for designing new insulins and treatment regimens" and consists of two parts: a model-based approach for bridging properties of new insulin analogues from glucose clamp experiments to meal tolerance tests (MTT), and a second part that describes an implemented software program able to handle stochastic differential equations (SDEs) with mixed effects. The thesis is supplemented with scientific papers published during the PhD. Developing an insulin analogue from candidate molecule to clinical drug consists of a development programme in which dose and efficacy are investigated. Numerous methods are used to quantify dose and efficacy in Phase II; of special interest is the 24-hour meal tolerance test, as it tries to portray near-normal living conditions. Part I describes an integrated model for insulin and glucose which is aimed at simulating 24-hour ...

  10. Tools for predicting the PK/PD of therapeutic proteins.

    Science.gov (United States)

    Diao, Lei; Meibohm, Bernd

    2015-07-01

    Assessments of the pharmacokinetic/pharmacodynamic (PK/PD) characteristics are an integral part in the development of novel therapeutic agents. Compared with traditional small molecule drugs, therapeutic proteins possess many distinct PK/PD features that necessitate the application of modified or separate approaches for assessing their PK/PD relationships. In this review, the authors discuss tools that are utilized to describe and predict the PK/PD features of therapeutic proteins and that are valuable additions in the armamentarium of drug development approaches to facilitate and accelerate their successful preclinical and clinical development. A variety of state-of-the-art PK/PD tools is currently being applied and has been adjusted to support the development of proteins as therapeutics, including allometric scaling approaches, target-mediated disposition models, first-in-man dose calculations, physiologically based PK models and empirical and semi-mechanistic PK/PD modeling. With the advent of the next generation of biologics including bioengineered antibody constructs being developed, these tools will need to be further refined and adapted to ensure their applicability and successful facilitation of the drug development process for these novel scaffolds.

  11. Aspects of biogas utilisation

    International Nuclear Information System (INIS)

    Luning, L.

    1992-01-01

    Utilisation of biogas has received considerable attention over the last decade; however, its full potential has not been reached. The paper discusses various options for the utilisation of biogas and the limitations that may occur insofar as they are associated with the characteristics of biogas. On this basis, the prospects for the future are presented. (au)

  12. Cardiovascular risk prediction tools for populations in Asia.

    Science.gov (United States)

    Barzi, F; Patel, A; Gu, D; Sritara, P; Lam, T H; Rodgers, A; Woodward, M

    2007-02-01

    Cardiovascular risk equations are traditionally derived from the Framingham Study. The accuracy of this approach in Asian populations, where resources for risk factor measurement may be limited, is unclear. The objective was to compare the accuracy of cardiovascular risk prediction by "low-information" equations (derived using only age, systolic blood pressure, total cholesterol and smoking status) from the Framingham Study with those derived from Asian cohorts. Separate equations to predict the 8-year risk of a cardiovascular event were derived from Asian and Framingham cohorts. The performance of these equations, and of a subsequently "recalibrated" Framingham equation, was evaluated among participants from independent Chinese cohorts. Six cohort studies from Japan, Korea and Singapore (Asian cohorts); six cohort studies from China; the Framingham Study from the US. 172,077 participants from the Asian cohorts; 25,682 participants from Chinese cohorts and 6053 participants from the Framingham Study. In the Chinese cohorts, 542 cardiovascular events occurred during 8 years of follow-up. Both the Asian cohorts and the Framingham equations discriminated cardiovascular risk well in the Chinese cohorts; the area under the receiver-operator characteristic curve was at least 0.75 for men and women. However, the Framingham risk equation systematically overestimated risk in the Chinese cohorts by an average of 276% among men and 102% among women. The corresponding average overestimation using the Asian cohorts equation was 11% and 10%, respectively. Recalibrating the Framingham risk equation using cardiovascular disease incidence from the non-Chinese Asian cohorts led to an overestimation of risk by an average of 4% in women and underestimation of risk by an average of 2% in men. A low-information Framingham cardiovascular risk prediction tool, when recalibrated with contemporary data, is likely to estimate future cardiovascular risk with similar accuracy in Asian populations.
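    The effect of recalibration can be sketched minimally. Proper recalibration of a Framingham-style equation replaces the baseline survival/incidence term; the uniform rescaling below, and all the numbers, are simplifying assumptions for illustration only.

```python
# Minimal recalibration sketch (not the study's exact method): rescale a
# model's predicted risks so the mean predicted risk matches the observed
# event rate in the target population. All numbers are invented.

def recalibrate(pred_risks, observed_rate):
    """Uniformly rescale predicted risks to match the observed event rate."""
    mean_pred = sum(pred_risks) / len(pred_risks)
    factor = observed_rate / mean_pred
    return [min(1.0, p * factor) for p in pred_risks]

# predictions that systematically overestimate risk in the target cohort
pred = [0.30, 0.18, 0.42, 0.25, 0.35]   # mean predicted risk = 0.30
observed_rate = 0.10                    # observed 8-year event rate

recal = recalibrate(pred, observed_rate)
print([round(p, 3) for p in recal])
```

    After rescaling, the mean predicted risk equals the observed rate while the ranking of individuals (discrimination) is unchanged.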

  13. Predicted impact and evaluation of North Carolina's phosphorus indexing tool.

    Science.gov (United States)

    Johnson, Amy M; Osmond, Deanna L; Hodges, Steven C

    2005-01-01

    Increased concern about potential losses of phosphorus (P) from agricultural fields receiving animal waste has resulted in the implementation of new state and federal regulations related to nutrient management. In response to strengthened nutrient management standards that require consideration of P, North Carolina has developed a site-specific P indexing system called the Phosphorus Loss Assessment Tool (PLAT) to predict relative amounts of potential P loss from agricultural fields. The purpose of this study was to apply the PLAT index on farms throughout North Carolina in an attempt to predict the percentage and types of farms that will be forced to change management practices due to implementation of new regulations. Sites from all 100 counties were sampled, with the number of samples taken from each county depending on the proportion of the state's agricultural land that occurs in that county. Results showed that approximately 8% of producers in the state will be required to apply animal waste or inorganic fertilizer on a P rather than nitrogen basis, with the percentage increasing for farmers who apply animal waste (approximately 27%). The PLAT index predicted the greatest amounts of P loss from sites in the Coastal Plain region of North Carolina and from sites receiving poultry waste. Loss of dissolved P through surface runoff tended to be greater than other loss pathways and presents an area of concern as no best management practices (BMPs) currently exist for the reduction of in-field dissolved P. The PLAT index predicted the areas in the state that are known to be disproportionately vulnerable to P loss due to histories of high P applications, high densities of animal units, or soil type and landscapes that are most susceptible to P loss.
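    Additive P-indexing schemes of the kind PLAT represents typically rate each loss pathway, sum the ratings, and map the total to a risk category. The pathway names, ratings and thresholds below are hypothetical, not PLAT's actual values.

```python
# Hypothetical sketch of a P-index-style additive rating (the real PLAT
# weightings are not reproduced here): each loss pathway gets a rating,
# the ratings are summed, and the total maps to a risk category.

PATHWAYS = ("soluble_runoff", "sediment", "leaching", "source_application")

def p_index(scores, thresholds=(25, 50, 100)):
    """Return (total score, risk category) for a field's pathway ratings."""
    total = sum(scores[p] for p in PATHWAYS)
    low, medium, high = thresholds
    if total < low:
        return total, "low"
    if total < medium:
        return total, "medium"
    if total < high:
        return total, "high"
    return total, "very high"

# a field with high dissolved-P runoff, as flagged in the study
field = {"soluble_runoff": 30, "sediment": 12, "leaching": 4,
         "source_application": 10}
print(p_index(field))  # -> (56, 'high')
```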

  14. Development of Next Generation Multiphase Pipe Flow Prediction Tools

    Energy Technology Data Exchange (ETDEWEB)

    Tulsa Fluid Flow

    2008-08-31

    The development of fields in deep waters (5000 ft and more) is now a common occurrence. It is inevitable that production systems will operate under multiphase flow conditions (simultaneous flow of gas, oil and water, possibly along with sand, hydrates, and waxes). Multiphase flow prediction tools are essential for every phase of hydrocarbon recovery, from design to operation. Recovery from deep waters poses special challenges and requires accurate multiphase flow predictive tools for several applications, including the design and diagnostics of production systems, separation of phases in horizontal wells, and multiphase separation (topside, seabed or bottom-hole). For any multiphase separation technique employed at topside, seabed or bottom-hole, it is crucial to know inlet conditions such as the flow rates, flow patterns, and volume fractions of gas, oil and water entering the separation devices. The overall objective was to develop a unified model for gas-oil-water three-phase flow in wells, flow lines, and pipelines to predict flow characteristics such as flow patterns, phase distributions, and pressure gradients encountered during petroleum production at different flow conditions (pipe diameter and inclination, fluid properties and flow rates). The project was conducted in two periods. In Period 1 (four years), gas-oil-water flow in pipes was investigated to understand the fundamental physical mechanisms describing the interaction between the gas, oil and water phases under flowing conditions, and a unified model was developed utilizing a novel modeling approach. A gas-oil-water pipe flow database including field and laboratory data was formed in Period 2 (one year). The database was utilized in model performance demonstration. Period 1 primarily consisted of the development of a unified model and software to predict the gas-oil-water flow, and experimental studies of the gas-oil-water project, including flow behavior description and ...

  15. Pattern of Smartphones Utilisation among Engineering Undergraduates

    Directory of Open Access Journals (Sweden)

    Muliati Sedek

    2014-04-01

    Full Text Available The rate of smartphone ownership among undergraduates in Malaysia is high. However, little is known about usage patterns; thus, the focus of this research was to determine the utilisation patterns of smartphones, based on the National Education Technology Standards for Students (NETS.S), among engineering undergraduates in Malaysia. This study was based on quantitative research, and the population comprised undergraduates from four Malaysian Technical Universities. A total of 400 questionnaires were analysed. Based on the results, the undergraduates' utilisation of smartphones as a communication and collaboration tool was at a high level. Meanwhile, utilisation as an operations and concepts tool and as a research and information fluency tool was at a moderate level. Finally, smartphone utilisation as a digital citizenship tool and as a critical thinking, problem solving and creativity tool was at a low level. Hence, more training and workshops should be given to students to encourage them to fully utilise smartphones in enhancing higher order thinking skills.

  16. Does the Risk Assessment and Prediction Tool Predict Discharge Disposition After Joint Replacement?

    DEFF Research Database (Denmark)

    Hansen, Viktor J.; Gromov, Kirill; Lebrun, Lauren M

    2015-01-01

    BACKGROUND: Payers of health services and policymakers place a major focus on cost containment in health care. Studies have shown that early planning of discharge is essential in reducing length of stay and achieving financial benefit; tools that can help predict discharge disposition would...... populations is unknown. A low RAPT score is reported to indicate a high risk of needing any form of inpatient rehabilitation after TJA, including short-term nursing facilities. QUESTIONS/PURPOSES: This study attempts (1) to assess the predictive accuracy of the RAPT in US patients undergoing total hip and knee.... Based on our findings, the risk categories in our populations should be: high risk < 7, intermediate risk 7 to 10, and low risk > 10. CONCLUSIONS: The RAPT accurately predicted discharge disposition for high- and low-risk patients in our cohort. Based on our data, intermediate-risk patients should......

  17. A clinical tool to predict Plasmodium vivax recurrence in Malaysia.

    Science.gov (United States)

    Mat Ariffin, Norliza; Islahudin, Farida; Kumolosasi, Endang; Makmor-Bakry, Mohd

    2017-12-08

    Recurrence rates of Plasmodium vivax infections differ across various geographic regions. Interestingly, South-East Asia and the Asia-Pacific region are documented to exhibit the most frequent recurrence incidences. Identifying patients at a higher risk for recurrence gives valuable information in strengthening the efforts to control P. vivax infections. The aim of the study was to develop a tool to identify P. vivax-infected patients that are at a higher risk of recurrence in Malaysia. Patient data were obtained retrospectively through the Ministry of Health, Malaysia, from 2011 to 2016. Patients with incomplete data were excluded. A total of 2044 clinical P. vivax malaria cases treated with primaquine were included. Data collected were patient, disease, and treatment characteristics. Two-thirds of the cases (n = 1362) were used to develop a clinical risk score, while the remaining third (n = 682) was used for validation. Using multivariate analysis, age (p = 0.03), gametocyte sexual count (p = 0.04), indigenous transmission (p = 0.04), type of treatment (p = 0.12), and incomplete primaquine treatment (p = 0.14) were found to be predictors of recurrence after controlling for other confounding factors; these predictors were then used in developing the final model. The beta-coefficient values were used to develop a clinical scoring tool to predict possible recurrence. The total scores ranged between 0 and 8. A higher score indicated a higher risk for recurrence (odds ratio [OR]: 1.971; 95% confidence interval [CI]: 1.562-2.487; p ≤ 0.001). The area under the receiver operating characteristic (ROC) curve of the developed (n = 1362) and validated model (n = 682) was of good accuracy (ROC: 0.728, 95% CI: 0.670-0.785), making the score a useful tool in targeting patients at a higher risk for recurrence for closer monitoring during follow-up, after treatment with primaquine.
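    Turning beta-coefficients into an integer bedside score, as the authors describe, is commonly done by dividing each coefficient by the smallest one and rounding. The coefficients, predictor names and point values below are invented for illustration, not the study's fitted values.

```python
# Illustrative conversion of logistic-regression coefficients into an
# integer clinical score. All betas and predictor names are hypothetical.

def make_point_scores(betas):
    """Divide each beta by the smallest absolute beta and round to points."""
    smallest = min(abs(b) for b in betas.values())
    return {name: round(b / smallest) for name, b in betas.items()}

betas = {
    "age_under_25": 0.45,
    "high_gametocyte_count": 0.90,
    "indigenous_transmission": 0.60,
    "incomplete_primaquine": 1.30,
}
points = make_point_scores(betas)

# score one hypothetical patient (1 = predictor present, 0 = absent)
patient = {"age_under_25": 1, "high_gametocyte_count": 0,
           "indigenous_transmission": 1, "incomplete_primaquine": 1}
total = sum(points[k] * v for k, v in patient.items())
print(points, total)
```

    A higher total flags a patient for closer follow-up, mirroring the 0-8 score described in the record.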

  18. Bitter or not? BitterPredict, a tool for predicting taste from chemical structure.

    Science.gov (United States)

    Dagan-Wiener, Ayana; Nissim, Ido; Ben Abu, Natalie; Borgonovo, Gigliola; Bassoli, Angela; Niv, Masha Y

    2017-09-21

    Bitter taste is an innately aversive taste modality that is considered to protect animals from consuming toxic compounds. Yet, bitterness is not always noxious and some bitter compounds have beneficial effects on health. Hundreds of bitter compounds have been reported (and are accessible via BitterDB, http://bitterdb.agri.huji.ac.il/dbbitter.php ), but numerous additional bitter molecules are still unknown. The dramatic chemical diversity of bitterants makes bitterness prediction a difficult task. Here we present a machine learning classifier, BitterPredict, which predicts whether a compound is bitter or not, based on its chemical structure. BitterDB was used as the positive set, and non-bitter molecules were gathered from the literature to create the negative set. Adaptive Boosting (AdaBoost), a decision-tree-based machine-learning algorithm, was applied to molecules represented using physicochemical and ADME/Tox descriptors. BitterPredict correctly classifies over 80% of the compounds in the hold-out test set, and 70-90% of the compounds in three independent external sets and in sensory test validation, providing a quick and reliable tool for classifying large sets of compounds into bitter and non-bitter groups. BitterPredict suggests that about 40% of random molecules, and a large portion of clinical and experimental drugs (66%) and of natural products (77%), are bitter.
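    A toy version of the boosting approach can be written in a few lines: decision stumps combined by AdaBoost over a single hypothetical descriptor. The real classifier uses many physicochemical and ADME/Tox descriptors; the 1-D data below is invented.

```python
# Minimal AdaBoost-with-decision-stumps sketch in the spirit of
# BitterPredict (toy 1-D "descriptor" values; +1 = bitter, -1 = non-bitter).

import math

def stump_predict(x, thr, sign):
    return sign if x > thr else -sign

def fit_stump(xs, ys, w):
    """Pick the threshold/sign with the lowest weighted error."""
    best = None
    for thr in sorted(set(xs)):
        for sign in (1, -1):
            err = sum(wi for xi, yi, wi in zip(xs, ys, w)
                      if stump_predict(xi, thr, sign) != yi)
            if best is None or err < best[0]:
                best = (err, thr, sign)
    return best

def adaboost(xs, ys, rounds=5):
    n = len(xs)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        err, thr, sign = fit_stump(xs, ys, w)
        err = max(err, 1e-10)                     # avoid log(0)
        alpha = 0.5 * math.log((1 - err) / err)   # stump weight
        ensemble.append((alpha, thr, sign))
        # up-weight misclassified examples, then renormalise
        w = [wi * math.exp(-alpha * yi * stump_predict(xi, thr, sign))
             for xi, yi, wi in zip(xs, ys, w)]
        z = sum(w)
        w = [wi / z for wi in w]
    return ensemble

def predict(ensemble, x):
    score = sum(a * stump_predict(x, thr, s) for a, thr, s in ensemble)
    return 1 if score > 0 else -1

xs = [0.1, 0.4, 0.35, 0.8, 0.9, 0.7]   # toy descriptor values
ys = [-1, -1, -1, 1, 1, 1]             # bitter (+1) vs non-bitter (-1)
model = adaboost(xs, ys)
print([predict(model, x) for x in xs])
```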

  19. Comparison of the Nosocomial Pneumonia Mortality Prediction (NPMP) model with standard mortality prediction tools.

    Science.gov (United States)

    Srinivasan, M; Shetty, N; Gadekari, S; Thunga, G; Rao, K; Kunhikatta, V

    2017-07-01

    Severity or mortality prediction of nosocomial pneumonia could aid in the effective triage of patients and assist physicians. The objective was to compare various severity assessment scoring systems for predicting intensive care unit (ICU) mortality in nosocomial pneumonia patients. A prospective cohort study was conducted in a tertiary care university-affiliated hospital in Manipal, India. One hundred patients with nosocomial pneumonia, admitted to the ICUs, who developed pneumonia >48 h after admission, were included. The Nosocomial Pneumonia Mortality Prediction (NPMP) model, developed in our hospital, was compared with Acute Physiology and Chronic Health Evaluation II (APACHE II), Mortality Probability Model II (MPM72 II), Simplified Acute Physiology Score II (SAPS II), Multiple Organ Dysfunction Score (MODS), Sequential Organ Failure Assessment (SOFA), Clinical Pulmonary Infection Score (CPIS), and Ventilator-Associated Pneumonia Predisposition, Insult, Response, Organ dysfunction (VAP-PIRO). Data and clinical variables were collected on the day of pneumonia diagnosis. The outcome for the study was ICU mortality. The sensitivity and specificity of the various scoring systems were analysed by plotting receiver operating characteristic (ROC) curves and computing the area under the curve for each of the mortality prediction tools. NPMP, APACHE II, SAPS II, MPM72 II, SOFA, and VAP-PIRO were found to have similar and acceptable discrimination power as assessed by the area under the ROC curve. The AUC values for the above scores ranged from 0.735 to 0.762. CPIS and MODS showed the least discrimination. NPMP is a specific tool to predict mortality in nosocomial pneumonia and is comparable to other standard scores. Copyright © 2017 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.
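    The AUC comparison underlying the study reduces to the Mann-Whitney statistic: the probability that a randomly chosen non-survivor's score exceeds a randomly chosen survivor's. A sketch with invented scores:

```python
# AUC via the Mann-Whitney interpretation: count pairwise "wins" of
# non-survivor scores over survivor scores (ties count half).
# Severity scores below are invented, not study data.

def roc_auc(scores_pos, scores_neg):
    wins = sum((sp > sn) + 0.5 * (sp == sn)
               for sp in scores_pos for sn in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

died     = [24, 31, 19, 28]     # severity scores of non-survivors
survived = [12, 17, 24, 9, 15]  # severity scores of survivors

print(round(roc_auc(died, survived), 3))  # -> 0.925
```

    An AUC near 0.5 means no discrimination; the study's scores clustered around 0.74-0.76.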

  20. Prediction models : the right tool for the right problem

    NARCIS (Netherlands)

    Kappen, Teus H.; Peelen, Linda M.

    2016-01-01

    PURPOSE OF REVIEW: Perioperative prediction models can help to improve personalized patient care by providing individual risk predictions to both patients and providers. However, the scientific literature on prediction model development and validation can be quite technical and challenging to

  1. Stop. Think. Delirium! A quality improvement initiative to explore utilising a validated cognitive assessment tool in the acute inpatient medical setting to detect delirium and prompt early intervention.

    Science.gov (United States)

    Malik, Angela; Harlan, Todd; Cobb, Janice

    2016-11-01

    The paper examines the ability of nursing staff to detect delirium and apply early intervention to decrease adverse events associated with delirium. To characterise nursing practices associated with staff knowledge, delirium screening utilising the Modified Richmond Assessment Sedation Score (mRASS), and multicomponent interventions in an acute inpatient medical unit. Delirium incidence rates are up to 60% in frail elderly hospitalised patients. Under-recognition and inconsistent management of delirium is an international problem. Falls, restraints, and increased hospital length of stay are linked to delirium. A descriptive study. Relationships among cognitive screening, knowledge assessment and interventions were explored. Success in identifying sufficient cases of delirium was not evident; however, multicomponent interventions were applied to patients with obvious symptoms. An increase in nursing knowledge was demonstrated after additional training. Delirium screening occurred in 49-61% of the target population monthly, with challenges in compliance and documentation of screening and interventions. Technological capabilities for trending mRASS results do not exist within the current computerised patient record system. Delirium screening increases awareness among nursing staff, prompting more emphasis on early intervention for apparent symptoms. Technological support is needed to effectively document and visualise trends in screening results. The study informs future research on the effects of cognitive screening on delirium prevention and reduction in adverse patient outcomes. Evidence-based literature reveals negative patient outcomes associated with delirium. However, delirium is highly under-recognised, indicating that further research is needed to address nursing awareness and recognition of delirium. Additional education and knowledge transformation from research to nursing practice are paramount in the application of innovative strategies

  2. Early Antenatal Prediction of Gestational Diabetes in Obese Women: Development of Prediction Tools for Targeted Intervention.

    Directory of Open Access Journals (Sweden)

    Sara L White

    Full Text Available All obese women are categorised as being at equally high risk of gestational diabetes (GDM), whereas the majority do not develop the disorder. Lifestyle and pharmacological interventions in unselected obese pregnant women have been unsuccessful in preventing GDM. Our aim was to develop a prediction tool for early identification of obese women at high risk of GDM to facilitate targeted interventions in those most likely to benefit. Clinical and anthropometric data and non-fasting blood samples were obtained at 15+0-18+6 weeks' gestation in 1303 obese pregnant women from UPBEAT, a randomised controlled trial of a behavioural intervention. Twenty-one candidate biomarkers associated with insulin resistance, and a targeted nuclear magnetic resonance (NMR) metabolome, were measured. Prediction models were constructed using stepwise logistic regression. Twenty-six percent of women (n = 337) developed GDM (International Association of Diabetes and Pregnancy Study Groups criteria). A model based on clinical and anthropometric variables (age, previous GDM, family history of type 2 diabetes, systolic blood pressure, sum of skinfold thicknesses, waist:height and neck:thigh ratios) provided an area under the curve of 0.71 (95%CI 0.68-0.74). This increased to 0.77 (95%CI 0.73-0.80) with the addition of candidate biomarkers (random glucose, haemoglobin A1c (HbA1c), fructosamine, adiponectin, sex hormone binding globulin, triglycerides), but was not improved by the addition of NMR metabolites (0.77; 95%CI 0.74-0.81). Clinically translatable models for GDM prediction including readily measurable variables, e.g. mid-arm circumference, age, systolic blood pressure, HbA1c and adiponectin, are described. Using a ≥35% risk threshold, all models identified a group of high-risk obese women of whom approximately 50% (positive predictive value) later developed GDM, with a negative predictive value of 80%. Tools for early pregnancy identification of obese women at risk of GDM are described
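    The ≥35% threshold evaluation described above can be sketched as follows; the risks and outcomes below are invented, not UPBEAT data.

```python
# Sketch of a threshold-based evaluation: flag women whose predicted GDM
# risk is >= 0.35, then compute positive and negative predictive values.
# All risks and outcomes are invented for demonstration.

def ppv_npv(risks, outcomes, threshold=0.35):
    flagged = [(r, o) for r, o in zip(risks, outcomes) if r >= threshold]
    cleared = [(r, o) for r, o in zip(risks, outcomes) if r < threshold]
    ppv = sum(o for _, o in flagged) / len(flagged)   # GDM among flagged
    npv = sum(1 - o for _, o in cleared) / len(cleared)  # no GDM among cleared
    return ppv, npv

risks    = [0.60, 0.40, 0.10, 0.20, 0.55, 0.30, 0.70, 0.15, 0.45, 0.25]
outcomes = [1,    0,    0,    0,    1,    1,    1,    0,    0,    0]  # 1 = GDM

ppv, npv = ppv_npv(risks, outcomes)
print(f"PPV={ppv:.2f} NPV={npv:.2f}")
```

    With these toy numbers the PPV and NPV land near the ~50% and 80% figures the record reports.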

  3. Chemical Utilisation of CO2

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 20; Issue 2. Chemical Utilisation of CO2: A Challenge for the Sustainable World. Dinesh Jagadeesan Bhaskar Joshi Prashant Parameswaran. General Article Volume 20 Issue 2 February 2015 pp 165-176 ...

  4. Predicting Great Lakes fish yields: tools and constraints

    Science.gov (United States)

    Lewis, C.A.; Schupp, D.H.; Taylor, W.W.; Collins, J.J.; Hatch, Richard W.

    1987-01-01

    Prediction of yield is a critical component of fisheries management. The development of sound yield prediction methodology and the application of the results of yield prediction are central to the evolution of strategies to achieve stated goals for Great Lakes fisheries and to the measurement of progress toward those goals. Despite general availability of species yield models, yield prediction for many Great Lakes fisheries has been poor due to the instability of the fish communities and the inadequacy of available data. A host of biological, institutional, and societal factors constrain both the development of sound predictions and their application to management. Improved predictive capability requires increased stability of Great Lakes fisheries through rehabilitation of well-integrated communities, improvement of data collection, data standardization and information-sharing mechanisms, and further development of the methodology for yield prediction. Most important is the creation of a better-informed public that will in turn establish the political will to do what is required.

  5. IPMP 2013 - A comprehensive data analysis tool for predictive microbiology

    Science.gov (United States)

    Predictive microbiology is an area of applied research in food science that uses mathematical models to predict the changes in the population of pathogenic or spoilage microorganisms in foods undergoing complex environmental changes during processing, transportation, distribution, and storage. It f...

  6. The Efficacy of Violence Prediction: A Meta-Analytic Comparison of Nine Risk Assessment Tools

    Science.gov (United States)

    Yang, Min; Wong, Stephen C. P.; Coid, Jeremy

    2010-01-01

    Actuarial risk assessment tools are used extensively to predict future violence, but previous studies comparing their predictive accuracies have produced inconsistent findings as a result of various methodological issues. We conducted meta-analyses of the effect sizes of 9 commonly used risk assessment tools and their subscales to compare their…

  7. The Predictive Accuracy of PREDICT : A Personalized Decision-Making Tool for Southeast Asian Women With Breast Cancer

    NARCIS (Netherlands)

    Wong, Hoong-Seam; Subramaniam, Shridevi; Alias, Zarifah; Taib, Nur Aishah; Ho, Gwo-Fuang; Ng, Char-Hong; Yip, Cheng-Har; Verkooijen, Helena M.; Hartman, Mikael; Bhoo Pathy, N

    Web-based prognostication tools may provide a simple and economically feasible option to aid prognostication and selection of chemotherapy in early breast cancers. We validated PREDICT, a free online breast cancer prognostication and treatment benefit tool, in a resource-limited setting. All 1480

  8. Kenya develops tool to predict malaria | IDRC - International ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2010-10-13

    Oct 13, 2010 ... In collaboration with scientists from the Kenya Meteorological Department and the International Centre ... a scientific model that uses weather predictions, information about the reproductive mechanisms of ...

  9. An investigation into utilising gestational body mass index as a screening tool for adverse birth outcomes and maternal morbidities in a group of pregnant women in Khayelitsha

    Science.gov (United States)

    Davies, HR; Visser, J; Tomlinson, M; Rotheram-Borus, MJ; Gissane, C; Harwood, J; LeRoux, I

    2014-01-01

    Objective The aim of this study was to investigate the ability of the gestational body mass index (BMI) method to screen for adverse birth outcomes and maternal morbidities. Design This was a substudy of a randomised controlled trial, the Philani Mentor Mothers’ study. Setting and subjects The Philani Mentor Mothers’ study took place in a peri-urban settlement, Khayelitsha, between 2009 and 2010. Pregnant women living in the area in 2009-2010 were recruited for the study. Outcome measures Maternal anthropometry (height and weight) and gestational weeks were obtained at baseline to calculate the gestational BMI, which is maternal BMI adjusted for gestational age. Participants were classified into four gestational BMI categories: underweight, normal, overweight and obese. Birth outcomes and maternal morbidities were obtained from clinic cards after the births. Results Pregnant women were recruited into the study (n = 1 058). Significant differences were found between the different gestational BMI categories and the following birth outcomes: maternal (p-value = 0.019), infant hospital stay (p-value = 0.03), infants staying for over 24 hours in hospital (p-value = 0.001), delivery mode (p-value = 0.001), birthweight (p-value = 0.006), birth length (p-value = 0.007), birth head circumference (p-value = 0.007) and pregnancy-induced hypertension (p-value = 0.001). Conclusion To the best of our knowledge, this is the first study to use the gestational BMI method in a peri-urban South African pregnant population. Based on the finding that this method is able to identify unfavourable birth outcomes, it is recommended that it be implemented as a pilot study in selected rural, peri-urban and urban primary health clinics, and that its ease and effectiveness as a screening tool be evaluated. Appropriate medical and nutritional advice can then be given to pregnant women to improve both their own and their infants’ birth-related outcomes and maternal morbidities.

  10. Predictive Monte Carlo tools for LHC physics (1/3)

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Simulations of events taking place at the LHC play a key role in all experimental analyses. Starting from the basic concepts of QCD, we first review how accurate predictions can be obtained via fixed-order calculations at higher orders. Parton showers and event generation are then introduced as a means to achieve fully exclusive predictions. Finally, the recent merging and matching techniques between fixed-order and fully exclusive simulations are presented, as well as their implementations via the MLM/CKKW and MC@NLO/POWHEG methods.

  11. Predictive Monte Carlo tools for LHC physics (3/3)

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Simulations of events taking place at the LHC play a key role in all experimental analyses. Starting from the basic concepts of QCD, we first review how accurate predictions can be obtained via fixed-order calculations at higher orders. Parton showers and event generation are then introduced as a means to achieve fully exclusive predictions. Finally, the recent merging and matching techniques between fixed-order and fully exclusive simulations are presented, as well as their implementations via the MLM/CKKW and MC@NLO/POWHEG methods.

  12. Predictive Monte Carlo tools for LHC physics (2/3)

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Simulations of events taking place at the LHC play a key role in all experimental analyses. Starting from the basic concepts of QCD, we first review how accurate predictions can be obtained via fixed-order calculations at higher orders. Parton showers and event generation are then introduced as a means to achieve fully exclusive predictions. Finally, the recent merging and matching techniques between fixed-order and fully exclusive simulations are presented, as well as their implementations via the MLM/CKKW and MC@NLO/POWHEG methods.

  13. Emerging Tools to Estimate and to Predict Exposures to ...

    Science.gov (United States)

    The timely assessment of the human and ecological risk posed by thousands of existing and emerging commercial chemicals is a critical challenge facing EPA in its mission to protect public health and the environment. The US EPA has been conducting research to enhance methods used to estimate and forecast exposures for tens of thousands of chemicals. This research is aimed at both assessing risks and supporting life cycle analysis, by developing new models and tools for high-throughput exposure screening and prioritization, as well as databases that support these and other tools, especially regarding consumer products. The models and data address usage, and take advantage of quantitative structure-activity relationships (QSARs) for both inherent chemical properties and function (why the chemical is a product ingredient). To make them more useful and widely available, the new tools, data and models are designed to be:
    • Flexible
    • Interoperable
    • Modular (useful to more than one, stand-alone application)
    • Open (publicly available software)
    Presented at the Society for Risk Analysis Forum: Risk Governance for Key Enabling Technologies, Venice, Italy, March 1-3, 2017

  14. Enhanced clinical pharmacy service targeting tools: risk-predictive algorithms.

    Science.gov (United States)

    El Hajji, Feras W D; Scullin, Claire; Scott, Michael G; McElnay, James C

    2015-04-01

    This study aimed to determine the value of using a mix of clinical pharmacy data and routine hospital admission spell data in the development of predictive algorithms. Exploration of risk factors in hospitalized patients, together with the targeting strategies devised, will enable the prioritization of clinical pharmacy services to optimize patient outcomes. Predictive algorithms were developed in a number of detailed steps using a 75% sample of integrated medicines management (IMM) patients, and validated using the remaining 25%. IMM patients receive targeted clinical pharmacy input throughout their hospital stay. The algorithms were applied to the validation sample, and predicted risk probability was generated for each patient from the coefficients. Risk thresholds for the algorithms were determined by identifying the cut-off points of risk scores at which the algorithm would have the highest discriminative performance. Clinical pharmacy staffing levels were obtained from the pharmacy department staffing database. Numbers of previous emergency admissions and admission medicines, together with age-adjusted co-morbidity and diuretic receipt, formed a 12-month post-discharge and/or readmission risk algorithm. Age-adjusted co-morbidity proved to be the best index to predict mortality. Increased numbers of clinical pharmacy staff at ward level were correlated with a reduction in the risk-adjusted mortality index (RAMI). The algorithms created were valid in predicting risk of in-hospital and post-discharge mortality and risk of hospital readmission 3, 6 and 12 months post-discharge. The provision of ward-based clinical pharmacy services is a key component in reducing RAMI and enabling the full benefits of pharmacy input to patient care to be realized. © 2014 John Wiley & Sons, Ltd.
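The cut-off selection step described above, choosing the risk-score threshold at which the algorithm has the highest discriminative performance, is commonly implemented by maximising Youden's J (sensitivity + specificity - 1). A minimal sketch with invented risk scores and outcomes, not the study's data:

```python
# Pick the risk-score cut-off that maximises Youden's J = sens + spec - 1.
# Risk scores and outcomes below are invented for illustration.

scores   = [0.10, 0.25, 0.30, 0.45, 0.60, 0.70, 0.85, 0.90]
outcomes = [0, 0, 0, 1, 0, 1, 1, 1]   # 1 = event (e.g. readmission)

def youden(cut):
    """Sensitivity + specificity - 1 when 'score >= cut' flags an event."""
    tp = sum(1 for s, o in zip(scores, outcomes) if s >= cut and o == 1)
    fn = sum(1 for s, o in zip(scores, outcomes) if s < cut and o == 1)
    tn = sum(1 for s, o in zip(scores, outcomes) if s < cut and o == 0)
    fp = sum(1 for s, o in zip(scores, outcomes) if s >= cut and o == 0)
    return tp / (tp + fn) + tn / (tn + fp) - 1

best = max(scores, key=youden)   # candidate cut-offs: the observed scores
print(best)                      # → 0.45
```

Scanning the observed scores as candidate cut-offs suffices here; on a large validation sample one would scan the sorted unique scores or a fixed grid.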

  15. PROFITABILITY RATIO AS A TOOL FOR BANKRUPTCY PREDICTION

    Directory of Open Access Journals (Sweden)

    Daniel BRÎNDESCU – OLARIU

    2016-07-01

    Full Text Available The current study evaluates the potential of the profitability ratio in predicting corporate bankruptcy. The research is focused on Romanian companies, with the targeted event being represented by the manifestation of bankruptcy 2 years after the date of the financial statements of reference. All tests were conducted over 2 paired samples of 1176 Romanian companies. The methodology employed in evaluating the potential of the profitability ratio was based on the Area Under the ROC Curve (0.663) and the general accuracy ensured by the ratio (62.6% out-of-sample accuracy). The results confirm the practical utility of the profitability ratio in the prediction of bankruptcy and thus validate the need for further research focused on developing a methodology of analysis.
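The Area Under the ROC Curve reported above can be computed directly from the two paired samples via the Mann-Whitney statistic; a sketch with invented profitability ratios, not the study's 1176-company data:

```python
# AUC of a single financial ratio as a bankruptcy classifier, computed
# as the Mann-Whitney probability that a randomly chosen bankrupt firm
# scores higher than a randomly chosen surviving one (ties count 0.5).
# Ratio values are invented for illustration -- not the study's data.

def roc_auc(pos, neg):
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))

bankrupt  = [-0.08, -0.02, 0.01, 0.03]   # profitability of failed firms
surviving = [0.02, 0.05, 0.07, 0.12]     # profitability of survivors

# Low profitability should signal bankruptcy, so score = -ratio.
auc = roc_auc([-r for r in bankrupt], [-r for r in surviving])
print(round(auc, 3))   # → 0.938
```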

  16. SOLVENCY RATIO AS A TOOL FOR BANKRUPTCY PREDICTION

    Directory of Open Access Journals (Sweden)

    Daniel BRÎNDESCU–OLARIU

    2016-08-01

    Full Text Available The current study evaluates the potential of the solvency ratio in predicting corporate bankruptcy. The research is focused on Romania and, in particular, on Timis County. The interest in the solvency ratio was based on the recommendations of the scientific literature, as well as on the availability of information concerning its values to all stakeholders. The event on which the research was focused was represented by the manifestation of bankruptcy 2 years after the date of the financial statements of reference. All tests were performed over 2 paired samples of 1176 companies in total. The methodology employed in evaluating the potential of the solvency ratio was based on the Area Under the ROC Curve (0.646) and the general accuracy ensured by the ratio (64.5% out-of-sample accuracy). The results confirm the practical utility of the solvency ratio in the prediction of bankruptcy.

  17. Development of Antimicrobial Peptide Prediction Tool for Aquaculture Industries.

    Science.gov (United States)

    Gautam, Aditi; Sharma, Asuda; Jaiswal, Sarika; Fatma, Samar; Arora, Vasu; Iquebal, M A; Nandi, S; Sundaray, J K; Jayasankar, P; Rai, Anil; Kumar, Dinesh

    2016-09-01

    Microbial diseases in fish, plants, animals and humans are rising constantly; thus, the discovery of their antidotes is imperative. The use of antibiotics in aquaculture further compounds the problem through the development of resistance and the consequent consumer health risk by bio-magnification. Antimicrobial peptides (AMPs) have been highly promising as a natural alternative to chemical antibiotics. Though AMPs are molecules of the innate immune defense of all advanced eukaryotic organisms, fish, being heavily dependent on their innate immune defense, have been a good source of AMPs with much wider applicability. A machine learning-based prediction method using wet laboratory-validated fish AMPs can accelerate AMP discovery using available fish genomic and proteomic data. Earlier AMP prediction servers are based on multi-phyla/species data, and we report here the world's first AMP prediction server for fishes. It is freely accessible at http://webapp.cabgrid.res.in/fishamp/ . A total of 151 AMPs related to fish, collected from various databases and published literature, were taken for this study. For model development and prediction, N-terminus residues, C-terminus residues and full sequences were considered. The best models were those with polynomial-2, linear and radial basis function kernels, with accuracies of 97, 99 and 97%, respectively. We found that the performance of support vector machine-based models is superior to that of artificial neural networks. This in silico approach can drastically reduce the time and cost of AMP discovery. This accelerated discovery of lead AMP molecules, with potential wider applications in diverse areas such as fish and human health as a substitute for antibiotics, immunomodulators, antitumor agents, vaccine adjuvants and inactivators, and also packaged food, can be of much importance for industries.

  18. PROFITABILITY RATIO AS A TOOL FOR BANKRUPTCY PREDICTION

    OpenAIRE

    Daniel BRÎNDESCU – OLARIU

    2016-01-01

    The current study evaluates the potential of the profitability ratio in predicting corporate bankruptcy. The research is focused on Romanian companies, with the targeted event being represented by the manifestation of bankruptcy 2 years after the date of the financial statements of reference. All tests were conducted over 2 paired samples of 1176 Romanian companies. The methodology employed in evaluating the potential of the profitability ratio was based on the Area Under the ROC Curve (0.663...

  19. Prediction of boiling points of organic compounds by QSPR tools.

    Science.gov (United States)

    Dai, Yi-min; Zhu, Zhi-ping; Cao, Zhong; Zhang, Yue-fei; Zeng, Ju-lan; Li, Xun

    2013-07-01

    The novel electro-negativity topological descriptors YC and WC were derived from molecular structure using the equilibrium electro-negativity of atoms and the relative bond lengths of the molecule. Quantitative structure-property relationships (QSPR) between the descriptors YC and WC, together with the path number parameter P3, and the normal boiling points of 80 alkanes, 65 unsaturated hydrocarbons and 70 alcohols were obtained separately. The high quality of the prediction models was evidenced by the coefficient of determination (R^2), the standard error (S), the average absolute error (AAE) and the predictive parameters (Q^2_ext, R^2_CV, R^2_m). According to the regression equations, the influences of the length of the carbon backbone, the size and degree of branching of a molecule, and the role of functional groups on the normal boiling point were analyzed. Comparison with reference models demonstrated that the novel topological descriptors based on the equilibrium electro-negativity of atoms and the relative bond length are useful molecular descriptors for predicting the normal boiling points of organic compounds. Copyright © 2013 Elsevier Inc. All rights reserved.
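At its core, a QSPR model of this kind is a multiple linear regression of the normal boiling point on a few molecular descriptors, scored by R^2. A self-contained sketch using invented descriptor values, not the paper's YC, WC or P3 values:

```python
# Sketch of the QSPR workflow: ordinary least-squares fit of normal
# boiling point on molecular descriptors via the normal equations,
# scored by R^2. All descriptor and boiling-point values are invented.

def transpose(M):
    return [list(col) for col in zip(*M)]

def matmul(A, B):
    Bt = list(zip(*B))
    return [[sum(a * b for a, b in zip(row, col)) for col in Bt] for row in A]

def solve(A, b):
    """Gaussian elimination with partial pivoting for A x = b."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        piv = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[piv] = M[piv], M[k]
        for i in range(k + 1, n):
            f = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= f * M[k][j]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

# Each row: [intercept, descriptor_1, descriptor_2] (hypothetical values).
X = [[1, 2.1, 0.5], [1, 3.0, 0.8], [1, 3.9, 1.1], [1, 5.2, 1.6], [1, 6.0, 2.0]]
y = [36.1, 68.7, 98.4, 125.7, 150.8]  # boiling points in degrees C (invented)

Xt = transpose(X)
beta = solve(matmul(Xt, X), [sum(a * yi for a, yi in zip(row, y)) for row in Xt])
pred = [sum(b * xi for b, xi in zip(beta, row)) for row in X]
ybar = sum(y) / len(y)
r2 = 1 - sum((p - yi) ** 2 for p, yi in zip(pred, y)) / sum((yi - ybar) ** 2 for yi in y)
print(round(r2, 3))
```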

  20. Using Search Engine Data as a Tool to Predict Syphilis.

    Science.gov (United States)

    Young, Sean D; Torrone, Elizabeth A; Urata, John; Aral, Sevgi O

    2018-07-01

    Researchers have suggested that social media and online search data might be used to monitor and predict syphilis and other sexually transmitted diseases. Because people at risk for syphilis might seek sexual health and risk-related information on the internet, we investigated associations between state-level internet search query data (e.g., Google Trends) and reported weekly syphilis cases. We obtained weekly counts of reported primary and secondary syphilis for 50 states from 2012 to 2014 from the US Centers for Disease Control and Prevention. We collected weekly internet search query data regarding 25 risk-related keywords from 2012 to 2014 for 50 states using Google Trends. We joined 155 weeks of Google Trends data with a 1-week lag to weekly syphilis data for a total of 7750 data points. Using the least absolute shrinkage and selection operator, we trained three linear mixed models on the first 10 weeks of each year. We validated the models for 2012 and 2013 for the following 52 weeks and the 2014 model for the following 42 weeks. The models, consisting of different sets of keyword predictors for each year, accurately predicted 144 weeks of primary and secondary syphilis counts for each state, with an overall average R of 0.9 and an overall average root mean squared error of 4.9. We used Google Trends search data from the prior week to predict cases of syphilis in the following weeks for each state. Further research could explore how search data could be integrated into public health monitoring systems.
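The central data step described above, joining search volumes to case counts with a 1-week lag and scoring predictions by root mean squared error, can be sketched as follows. The numbers are toy values, not CDC or Google Trends data, and a plain least-squares line stands in for the LASSO-selected linear mixed models:

```python
import math

# Toy weekly series for one state -- NOT CDC or Google Trends data.
search_volume = [10, 12, 9, 15, 14, 18]   # weeks 1..6
cases         = [3, 4, 4, 5, 6, 7]        # weeks 1..6

# 1-week lag: search volume in week t predicts cases in week t + 1.
X = search_volume[:-1]
y = cases[1:]

# Ordinary least-squares line y = a + b*x as a simple stand-in for the
# paper's LASSO-selected linear mixed models.
n = len(X)
mx, my = sum(X) / n, sum(y) / n
b = sum((xi - mx) * (yi - my) for xi, yi in zip(X, y)) / sum((xi - mx) ** 2 for xi in X)
a = my - b * mx

pred = [a + b * xi for xi in X]
rmse = math.sqrt(sum((p - yi) ** 2 for p, yi in zip(pred, y)) / n)
print(round(rmse, 2))   # → 0.86
```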

  1. JV Task 5 - Predictive Coal Quality Effects Screening Tool (PCQUEST)

    Energy Technology Data Exchange (ETDEWEB)

    Jason Laumb; Joshua Stanislowski

    2007-07-01

    PCQUEST, a package of eight predictive indices, was developed with U.S. Department of Energy (DOE) support by the Energy & Environmental Research Center to predict fireside performance in coal-fired utility boilers more reliably than traditional indices. Since the development of PCQUEST, the need has arisen for additional improvement, validation, and enhancement of the model, as well as to incorporate additional fuel types into the program database. PCQUEST was developed using combustion inorganic transformation theory from previous projects and from empirical data derived from laboratory experiments and coal boiler field observations. The goal of this joint venture project between commercial industry clients and DOE is to further enhance PCQUEST and improve its utility for a variety of new fuels and systems. Specific objectives include initiating joint venture projects with utilities, boiler vendors, and coal companies that involve real-world situations and needs in order to strategically improve algorithms and input-output functions of PCQUEST, as well as to provide technology transfer to the industrial sector. The main body of this report provides a short summary of the projects that were closed from February 1999 through July 2007. All of the reports sent to the commercial clients can be found in the appendix.

  2. Cluster analysis as a prediction tool for pregnancy outcomes.

    Science.gov (United States)

    Banjari, Ines; Kenjerić, Daniela; Šolić, Krešimir; Mandić, Milena L

    2015-03-01

    Considering the specific physiological changes during gestation and thinking of pregnancy as a "critical window", classification of pregnant women in early pregnancy can be considered crucial. The paper demonstrates the use of a method based on an approach from intelligent data mining, cluster analysis. Cluster analysis is a statistical method which makes it possible to group individuals based on sets of identifying variables. The method was chosen in order to determine the possibility of classifying pregnant women in early pregnancy and to analyze unknown correlations between different variables so that certain outcomes could be predicted. 222 pregnant women from two general obstetric offices were recruited. The main focus was set on characteristics of these pregnant women: their age, pre-pregnancy body mass index (BMI) and haemoglobin value. Cluster analysis achieved a 94.1% classification accuracy rate with three branches or groups of pregnant women showing statistically significant correlations with pregnancy outcomes. The results show that pregnant women of both older age and higher pre-pregnancy BMI have a significantly higher incidence of delivering a baby of higher birth weight but gain significantly less weight during pregnancy. Their babies are also longer, and these women have a significantly higher probability of complications during pregnancy (gestosis) and a higher probability of induced or caesarean delivery. We can conclude that the cluster analysis method can appropriately classify pregnant women in early pregnancy to predict certain outcomes.
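Grouping individuals on a small set of identifying variables, here age, pre-pregnancy BMI and haemoglobin, can be sketched with a minimal k-means loop; all values below are invented, and the study's own clustering method may differ:

```python
# Minimal k-means over (age, pre-pregnancy BMI, haemoglobin) triples.
# All values are invented for illustration.

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(points, centroids, iters=10):
    groups = []
    for _ in range(iters):
        groups = [[] for _ in centroids]
        for p in points:                 # assignment step
            j = min(range(len(centroids)), key=lambda c: dist2(p, centroids[c]))
            groups[j].append(p)
        centroids = [                    # update step (keep empty clusters)
            tuple(sum(v) / len(g) for v in zip(*g)) if g else centroids[j]
            for j, g in enumerate(groups)
        ]
    return centroids, groups

women = [(24, 21.0, 12.8), (26, 22.5, 13.1), (25, 20.8, 12.5),
         (36, 29.5, 11.9), (38, 31.0, 11.5), (37, 30.2, 11.7)]

centroids, groups = kmeans(women, [women[0], women[3]])
print([len(g) for g in groups])   # → [3, 3]
```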

  3. Physics-based Modeling Tools for Life Prediction and Durability Assessment of Advanced Materials, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The technical objectives of this program are: (1) to develop a set of physics-based modeling tools to predict the initiation of hot corrosion and to address pit and...

  4. About Using Predictive Models and Tools To Assess Chemicals under TSCA

    Science.gov (United States)

    As part of EPA's effort to promote chemical safety, OPPT provides public access to predictive models and tools which can help inform the public on the hazards and risks of substances and improve chemical management decisions.

  5. An Evaluation of Growth Models as Predictive Tools for Estimates at Completion (EAC)

    National Research Council Canada - National Science Library

    Trahan, Elizabeth N

    2009-01-01

    ...) as the Estimates at Completion (EAC). Our research evaluates the prospect of nonlinear growth modeling as an alternative to the current predictive tools used for calculating EAC, such as the Cost Performance Index (CPI...

  6. Predictive Maintenance--An Effective Money Saving Tool Being Applied in Industry Today.

    Science.gov (United States)

    Smyth, Tom

    2000-01-01

    Looks at preventive/predictive maintenance as it is used in industry. Discusses core preventive maintenance tools that must be understood to prepare students. Includes a list of websites related to the topic. (JOW)

  7. A clinical tool for predicting survival in ALS.

    Science.gov (United States)

    Knibb, Jonathan A; Keren, Noa; Kulka, Anna; Leigh, P Nigel; Martin, Sarah; Shaw, Christopher E; Tsuda, Miho; Al-Chalabi, Ammar

    2016-12-01

    Amyotrophic lateral sclerosis (ALS) is a progressive and usually fatal neurodegenerative disease. Survival from diagnosis varies considerably. Several prognostic factors are known, including site of onset (bulbar or limb), age at symptom onset, delay from onset to diagnosis and the use of riluzole and non-invasive ventilation (NIV). Clinicians and patients would benefit from a practical way of using these factors to provide an individualised prognosis. 575 consecutive patients with incident ALS from the population-based South-East England register for ALS (SEALS) were studied. Their survival was modelled as a two-step process: the time from diagnosis to respiratory muscle involvement, followed by the time from respiratory involvement to death. The effects of predictor variables were assessed separately for each time interval. Younger age at symptom onset, longer delay from onset to diagnosis and riluzole use were associated with slower progression to respiratory involvement, and NIV use was associated with lower mortality after respiratory involvement, each with a clinically significant effect size. Riluzole may have a greater effect in younger patients and those with a longer delay to diagnosis. A patient's survival time has a roughly 50% chance of falling between half and twice the predicted median. A simple and clinically applicable graphical method of predicting an individual patient's survival from diagnosis is presented. The model should be validated in an independent cohort and extended to include other important prognostic factors. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  8. Predictive tool of energy performance of cold storage in agrifood industries: The Portuguese case study

    International Nuclear Information System (INIS)

    Nunes, José; Neves, Diogo; Gaspar, Pedro D.; Silva, Pedro D.; Andrade, Luís P.

    2014-01-01

    Highlights: • A predictive tool for assessment of the energy performance in agrifood industries that use cold storage is developed. • The correlations used by the predictive tool result from the largest number of data sets collected to date in Portugal. • Strong relationships between raw material, energy consumption and volume of cold stores were established. • Case studies are analyzed that demonstrate the applicability of the tool. • The tool's results support decision-making on practical measures for the improvement of energy efficiency. - Abstract: Food processing and conservation represent decisive factors for the sustainability of the planet given the significant growth of the world population in the last decades. Therefore, the cooling process during the manufacture and/or storage of food products has been the subject of study and improvement in order to ensure the food supply with good quality and safety. A predictive tool for assessment of the energy performance in agrifood industries that use cold storage is developed in order to contribute to the improvement of the energy efficiency of this industry. The predictive tool is based on a set of characteristic correlated parameters: the amount of raw material annually processed, annual energy consumption and volume of cold rooms. Case studies of application of the predictive tool consider industries in the meat sector, specifically slaughterhouses. The results obtained support decision-making on practical measures for the improvement of energy efficiency in this industry.

  9. Variability in Predictions from Online Tools: A Demonstration Using Internet-Based Melanoma Predictors.

    Science.gov (United States)

    Zabor, Emily C; Coit, Daniel; Gershenwald, Jeffrey E; McMasters, Kelly M; Michaelson, James S; Stromberg, Arnold J; Panageas, Katherine S

    2018-02-22

    Prognostic models are increasingly being made available online, where they can be publicly accessed by both patients and clinicians. These online tools are an important resource for patients to better understand their prognosis and for clinicians to make informed decisions about treatment and follow-up. The goal of this analysis was to highlight the possible variability in multiple online prognostic tools in a single disease. To demonstrate the variability in survival predictions across online prognostic tools, we applied a single validation dataset to three online melanoma prognostic tools. Data on melanoma patients treated at Memorial Sloan Kettering Cancer Center between 2000 and 2014 were retrospectively collected. Calibration was assessed using calibration plots and discrimination was assessed using the C-index. In this demonstration project, we found important differences across the three models that led to variability in individual patients' predicted survival across the tools, especially in the lower range of predictions. In a validation test using a single-institution data set, calibration and discrimination varied across the three models. This study underscores the potential variability both within and across online tools, and highlights the importance of using methodological rigor when developing a prognostic model that will be made publicly available online. The results also reinforce that careful development and thoughtful interpretation, including understanding a given tool's limitations, are required in order for online prognostic tools that provide survival predictions to be a useful resource for both patients and clinicians.

  10. DEEP--a tool for differential expression effector prediction.

    Science.gov (United States)

    Degenhardt, Jost; Haubrock, Martin; Dönitz, Jürgen; Wingender, Edgar; Crass, Torsten

    2007-07-01

    High-throughput methods for measuring transcript abundance, like SAGE or microarrays, are widely used for determining differences in gene expression between different tissue types, dignities (normal/malignant) or time points. Further analysis of such data frequently aims at the identification of gene interaction networks that form the causal basis for the observed properties of the systems under examination. To this end, it is usually not sufficient to rely on the measured gene expression levels alone; rather, additional biological knowledge has to be taken into account in order to generate useful hypotheses about the molecular mechanism leading to the realization of a certain phenotype. We present a method that combines gene expression data with biological expert knowledge on molecular interaction networks, as described by the TRANSPATH database on signal transduction, to predict additional--and not necessarily differentially expressed--genes or gene products which might participate in processes specific for either of the examined tissues or conditions. In a first step, significance values for over-expression in tissue/condition A or B are assigned to all genes in the expression data set. Genes with a significance value exceeding a certain threshold are used as starting points for the reconstruction of a graph with signaling components as nodes and signaling events as edges. In a subsequent graph traversal process, again starting from the previously identified differentially expressed genes, all encountered nodes 'inherit' all their starting nodes' significance values. In a final step, the graph is visualized, the nodes being colored according to a weighted average of their inherited significance values. 
Each node's, or sub-network's, predominant color, ranging from green (significant for tissue/condition A) over yellow (not significant for either tissue/condition) to red (significant for tissue/condition B), thus gives an immediate visual clue on which molecules
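The propagation step described above, in which every node reachable from a differentially expressed starting node 'inherits' that node's significance value and is finally colored by an average (unweighted here, for simplicity), can be sketched as a breadth-first traversal over a toy graph; node names and seed values are hypothetical, not TRANSPATH content:

```python
from collections import deque

# Toy signalling graph (edges point downstream). Node names and seed
# significance values are hypothetical -- not TRANSPATH content.
edges = {"geneA": ["kinase1"], "geneB": ["kinase1", "kinase2"],
         "kinase1": ["tf1"], "kinase2": ["tf1"], "tf1": []}

# Seeds: differentially expressed genes; the sign encodes the condition
# (negative = over-expressed in A, positive = over-expressed in B).
seeds = {"geneA": -1.0, "geneB": 0.8}

inherited = {n: [] for n in edges}
for seed, value in seeds.items():
    seen, queue = {seed}, deque([seed])
    while queue:                      # breadth-first traversal from the seed
        node = queue.popleft()
        inherited[node].append(value)
        for nxt in edges[node]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)

# Colour score per reached node: average of all inherited seed values.
score = {n: sum(v) / len(v) for n, v in inherited.items() if v}
print(round(score["tf1"], 2))   # → -0.1  (inherits both -1.0 and 0.8)
```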

  11. A systematic review on popularity, application and characteristics of protein secondary structure prediction tools.

    Science.gov (United States)

    Kashani-Amin, Elaheh; Tabatabaei-Malazy, Ozra; Sakhteman, Amirhossein; Larijani, Bagher; Ebrahim-Habibi, Azadeh

    2018-02-27

    Prediction of proteins' secondary structure is one of the major steps in the generation of homology models. These models provide structural information which is used to design suitable ligands for potential medicinal targets. However, selecting a proper tool among multiple secondary structure prediction (SSP) options is challenging. The current study is an insight into currently favored methods and tools, within various contexts. A systematic review was performed for comprehensive access to recent (2013-2016) studies which used or recommended protein SSP tools. Three databases, Web of Science, PubMed and Scopus, were systematically searched, and 99 out of 209 studies were finally found eligible for data extraction. Four categories of applications for the 59 retrieved SSP tools were: (I) prediction of structural features of a given sequence, (II) evaluation of a method, (III) providing input for a new SSP method and (IV) integrating an SSP tool as a component of a program. PSIPRED was found to be the most popular tool in all four categories. JPred and tools utilizing the PHD (Profile network from HeiDelberg) method occupied second and third places of popularity in categories I and II. JPred was only found in the first two categories, while PHD was present in three fields. This study provides a comprehensive insight into the recent usage of SSP tools, which could be helpful for selecting a proper tool. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  12. Can clinical prediction tools predict the need for computed tomography in blunt abdominal trauma? A systematic review.

    Science.gov (United States)

    Sharples, Alistair; Brohi, Karim

    2016-08-01

    Blunt abdominal trauma is a common reason for admission to the Emergency Department. Early detection of injuries is an important goal but is often not straightforward, as physical examination alone is not a good predictor of serious injury. Computed tomography (CT) has become the primary method for assessing the stable trauma patient. It has high sensitivity and specificity, but there remains concern regarding the long-term consequences of high doses of radiation. Therefore an accurate and reliable method of assessing which patients are at higher risk of injury, and hence require a CT, would be clinically useful. We performed a systematic review to investigate the use of clinical prediction tools (CPTs) for the identification of abdominal injuries in patients suffering blunt trauma. A literature search was performed using Medline, Embase, The Cochrane Library and NHS Evidence up to August 2014. English-language, prospective and retrospective studies were included if they derived, validated or assessed a CPT aimed at identifying intra-abdominal injuries or the need for intervention to treat an intra-abdominal injury after blunt trauma. Methodological quality was assessed using a 14-point scale. Performance was assessed predominantly by sensitivity. Seven relevant studies were identified. All studies were derivative studies and no CPT was validated in a separate study. There were large differences in the study design, composition of the CPTs, the outcomes analysed and the methodological quality of the included studies. Sensitivities ranged from 86 to 100%. The highest-performing CPT had a lower limit of the 95% CI of 95.8% and was of high methodological quality (11 of 14). Had this rule been applied to the population, then 25.1% of patients would have avoided a CT scan. Seven CPTs were identified, of varying designs and methodological quality. All demonstrate relatively high sensitivity, with some achieving very high sensitivity whilst still managing to reduce the number of CTs.

  13. Clinical Prediction Model and Tool for Assessing Risk of Persistent Pain After Breast Cancer Surgery

    DEFF Research Database (Denmark)

    Meretoja, Tuomo J; Andersen, Kenneth Geving; Bruce, Julie

    2017-01-01

    are missing. The aim was to develop a clinically applicable risk prediction tool. Methods The prediction models were developed and tested using three prospective data sets from Finland (n = 860), Denmark (n = 453), and Scotland (n = 231). Prediction models for persistent pain of moderate to severe intensity......), high body mass index ( P = .039), axillary lymph node dissection ( P = .008), and more severe acute postoperative pain intensity at the seventh postoperative day ( P = .003) predicted persistent pain in the final prediction model, which performed well in the Danish (ROC-AUC, 0.739) and Scottish (ROC......-AUC, 0.740) cohorts. At the 20% risk level, the model had 32.8% and 47.4% sensitivity and 94.4% and 82.4% specificity in the Danish and Scottish cohorts, respectively. Conclusion Our validated prediction models and an online risk calculator provide clinicians and researchers with a simple tool to screen...

  14. Validation of the online prediction tool PREDICT v. 2.0 in the Dutch breast cancer population

    NARCIS (Netherlands)

    Maaren, M.C. van; Steenbeek, C.D. van; Pharoah, P.D.; Witteveen, A.; Sonke, G.S.; Strobbe, L.J.A.; Poortmans, P.; Siesling, S.

    2017-01-01

    BACKGROUND: PREDICT version 2.0 is increasingly used to estimate prognosis in breast cancer. This study aimed to validate this tool in specific prognostic subgroups in the Netherlands. METHODS: All operated women with non-metastatic primary invasive breast cancer, diagnosed in 2005, were selected

  15. Validation of the online prediction tool PREDICT v. 2.0 in the Dutch breast cancer population

    NARCIS (Netherlands)

    van Maaren, M. C.; van Steenbeek, C. D.; Pharoah, P. D.P.; Witteveen, A.; Sonke, Gabe S.; Strobbe, L.J.A.; Poortmans, P.M.P.; Siesling, S.

    2017-01-01

    Background PREDICT version 2.0 is increasingly used to estimate prognosis in breast cancer. This study aimed to validate this tool in specific prognostic subgroups in the Netherlands. Methods All operated women with non-metastatic primary invasive breast cancer, diagnosed in 2005, were selected from

  16. Influence of Genotype on Warfarin Maintenance Dose Predictions Produced Using a Bayesian Dose Individualization Tool.

    Science.gov (United States)

    Saffian, Shamin M; Duffull, Stephen B; Roberts, Rebecca L; Tait, Robert C; Black, Leanne; Lund, Kirstin A; Thomson, Alison H; Wright, Daniel F B

    2016-12-01

    A previously established Bayesian dosing tool for warfarin was found to produce biased maintenance dose predictions. In this study, we aimed (1) to determine whether the biased warfarin dose predictions previously observed could be replicated in a new cohort of patients from 2 different clinical settings, (2) to explore the influence of CYP2C9 and VKORC1 genotype on the predictive performance of the Bayesian dosing tool, and (3) to determine whether the previous population used to develop the kinetic-pharmacodynamic model underpinning the Bayesian dosing tool was sufficiently different from the test (posterior) population to account for the biased dose predictions. The warfarin maintenance doses for 140 patients were predicted using the dosing tool and compared with the observed maintenance dose. The impact of genotype was assessed by predicting maintenance doses with prior parameter values known to be altered by genetic variability (e.g., EC50 for VKORC1 genotype). The prior population was evaluated by fitting the published kinetic-pharmacodynamic model, which underpins the Bayesian tool, to the observed data using NONMEM and comparing the model parameter estimates with published values. The Bayesian tool produced positively biased dose predictions in the new cohort of patients (mean prediction error [95% confidence interval]: 0.32 mg/d [0.14-0.5]). The bias was only observed in patients requiring ≥7 mg/d. The direction and magnitude of the observed bias were not influenced by genotype. The prior model provided a good fit to our data, which suggests that the bias was not caused by different prior and posterior populations. Maintenance doses for patients requiring ≥7 mg/d were overpredicted. The bias was not due to the influence of genotype, nor was it related to differences between the prior and posterior populations. There is a need for a more mechanistic model that captures the warfarin dose-response relationship at higher warfarin doses.
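The bias metric quoted above, a mean prediction error with a 95% confidence interval, is straightforward to compute from predicted and observed doses; a sketch with invented dose pairs:

```python
import math

# Invented (predicted, observed) warfarin maintenance doses in mg/day --
# illustrative only, not the study's patient data.
pairs = [(5.5, 5.0), (6.2, 6.0), (7.8, 7.0), (8.4, 8.0), (9.1, 8.5)]

errors = [pred - obs for pred, obs in pairs]
n = len(errors)
me = sum(errors) / n                              # mean prediction error
sd = math.sqrt(sum((e - me) ** 2 for e in errors) / (n - 1))
half = 1.96 * sd / math.sqrt(n)                   # normal-approximation 95% CI
print(round(me, 2), (round(me - half, 2), round(me + half, 2)))
```

A positive mean prediction error, as in this toy example, indicates systematic over-prediction, which matches the direction of bias the study reports.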

  17. Application of the PredictAD Software Tool to Predict Progression in Patients with Mild Cognitive Impairment

    DEFF Research Database (Denmark)

    Simonsen, Anja H; Mattila, Jussi; Hejl, Anne-Mette

    2012-01-01

    of incremental data presentation using the software tool. A 5th phase was done with all available patient data presented on paper charts. Classifications by the clinical raters were compared to the clinical diagnoses made by the Alzheimer's Disease Neuroimaging Initiative investigators. Results: A statistical...... significant trend (p classification accuracy (from 62.6 to 70.0%) was found when using the PredictAD tool during the stepwise procedure. When the same data were presented on paper, classification accuracy of the raters dropped significantly from 70.0 to 63.2%. Conclusion: Best...... classification accuracy was achieved by the clinical raters when using the tool for decision support, suggesting that the tool can add value in diagnostic classification when large amounts of heterogeneous data are presented....

  18. Predicting tool life in turning operations using neural networks and image processing

    Science.gov (United States)

    Mikołajczyk, T.; Nowicki, K.; Bustillo, A.; Yu Pimenov, D.

    2018-05-01

    A two-step method is presented for the automatic prediction of tool life in turning operations. First, experimental data are collected for three cutting edges under the same constant processing conditions. In these experiments, the tool wear parameter, VB, is measured with conventional methods, and the same parameter is estimated using Neural Wear, a customized software package that combines flank wear image recognition and Artificial Neural Networks (ANNs). Second, an ANN model of tool life is trained with the data collected from the first two cutting edges, and the resulting model is evaluated on two different subsets for the third cutting edge: the first subset is obtained from the direct measurement of tool wear and the second from the Neural Wear software that estimates tool wear using edge images. Although the fully automated solution (Neural Wear software for tool wear recognition plus the ANN model of tool life prediction) presented a slightly higher error than the direct measurements, it remained within the same range and can meet industrial requirements. These results confirm that the combination of image recognition software and ANN modelling could potentially be developed into a useful industrial tool for low-cost estimation of tool life in turning operations.
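
    The abstract measures tool life from the flank wear parameter VB. As an illustration only (not the Neural Wear or ANN method itself), tool life is often taken as the cutting time at which VB reaches a wear limit such as 0.3 mm; a sketch with hypothetical measurements:

    ```python
    def tool_life_from_wear(times_min, vb_mm, vb_limit=0.3):
        """Estimate tool life as the time at which flank wear VB first
        reaches the wear limit (0.3 mm is a common criterion), using
        linear interpolation between successive measurements."""
        points = list(zip(times_min, vb_mm))
        for (t0, w0), (t1, w1) in zip(points, points[1:]):
            if w0 < vb_limit <= w1:
                return t0 + (vb_limit - w0) * (t1 - t0) / (w1 - w0)
        return None  # limit not reached within the measured interval

    # Hypothetical flank wear VB (mm) against cutting time (min)
    times = [0, 5, 10, 15, 20]
    wear = [0.00, 0.08, 0.15, 0.24, 0.36]
    life = tool_life_from_wear(times, wear)  # 17.5 min
    ```

    In the paper's setup, the `wear` values would come either from direct measurement or from the image-based estimate, and the comparison of the two resulting lives quantifies the automated pipeline's error.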

  19. Automatic generation of bioinformatics tools for predicting protein-ligand binding sites.

    Science.gov (United States)

    Komiyama, Yusuke; Banno, Masaki; Ueki, Kokoro; Saad, Gul; Shimizu, Kentaro

    2016-03-15

    Predictive tools that model protein-ligand binding on demand are needed to promote ligand research in an innovative drug-design environment. However, it takes considerable time and effort to develop predictive tools that can be applied to individual ligands. An automated production pipeline that can rapidly and efficiently develop user-friendly protein-ligand binding predictive tools would be useful. We developed a system for automatically generating protein-ligand binding predictions. Implementation of this system in a pipeline of Semantic Web technique-based web tools will allow users to specify a ligand and receive the tool within 0.5-1 day. We demonstrated high prediction accuracy for three machine learning algorithms and eight ligands. The source code and web application are freely available for download at http://utprot.net. They are implemented in Python and supported on Linux. Contact: shimizu@bi.a.u-tokyo.ac.jp. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.

  20. Towards early software reliability prediction for computer forensic tools (case study).

    Science.gov (United States)

    Abu Talib, Manar

    2016-01-01

    Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component-based system. It is used, for instance, to analyze the reliability of the state machines of real-time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Every part of the computer forensic tool is linked to a discrete-time Markov chain; a probabilistic analysis by Markov chains can then be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
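
    The core idea, linking each component to a discrete-time Markov chain and computing reliability as the probability of successful absorption, can be sketched as follows. The three-stage pipeline, state names and transition probabilities below are invented for illustration, not taken from the paper:

    ```python
    def reliability(transitions, start, success, failure, iters=200):
        """Probability of eventual absorption in the 'success' state of a
        discrete-time Markov chain, estimated by iterating the transition
        map on the state distribution until the mass is absorbed."""
        dist = {start: 1.0}
        for _ in range(iters):
            new = {}
            for state, p in dist.items():
                if state in (success, failure):
                    # Absorbing states keep their probability mass.
                    new[state] = new.get(state, 0.0) + p
                    continue
                for nxt, tp in transitions[state].items():
                    new[nxt] = new.get(nxt, 0.0) + p * tp
            dist = new
        return dist.get(success, 0.0)

    # Hypothetical 3-component forensic-tool pipeline: each component
    # succeeds with its own probability, otherwise the whole tool fails.
    transitions = {
        "acquire": {"parse": 0.99, "FAIL": 0.01},
        "parse":   {"report": 0.97, "FAIL": 0.03},
        "report":  {"OK": 0.98, "FAIL": 0.02},
    }
    r = reliability(transitions, "acquire", "OK", "FAIL")
    ```

    For a purely serial topology the result is simply the product of component reliabilities; the Markov formulation pays off once retries, branches or alternative topologies are compared.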

  1. Musite, a tool for global prediction of general and kinase-specific phosphorylation sites.

    Science.gov (United States)

    Gao, Jianjiong; Thelen, Jay J; Dunker, A Keith; Xu, Dong

    2010-12-01

    Reversible protein phosphorylation is one of the most pervasive post-translational modifications, regulating diverse cellular processes in various organisms. High throughput experimental studies using mass spectrometry have identified many phosphorylation sites, primarily from eukaryotes. However, the vast majority of phosphorylation sites remain undiscovered, even in well studied systems. Because mass spectrometry-based experimental approaches for identifying phosphorylation events are costly, time-consuming, and biased toward abundant proteins and proteotypic peptides, in silico prediction of phosphorylation sites is potentially a useful alternative strategy for whole proteome annotation. Because of various limitations, current phosphorylation site prediction tools were not well designed for comprehensive assessment of proteomes. Here, we present a novel software tool, Musite, specifically designed for large scale predictions of both general and kinase-specific phosphorylation sites. We collected phosphoproteomics data in multiple organisms from several reliable sources and used them to train prediction models by a comprehensive machine-learning approach that integrates local sequence similarities to known phosphorylation sites, protein disorder scores, and amino acid frequencies. Application of Musite on several proteomes yielded tens of thousands of phosphorylation site predictions at a high stringency level. Cross-validation tests show that Musite achieves some improvement over existing tools in predicting general phosphorylation sites, and it is at least comparable with those for predicting kinase-specific phosphorylation sites. In Musite V1.0, we have trained general prediction models for six organisms and kinase-specific prediction models for 13 kinases or kinase families. Although the current pretrained models were not correlated with any particular cellular conditions, Musite provides a unique functionality for training customized prediction models

  2. Which screening tools can predict injury to the lower extremities in team sports?: a systematic review.

    Science.gov (United States)

    Dallinga, Joan M; Benjaminse, Anne; Lemmink, Koen A P M

    2012-09-01

    Injuries to lower extremities are common in team sports such as soccer, basketball, volleyball, football and field hockey. Considering personal grief, disabling consequences and high costs caused by injuries to lower extremities, the importance for the prevention of these injuries is evident. From this point of view it is important to know which screening tools can identify athletes who are at risk of injury to their lower extremities. The aim of this article is to determine the predictive values of anthropometric and/or physical screening tests for injuries to the leg, anterior cruciate ligament (ACL), knee, hamstring, groin and ankle in team sports. A systematic review was conducted in MEDLINE (1966 to September 2011), EMBASE (1989 to September 2011) and CINAHL (1982 to September 2011). Based on inclusion criteria defined a priori, titles, abstracts and full texts were analysed to find relevant studies. The analysis showed that different screening tools can be predictive for injuries to the knee, ACL, hamstring, groin and ankle. For injuries in general there is some support in the literature to suggest that general joint laxity is a predictive measure for leg injuries. The anterior right/left reach distance >4 cm and the composite reach distance injuries. Furthermore, an increasing age, a lower hamstring/quadriceps (H : Q) ratio and a decreased range of motion (ROM) of hip abduction may predict the occurrence of leg injuries. Hyperextension of the knee, side-to-side differences in anterior-posterior knee laxity and differences in knee abduction moment between both legs are suggested to be predictive tests for sustaining an ACL injury and height was a predictive screening tool for knee ligament injuries. There is some evidence that when age increases, the probability of sustaining a hamstring injury increases. Debate exists in the analysed literature regarding measurement of the flexibility of the hamstring as a predictive screening tool, as well as using the H

  3. Prediction of ingredient quality and the effect of a combination of xylanase, amylase, protease and phytase in the diets of broiler chicks. 2. Energy and nutrient utilisation.

    Science.gov (United States)

    Cowieson, A J; Singh, D N; Adeola, O

    2006-08-01

    1. In order to investigate the effects of xylanase, amylase, protease and phytase in the diets of broiler chickens containing graded concentrations of metabolisable energy (ME), two 42-d experiments were conducted using a total of 2208 broiler chicks (8 treatments with 12 replicate pens in each experiment). 2. Four diets including one positive and three negative control diets were used. Three maize/soybean meal-based negative control (NC) diets were formulated to be identical in available phosphorus (P), calcium (Ca) and amino acids but NC1 contained approximately 0.17 MJ/kg less ME than NC2 and approximately 0.34 MJ/kg less ME than NC3. A positive control (PC) was fed for comparison and was formulated to be adequate in all nutrients, providing approximately 0.63 MJ/kg ME, 0.13% available P, 0.12% Ca and 1 to 2% amino acids more than NC1. 3. The reduction in nutrient density between NC1 and PC was determined using the ingredient quality models Avicheck™ Corn and Phycheck™ that can predict the response to exogenous enzymes in maize/soybean meal-based broiler diets. Supplementation of each diet with or without a cocktail of xylanase, amylase, protease and phytase gave a total of 8 dietary treatments in a 4 x 2 factorial arrangement. The same treatments and diet designs were used in both experiments but conducted in different locations using different batches of maize, soybean meal and minor ingredients. 4. In both experiments, digestibility was improved by the addition of exogenous enzymes, particularly those for P, Ca and certain amino acids. In addition, the supplementation of the PC with enzymes elicited a positive response indicating that over-the-top addition of xylanase, amylase, protease and phytase may offer a nutritionally and economically viable alternative to feed cost reduction. 5.
It can be concluded that the digestibility of nutrients by broilers fed on maize/soybean meal-based diets can be improved by the use of a combination of xylanase

  4. Atomic Oxygen Erosion Yield Predictive Tool for Spacecraft Polymers in Low Earth Orbit

    Science.gov (United States)

    Banks, Bruce A.; de Groh, Kim K.; Backus, Jane A.

    2008-01-01

    A predictive tool was developed to estimate the low Earth orbit (LEO) atomic oxygen erosion yield of polymers based on the results of the Polymer Erosion and Contamination Experiment (PEACE) Polymers experiment flown as part of the Materials International Space Station Experiment 2 (MISSE 2). The MISSE 2 PEACE experiment accurately measured the erosion yield of a wide variety of polymers and pyrolytic graphite. The 40 different materials tested were selected specifically to represent a variety of polymers used in space as well as a wide variety of polymer chemical structures. The resulting erosion yield data were used to develop a predictive tool which utilizes chemical structure and physical properties of polymers that can be measured in ground laboratory testing to predict the in-space atomic oxygen erosion yield of a polymer. The properties include chemical structure, bonding information, density and ash content. The resulting predictive tool has a correlation coefficient of 0.914 when compared with actual MISSE 2 space data for 38 polymers and pyrolytic graphite. The intent of the predictive tool is to enable estimates of atomic oxygen erosion yields for new polymers without requiring expensive and time-consuming in-space testing.
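
    The quoted correlation coefficient of 0.914 is a plain Pearson correlation between predicted and flight-measured erosion yields; a sketch with invented values (not the MISSE 2 data):

    ```python
    import math

    def pearson_r(xs, ys):
        """Pearson correlation coefficient between two paired samples."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
        sy = math.sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    # Hypothetical predicted vs. measured erosion yields (x 1e-24 cm^3/atom)
    predicted = [3.0, 1.4, 4.1, 0.05, 2.2]
    measured = [2.8, 1.5, 3.9, 0.04, 2.5]
    r = pearson_r(predicted, measured)
    ```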

  5. Development of METAL-ACTIVE SITE and ZINCCLUSTER tool to predict active site pockets.

    Science.gov (United States)

    Ajitha, M; Sundar, K; Arul Mugilan, S; Arumugam, S

    2018-03-01

    The advent of whole genome sequencing has led to an increasing number of proteins with known amino acid sequences. Despite many efforts, the number of proteins with resolved three-dimensional structures is still low. One of the challenging tasks structural biologists face is the prediction of the interaction of metal ions with any protein for which the structure is unknown. Based on the information available in the Protein Data Bank, a site (METALACTIVE INTERACTION) has been generated which displays information for significant high-preferential and low-preferential combinations of endogenous ligands for 49 metal ions. Users can also gain information about the residues present in the first and second coordination spheres, as these play a major role in maintaining the structure and function of metalloproteins in biological systems. In this paper, a novel computational tool (ZINCCLUSTER) is developed, which can predict the zinc metal binding sites of proteins even if only the primary sequence is known. The purpose of this tool is to predict the active site cluster of an uncharacterized protein based on its primary sequence or a 3D structure. The tool can predict amino acids interacting with a metal or vice versa. The tool is based on the occurrence of significant triplets and, in testing, achieved higher prediction accuracy than other available techniques. © 2017 Wiley Periodicals, Inc.

  6. Accurate Prediction of Motor Failures by Application of Multi CBM Tools: A Case Study

    Science.gov (United States)

    Dutta, Rana; Singh, Veerendra Pratap; Dwivedi, Jai Prakash

    2018-02-01

    Motor failures are very difficult to predict accurately with a single condition-monitoring tool, as the electrical and mechanical systems are closely related. Electrical problems, such as phase unbalance or stator winding insulation failures, can at times lead to vibration problems, while mechanical failures, such as bearing failure, lead to rotor eccentricity. In this case study of a 550 kW blower motor, a rotor bar crack was detected by current signature analysis and vibration monitoring confirmed the finding. In later months, in a similar motor, vibration monitoring predicted a bearing failure and current signature analysis confirmed it. In both cases, after dismantling the motors, the predictions were found to be accurate. In this paper we discuss the accurate prediction of motor failures through the use of multiple condition-monitoring tools, with two case studies.

  7. Development and Validation of a Multidisciplinary Tool for Accurate and Efficient Rotorcraft Noise Prediction (MUTE)

    Science.gov (United States)

    Liu, Yi; Anusonti-Inthra, Phuriwat; Diskin, Boris

    2011-01-01

    A physics-based, systematically coupled, multidisciplinary prediction tool (MUTE) for rotorcraft noise was developed and validated with a wide range of flight configurations and conditions. MUTE is an aggregation of multidisciplinary computational tools that accurately and efficiently model the physics of the source of rotorcraft noise, and predict the noise at far-field observer locations. It uses systematic coupling approaches among multiple disciplines including Computational Fluid Dynamics (CFD), Computational Structural Dynamics (CSD), and high-fidelity acoustics. Within MUTE, advanced high-order CFD tools are used around the rotor blade to predict the transonic flow (shock wave) effects, which generate the high-speed impulsive noise. Predictions of the blade-vortex interaction noise in low-speed flight are also improved by using the Particle Vortex Transport Method (PVTM), which preserves the wake flow details required for blade/wake and fuselage/wake interactions. The accuracy of the source noise prediction is further improved by utilizing a coupling approach between CFD and CSD, so that the effects of key structural dynamics, elastic blade deformations, and trim solutions are correctly represented in the analysis. The blade loading information and/or the flow field parameters around the rotor blade predicted by the CFD/CSD coupling approach are used to predict the acoustic signatures at far-field observer locations with a high-fidelity noise propagation code (WOPWOP3). The predicted results from the MUTE tool for rotor blade aerodynamic loading and far-field acoustic signatures are compared and validated against a variety of experimental data sets, including UH-60A data, DNW test data and HART II test data.

  8. On-Line, Self-Learning, Predictive Tool for Determining Payload Thermal Response

    Science.gov (United States)

    Jen, Chian-Li; Tilwick, Leon

    2000-01-01

    This paper will present the results of a joint ManTech / Goddard R&D effort, currently under way, to develop and test a computer-based, on-line, predictive simulation model for use by facility operators to predict the thermal response of a payload during thermal vacuum testing. Thermal response was identified as an area that could benefit from the algorithms developed by Dr. Jen for complex computer simulations. Most thermal vacuum test setups are unique since no two payloads have the same thermal properties. This requires that operators depend on their past experience to conduct the test, which requires time for them to learn how the payload responds while at the same time limiting any risk of exceeding hot or cold temperature limits. The predictive tool being developed is intended to be used with the new Thermal Vacuum Data System (TVDS) developed at Goddard for the Thermal Vacuum Test Operations group. This model can learn the thermal response of the payload by reading a few data points from the TVDS, accepting the payload's current temperature as the initial condition for prediction. The model can then be used as a predictive tool to estimate future payload temperatures according to a predetermined shroud temperature profile. If the prediction error is too large, the model can be asked to re-learn the new situation on-line in real time and give a new prediction. Based on some preliminary tests, we feel this predictive model can forecast the payload temperature of the entire test cycle within 5 degrees Celsius after it has learned 3 times during the beginning of the test. The tool will allow the operator to run "what-if" experiments to decide on the best shroud temperature set-point control strategy. This tool will save money by minimizing guesswork and optimizing transitions, as well as making the testing process safer and easier to conduct.

  9. The impact of frailty on healthcare utilisation in Ireland: evidence from the Irish longitudinal study on ageing.

    Science.gov (United States)

    Roe, Lorna; Normand, Charles; Wren, Maev-Ann; Browne, John; O'Halloran, Aisling M

    2017-09-05

    To examine the impact of frailty on medical and social care utilisation among the Irish community-dwelling older population, to inform strategies of integrated care for older people with complex needs. Participants aged ≥65 years from the Irish Longitudinal Study on Ageing (TILDA), representative of the Irish community-dwelling older population, were analysed (n = 3507). The frailty index was used to examine patterns of utilisation across medical and social care services. Multivariate logistic and negative binomial regression models were employed to examine the impact of frailty on service utilisation outcomes after controlling for other factors. The prevalence of frailty and pre-frailty was 24% (95% CI: 23, 26%) and 45% (95% CI: 43, 47%) respectively. Frailty was a significant predictor of utilisation of most social care and medical care services after controlling for the main correlates of frailty and observed individual effects. Frailty predicts utilisation of many different types of healthcare services, rendering it a useful risk stratification tool for targeting strategies of integrated care. The pattern of care is predominantly medical, as few of the frail older population use social care, prompting questions about sub-groups of the frail older population with unmet care needs.
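
    A frailty index of the kind used here is typically computed as the proportion of measured health deficits that are present. A minimal sketch, with the item count and the category cut-offs (0.10 and 0.25) taken as commonly used conventions rather than values from TILDA itself:

    ```python
    def frailty_index(deficits):
        """Frailty index: proportion of measured health deficits present.
        Cut-offs below (0.25 frail, 0.10 pre-frail) are common conventions,
        assumed here for illustration."""
        fi = sum(deficits) / len(deficits)
        if fi >= 0.25:
            category = "frail"
        elif fi >= 0.10:
            category = "pre-frail"
        else:
            category = "non-frail"
        return fi, category

    # Hypothetical 40-item deficit screen: 1 = deficit present, 0 = absent
    deficits = [1] * 11 + [0] * 29
    fi, cat = frailty_index(deficits)  # 11/40 = 0.275 -> "frail"
    ```

    The index's appeal as a risk stratification tool is exactly this simplicity: it needs only a count of deficits, not a particular instrument.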

  10. Prediction and Control of Cutting Tool Vibration in Cnc Lathe with Anova and Ann

    Directory of Open Access Journals (Sweden)

    S. S. Abuthakeer

    2011-06-01

    Full Text Available Machining is a complex process in which many variables can adversely affect the desired results. Among them, cutting tool vibration is the most critical phenomenon, influencing the dimensional precision of the machined components, the functional behaviour of the machine tools and the life of the cutting tool. In a machining operation, cutting tool vibrations are mainly influenced by cutting parameters such as cutting speed, depth of cut and tool feed rate. In this work, the cutting tool vibrations are controlled using a damping pad made of Neoprene. Experiments were conducted in a CNC lathe where the tool holder was supported with and without the damping pad. The cutting tool vibration signals were collected through a data acquisition system supported by LabVIEW software. To increase the robustness and reliability of the experiments, a full factorial experimental design was used. The experimental data collected were tested with analysis of variance (ANOVA) to understand the influences of the cutting parameters, and empirical models were developed using ANOVA. Experimental studies and data analysis were performed to validate the proposed damping system. A multilayer perceptron neural network model was constructed with a feed-forward back-propagation algorithm using the acquired data. On completion of the experimental tests, the ANN was used to validate the results obtained and also to predict the behaviour of the system under any cutting condition within the operating range. The onsite tests show that the proposed system reduces the vibration of the cutting tool to a great extent.
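
    The ANOVA step above reduces to comparing between-group and within-group variance for each cutting parameter. A one-way sketch with invented vibration amplitudes (the study used a full factorial design across several parameters; this is a simplification):

    ```python
    def one_way_anova_F(groups):
        """One-way ANOVA F statistic: ratio of between-group to
        within-group mean squares. `groups` is a list of lists of
        measurements, e.g. vibration amplitude at each level of one
        cutting parameter. A large F suggests the parameter matters."""
        k = len(groups)
        n = sum(len(g) for g in groups)
        grand = sum(sum(g) for g in groups) / n
        ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
        ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
        ms_between = ss_between / (k - 1)
        ms_within = ss_within / (n - k)
        return ms_between / ms_within

    # Hypothetical vibration amplitudes (um) at three cutting speeds
    groups = [[12.1, 11.8, 12.4], [14.0, 13.6, 14.3], [17.2, 16.8, 17.5]]
    F = one_way_anova_F(groups)
    ```

    The F value is then compared against the F distribution's critical value for (k-1, n-k) degrees of freedom to decide significance.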

  11. AllerTool: a web server for predicting allergenicity and allergic cross-reactivity in proteins.

    Science.gov (United States)

    Zhang, Zong Hong; Koh, Judice L Y; Zhang, Guang Lan; Choo, Khar Heng; Tammi, Martti T; Tong, Joo Chuan

    2007-02-15

    Assessment of potential allergenicity and patterns of cross-reactivity is necessary whenever novel proteins are introduced into the human food chain. Current bioinformatic methods in allergology focus mainly on the prediction of allergenic proteins, with no information on cross-reactivity patterns among known allergens. In this study, we present AllerTool, a web server with essential tools for the assessment of predicted as well as published cross-reactivity patterns of allergens. The analysis tools include graphical representation of allergen cross-reactivity information; a local sequence comparison tool that displays information on known cross-reactive allergens; a sequence similarity search tool for assessment of cross-reactivity in accordance with FAO/WHO Codex Alimentarius guidelines; and a method based on support vector machines (SVMs). Ten-fold cross-validation results showed that the area under the receiver operating characteristic curve (AROC) of the SVM models is 0.90, with 86.00% sensitivity (SE) at a specificity (SP) of 86.00%. AllerTool is freely available at http://research.i2r.a-star.edu.sg/AllerTool/.
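
    The FAO/WHO Codex sequence-similarity criterion mentioned above is commonly applied as greater than 35% identity over a window of 80 amino acids. A sketch of that screen, ungapped for simplicity (the guideline is normally applied to gapped alignments, so treat this as an approximation, not AllerTool's implementation):

    ```python
    def codex_cross_reactive(query, allergen, window=80, threshold=0.35):
        """Sliding-window identity screen in the spirit of the FAO/WHO
        Codex guideline: flag the pair if any 80-residue window exceeds
        35% identity. Ungapped comparison only -- a simplification."""
        for i in range(len(query) - window + 1):
            for j in range(len(allergen) - window + 1):
                matches = sum(q == a for q, a in
                              zip(query[i:i + window], allergen[j:j + window]))
                if matches / window > threshold:
                    return True
        return False

    # Toy sequences: 30/80 = 37.5% identity exceeds the 35% threshold
    flagged = codex_cross_reactive("A" * 80, "A" * 30 + "C" * 50)
    ```

    The guideline also flags short exact matches (a run of identical contiguous residues), which a full implementation would check separately.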

  12. Predictive technologies: Can smart tools augment the brain’s predictive abilities?

    Directory of Open Access Journals (Sweden)

    Giovanni ePezzulo

    2016-04-01

    Full Text Available The ability to look into the future – namely, the capacity to anticipate future states of the environment or of the body – represents a fundamental function of human (and animal) brains. A goalkeeper who tries to guess the ball's direction; a chess player who attempts to anticipate the opponent's next move; or a man in love who tries to calculate the chances of her saying yes – in all these cases, people are simulating possible future states of the world in order to maximize the success of their decisions or actions. Research in neuroscience is showing that our ability to predict the behaviour of physical or social phenomena is largely dependent on the brain's ability to integrate current and past information to generate (probabilistic) simulations of the future. But could predictive processing be augmented using advanced technologies? In this contribution, we discuss how computational technologies may be used to support, facilitate or enhance the prediction of future events, by considering exemplificative scenarios across different domains, from simpler sensorimotor decisions to more complex cognitive tasks. We also examine the key scientific and technical challenges that must be faced to turn this vision into reality.

  13. PROSPER: an integrated feature-based tool for predicting protease substrate cleavage sites.

    Directory of Open Access Journals (Sweden)

    Jiangning Song

    Full Text Available The ability to catalytically cleave protein substrates after synthesis is fundamental for all forms of life. Accordingly, site-specific proteolysis is one of the most important post-translational modifications. The key to understanding the physiological role of a protease is to identify its natural substrate(s. Knowledge of the substrate specificity of a protease can dramatically improve our ability to predict its target protein substrates, but this information must be utilized in an effective manner in order to efficiently identify protein substrates by in silico approaches. To address this problem, we present PROSPER, an integrated feature-based server for in silico identification of protease substrates and their cleavage sites for twenty-four different proteases. PROSPER utilizes established specificity information for these proteases (derived from the MEROPS database with a machine learning approach to predict protease cleavage sites by using different, but complementary sequence and structure characteristics. Features used by PROSPER include local amino acid sequence profile, predicted secondary structure, solvent accessibility and predicted native disorder. Thus, for proteases with known amino acid specificity, PROSPER provides a convenient, pre-prepared tool for use in identifying protein substrates for the enzymes. Systematic prediction analysis for the twenty-four proteases thus far included in the database revealed that the features we have included in the tool strongly improve performance in terms of cleavage site prediction, as evidenced by their contribution to performance improvement in terms of identifying known cleavage sites in substrates for these enzymes. In comparison with two state-of-the-art prediction tools, PoPS and SitePrediction, PROSPER achieves greater accuracy and coverage. 
To our knowledge, PROSPER is the first comprehensive server capable of predicting cleavage sites of multiple proteases within a single substrate

  14. Predicting Knowledge Workers' Participation in Voluntary Learning with Employee Characteristics and Online Learning Tools

    Science.gov (United States)

    Hicks, Catherine

    2018-01-01

    Purpose: This paper aims to explore predicting employee learning activity via employee characteristics and usage for two online learning tools. Design/methodology/approach: Statistical analysis focused on observational data collected from user logs. Data are analyzed via regression models. Findings: Findings are presented for over 40,000…

  15. Towards a consensus on datasets and evaluation metrics for developing B-cell epitope prediction tools

    DEFF Research Database (Denmark)

    Greenbaum, Jason A.; Andersen, Pernille; Blythe, Martin

    2007-01-01

    and immunology communities. Improving the accuracy of B-cell epitope prediction methods depends on a community consensus on the data and metrics utilized to develop and evaluate such tools. A workshop, sponsored by the National Institute of Allergy and Infectious Disease (NIAID), was recently held in Washington...

  16. Users' experiences of an emergency department patient admission predictive tool: A qualitative evaluation.

    Science.gov (United States)

    Jessup, Melanie; Crilly, Julia; Boyle, Justin; Wallis, Marianne; Lind, James; Green, David; Fitzgerald, Gerard

    2016-09-01

    Emergency department overcrowding is an increasing issue impacting patients, staff and quality of care, resulting in poor patient and system outcomes. In order to facilitate better management of emergency department resources, a patient admission predictive tool was developed and implemented. Evaluation of the tool's accuracy and efficacy was complemented with a qualitative component that explicated the experiences of users and its impact upon their management strategies, and is the focus of this article. Semi-structured interviews were conducted with 15 pertinent users, including bed managers, after-hours managers, specialty department heads, nurse unit managers and hospital executives. Analysis identified themes of accuracy, facilitating communication and enabling group decision-making. Users generally welcomed the enhanced potential to predict and plan following the incorporation of the patient admission predictive tool into their daily and weekly decision-making processes. They offered astute feedback with regard to their responses when faced with issues of capacity and communication. Participants reported a growing confidence in making informed decisions in a cultural context that is continually moving from reactive to proactive. This information will inform further development of the patient admission predictive tool specifically, and implementation processes generally. © The Author(s) 2015.

  17. RDNAnalyzer: A tool for DNA secondary structure prediction and sequence analysis.

    Science.gov (United States)

    Afzal, Muhammad; Shahid, Ahmad Ali; Shehzadi, Abida; Nadeem, Shahid; Husnain, Tayyab

    2012-01-01

    RDNAnalyzer is an innovative computer-based tool designed for DNA secondary structure prediction and sequence analysis. It can randomly generate a DNA sequence, or users can upload sequences of their own interest in RAW format. It uses and extends the Nussinov dynamic programming algorithm and has various applications for sequence analysis. It predicts the DNA secondary structure and base pairings. It also provides tools for sequence analyses routinely performed by biological scientists, such as DNA replication, reverse complement generation, transcription, translation, and sequence-specific information such as the total number of nucleotide bases and the ATGC base contents along with their respective percentages, as well as a sequence cleaner. RDNAnalyzer is a unique tool developed in Microsoft Visual Studio 2008 using Microsoft Visual C# and Windows Presentation Foundation, and provides a user-friendly environment for sequence analysis. It is freely available at http://www.cemb.edu.pk/sw.html. RDNAnalyzer - Random DNA Analyser; GUI - Graphical user interface; XAML - Extensible Application Markup Language.
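
    The Nussinov dynamic programming algorithm that RDNAnalyzer extends maximizes the number of complementary base pairs. A minimal sketch of the classic recurrence for DNA (A-T and G-C pairs, minimum hairpin loop of 3 bases; RDNAnalyzer's own extensions are not reproduced here):

    ```python
    def nussinov(seq, min_loop=3):
        """Nussinov dynamic programme: maximum number of complementary
        base pairs (A-T, G-C) in a DNA sequence, enforcing at least
        `min_loop` unpaired bases between any paired positions."""
        pairs = {("A", "T"), ("T", "A"), ("G", "C"), ("C", "G")}
        n = len(seq)
        dp = [[0] * n for _ in range(n)]
        for span in range(min_loop + 1, n):        # shorter spans first
            for i in range(n - span):
                j = i + span
                best = dp[i][j - 1]                # case: j unpaired
                for k in range(i, j - min_loop):   # case: pair k with j
                    if (seq[k], seq[j]) in pairs:
                        left = dp[i][k - 1] if k > i else 0
                        best = max(best, left + 1 + dp[k + 1][j - 1])
                dp[i][j] = best
        return dp[0][n - 1]

    # Hairpin example: three G-C pairs around an AAAT loop
    max_pairs = nussinov("GGGAAATCCC")
    ```

    Recovering the actual pairing (not just its size) takes a standard traceback over the same `dp` table.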

  18. Can we predict Acute Medical readmissions using the BOOST tool? A retrospective case note review.

    Science.gov (United States)

    Lee, Geraldine A; Freedman, Daniel; Beddoes, Penelope; Lyness, Emily; Nixon, Imogen; Srivastava, Vivek

    2016-01-01

    Readmissions within 30 days of hospital discharge are a problem. The aim was to determine whether the Better Outcomes for Older Adults through Safe Transitions (BOOST) risk assessment tool is applicable within the UK. Patients over 65 years who were readmitted were identified retrospectively via a case note review. The BOOST assessment was applied, with 1 point given for each risk factor. 324 patients were readmitted (mean age 77 years) with a median of 7 days between discharge and readmission. The median BOOST score was 3 (IQR 2-4), with polypharmacy evident in 88% and prior hospitalisation in 70%. The tool correctly predicted 90% of readmissions using two or more risk factors and 99.1% if one risk factor was included. The BOOST assessment tool appears appropriate for predicting readmissions; however, further analysis is required to determine its precision.
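    The additive scoring described in this abstract (one point per risk factor, two or more factors flagging a likely readmission) might be sketched as follows; the factor names are assumptions based on the published BOOST "8Ps", not taken from this study.

```python
# Sketch of a BOOST-style additive risk score. The factor list is an
# assumption (the "8Ps"); the scoring rule follows the abstract: 1 point
# per risk factor, with >= 2 factors flagging a likely readmission.

BOOST_FACTORS = [
    "problem_medications", "polypharmacy", "psychological",
    "prior_hospitalisation", "poor_health_literacy", "patient_support",
    "palliative", "physical_limitations",
]

def boost_score(patient):
    """Count how many risk factors are recorded as present for one patient."""
    return sum(1 for factor in BOOST_FACTORS if patient.get(factor, False))

def flags_readmission(patient, threshold=2):
    """Flag the patient when the score meets the two-factor threshold."""
    return boost_score(patient) >= threshold
```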

  19. International multicenter tool to predict the risk of nonsentinel node metastases in breast cancer

    DEFF Research Database (Denmark)

    Meretoja, Tuomo J; Leidenius, Marjut H K; Heikkilä, Päivi S

    2012-01-01

    predicting nonsentinel node involvement were identified in logistic regression analysis. A multivariable predictive model was developed and validated by area under the receiver operating characteristics curve (AUC), first internally in 500 additional patients and then externally in 1068 patients from other...... centers. All statistical tests were two-sided. Results Nine tumor- and sentinel node-specific variables were identified as statistically significant factors predicting nonsentinel node involvement in logistic regression analysis. A resulting predictive model applied to the internal validation series...... resulted in an AUC of 0.714 (95% confidence interval [CI] = 0.665 to 0.763). For the external validation series, the AUC was 0.719 (95% CI = 0.689 to 0.750). The model was well calibrated in the external validation series. Conclusions We present a novel, international, multicenter, predictive tool...

  20. Predicted Interval Plots (PIPS): A Graphical Tool for Data Monitoring of Clinical Trials.

    Science.gov (United States)

    Li, Lingling; Evans, Scott R; Uno, Hajime; Wei, L J

    2009-11-01

    Group sequential designs are often used in clinical trials to evaluate efficacy and/or futility. Many methods have been developed for different types of endpoints and scenarios. However, few of these methods convey information regarding effect sizes (e.g., treatment differences) and none uses prediction to convey information regarding potential effect size estimates and associated precision, with trial continuation. To address these limitations, Evans et al. (2007) proposed to use prediction and predicted intervals as a flexible and practical tool for quantitative monitoring of clinical trials. In this article, we reaffirm the importance and usefulness of this innovative approach and introduce a graphical summary, predicted interval plots (PIPS), to display the information obtained in the prediction process in a straightforward yet comprehensive manner. We outline the construction of PIPS and apply this method in two examples. The results and the interpretations of the PIPS are discussed.

  1. Combining Results from Distinct MicroRNA Target Prediction Tools Enhances the Performance of Analyses

    Directory of Open Access Journals (Sweden)

    Arthur C. Oliveira

    2017-05-01

    Full Text Available Target prediction is generally the first step toward recognition of bona fide microRNA (miRNA)-target interactions in living cells. Several target prediction tools are now available, which use distinct criteria and stringency to provide the best set of candidate targets for a single miRNA or a subset of miRNAs. However, there are many false-negative predictions, and consensus about the optimum strategy to select and use the output information provided by the target prediction tools is lacking. We compared the performance of four tools cited in the literature - TargetScan (TS), miRanda-mirSVR (MR), Pita, and RNA22 (R22) - and we determined the most effective approach for analyzing target prediction data (individual, union, or intersection). For this purpose, we calculated the sensitivity, specificity, precision, and correlation of these approaches using 10 miRNAs (miR-1-3p, miR-17-5p, miR-21-5p, miR-24-3p, miR-29a-3p, miR-34a-5p, miR-124-3p, miR-125b-5p, miR-145-5p, and miR-155-5p) and 1,400 genes (700 validated and 700 non-validated) as targets of these miRNAs. The four tools provided a subset of high-quality predictions and returned few false-positive predictions; however, they could not identify several known true targets. We demonstrate that the union of TS/MR and TS/MR/R22 enhanced the quality of in silico prediction analysis of miRNA targets. We conclude that the union rather than the intersection of the aforementioned tools is the best strategy for maximizing performance while minimizing the loss of time and resources in subsequent in vivo and in vitro experiments for functional validation of miRNA-target interactions.
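    The individual/union/intersection comparison reduces to set algebra over each tool's predicted target list plus standard confusion-matrix metrics. A minimal sketch with toy gene sets (the tool outputs and validated set below are invented, not the paper's data):

```python
# Sketch of comparing combination strategies for target prediction tools.
# Gene identifiers here are placeholders for illustration only.

def performance(predicted, validated, universe):
    """Sensitivity, specificity and precision of a predicted target set."""
    tp = len(predicted & validated)
    fp = len(predicted - validated)
    fn = len(validated - predicted)
    tn = len(universe - predicted - validated)
    sensitivity = tp / (tp + fn) if tp + fn else 0.0
    specificity = tn / (tn + fp) if tn + fp else 0.0
    precision = tp / (tp + fp) if tp + fp else 0.0
    return sensitivity, specificity, precision

ts = {"g1", "g2", "g3"}   # e.g. TargetScan hits (toy data)
mr = {"g2", "g3", "g4"}   # e.g. miRanda-mirSVR hits (toy data)
union, intersection = ts | mr, ts & mr
```

    On such toy data the union trades precision for sensitivity, which mirrors the trade-off the study quantifies on the 1,400-gene benchmark.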

  2. Comparative emergency department resource utilisation across age groups.

    Science.gov (United States)

    Burkett, Ellen; Martin-Khan, Melinda G; Gray, Leonard C

    2017-12-11

    Objectives The aim of the present study was to assess comparative emergency department (ED) resource utilisation across age groups. Methods A retrospective analysis of data collected in the National Non-admitted Patient Emergency Department Care Database was undertaken to assess comparative ED resource utilisation across six age groups (0-14, 15-35, 36-64, 65-74, 75-84 and ≥85 years) with previously used surrogate markers of ED resource utilisation. Results Older people had significantly higher resource utilisation for their individual ED episodes of care than younger people, with the effect increasing with advancing age. Conclusion With ED care of older people demonstrated to be more resource intensive than care for younger people, the projected increase in older person presentations anticipated with population aging will have a magnified effect on ED services. These predicted changes in demand for ED care will only be able to be optimally managed if Australian health policy, ED funding instruments and ED models of care are adjusted to take into account the specific care and resource needs of older people. What is known about the topic? Current Australian ED funding models do not adjust for patient age. Several regional studies have suggested higher resource utilisation of ED patients aged ≥65 years. Anticipated rapid population aging mandates that contribution of age to ED visit resource utilisation be further explored. What does this paper add? The present study of national Australian ED presentations compared ED resource utilisation across age groups using surrogate markers of ED cost. Older people were found to have significantly higher resource utilisation in the ED, with the effect increasing further with advancing age. What are the implications for practitioners? The higher resource utilisation of older people in the ED warrants a review of current ED funding models to ensure that they will continue to meet the needs of an aging population.

  3. The use of machine learning and nonlinear statistical tools for ADME prediction.

    Science.gov (United States)

    Sakiyama, Yojiro

    2009-02-01

    Absorption, distribution, metabolism and excretion (ADME)-related failure of drug candidates is a major issue for the pharmaceutical industry today. Prediction of ADME by in silico tools has now become an inevitable paradigm to reduce cost and enhance efficiency in pharmaceutical research. Recently, machine learning as well as nonlinear statistical tools have been widely applied to predict routine ADME end points. To achieve accurate and reliable predictions, it is a prerequisite to understand the concepts, mechanisms and limitations of these tools. Here, we have devised a small synthetic nonlinear data set to help understand the mechanism of machine learning by 2D visualisation. We applied six machine learning methods to four different data sets: the Naive Bayes classifier, classification and regression trees, random forests, Gaussian processes, support vector machines and k-nearest neighbour. The results demonstrated that ensemble learning and kernel machines displayed greater prediction accuracy than classical methods, irrespective of data set size. The importance of interaction with the engineering field is also addressed. The results described here provide insights into the mechanism of machine learning, which will enable appropriate usage in the future.
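    In the same spirit as the paper's small synthetic nonlinear data set, the sketch below applies one of the six compared methods, k-nearest neighbour, to a 2-D XOR-like problem that no linear decision boundary can separate. The data points are invented for illustration; this is not the paper's data set.

```python
# Toy k-nearest-neighbour classifier on an XOR-like nonlinear data set.
# Training samples are (x, y, label) tuples; class 1 when the signs differ.
from collections import Counter

def knn_predict(train, point, k=3):
    """Classify `point` by majority vote among its k nearest training samples."""
    nearest = sorted(
        train,
        key=lambda s: (s[0] - point[0]) ** 2 + (s[1] - point[1]) ** 2,
    )[:k]
    return Counter(label for _, _, label in nearest).most_common(1)[0][0]

train = [(-1.0, -1.0, 0), (-1.2, -0.8, 0), (1.0, 1.0, 0), (0.9, 1.1, 0),
         (-1.0, 1.0, 1), (-0.9, 1.2, 1), (1.0, -1.0, 1), (1.1, -0.9, 1)]
```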

  4. Rosetta Structure Prediction as a Tool for Solving Difficult Molecular Replacement Problems.

    Science.gov (United States)

    DiMaio, Frank

    2017-01-01

    Molecular replacement (MR), a method for solving the crystallographic phase problem using phases derived from a model of the target structure, has proven extremely valuable, accounting for the vast majority of structures solved by X-ray crystallography. However, when the resolution of data is low, or the starting model is very dissimilar to the target protein, solving structures via molecular replacement may be very challenging. In recent years, protein structure prediction methodology has emerged as a powerful tool in model building and model refinement for difficult molecular replacement problems. This chapter describes some of the tools available in Rosetta for model building and model refinement specifically geared toward difficult molecular replacement cases.

  5. Numerical tools to predict the environmental loads for offshore structures under extreme weather conditions

    Science.gov (United States)

    Wu, Yanling

    2018-05-01

    In this paper, extreme waves were generated using the open-source computational fluid dynamics (CFD) tools OpenFOAM and Waves2FOAM with linear and nonlinear NewWave input, and were used to conduct numerical simulation of the wave impact process. Numerical tools based on first-order (with and without stretching) and second-order NewWave are investigated. Simulations predicting the force loading on an offshore platform under extreme weather conditions are implemented and compared.

  6. The predictive accuracy of PREDICT: a personalized decision-making tool for Southeast Asian women with breast cancer.

    Science.gov (United States)

    Wong, Hoong-Seam; Subramaniam, Shridevi; Alias, Zarifah; Taib, Nur Aishah; Ho, Gwo-Fuang; Ng, Char-Hong; Yip, Cheng-Har; Verkooijen, Helena M; Hartman, Mikael; Bhoo-Pathy, Nirmala

    2015-02-01

    Web-based prognostication tools may provide a simple and economically feasible option to aid prognostication and selection of chemotherapy in early breast cancers. We validated PREDICT, a free online breast cancer prognostication and treatment benefit tool, in a resource-limited setting. All 1480 patients who underwent complete surgical treatment for stages I to III breast cancer from 1998 to 2006 were identified from the prospective breast cancer registry of University Malaya Medical Centre, Kuala Lumpur, Malaysia. Calibration was evaluated by comparing the model-predicted overall survival (OS) with patients' actual OS. Model discrimination was tested using receiver-operating characteristic (ROC) analysis. Median age at diagnosis was 50 years. The median tumor size at presentation was 3 cm and 54% of patients had lymph node-negative disease. About 55% of women had estrogen receptor-positive breast cancer. Overall, the model-predicted 5- and 10-year OS was 86.3% and 77.5%, respectively, whereas the observed 5- and 10-year OS was 87.6% (difference: -1.3%) and 74.2% (difference: 3.3%), respectively; P values for the goodness-of-fit test were 0.18 and 0.12, respectively. The program was accurate in most subgroups of patients but significantly overestimated survival in younger patients. The model demonstrated good discrimination; areas under the ROC curve were 0.78 (95% confidence interval [CI]: 0.74-0.81) and 0.73 (95% CI: 0.68-0.78) for 5- and 10-year OS, respectively. Based on its accurate performance in this study, PREDICT may be clinically useful in prognosticating women with breast cancer and personalizing breast cancer treatment in resource-limited settings.
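    The two checks reported in this abstract, calibration (predicted minus observed OS) and discrimination (area under the ROC curve), can be sketched as follows. The AUC helper uses the rank-statistic definition and is illustrative, not the study's actual analysis code.

```python
# Sketch of calibration and discrimination checks for a survival model.

def calibration_difference(predicted_pct, observed_pct):
    """Predicted minus observed overall survival, in percentage points."""
    return round(predicted_pct - observed_pct, 1)

def auc(scores_pos, scores_neg):
    """Probability that a positive case outranks a negative one (ties count 0.5)."""
    wins = sum((p > n) + 0.5 * (p == n) for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))
```

    The calibration figures in the abstract follow directly, e.g. `calibration_difference(86.3, 87.6)` gives the reported -1.3 percentage points at 5 years.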

  7. SU-D-BRB-01: A Predictive Planning Tool for Stereotactic Radiosurgery

    Energy Technology Data Exchange (ETDEWEB)

    Palefsky, S; Roper, J; Elder, E; Dhabaan, A [Winship Cancer Institute of Emory University, Atlanta, GA (United States)

    2015-06-15

    Purpose: To demonstrate the feasibility of a predictive planning tool which provides SRS planning guidance based on simple patient anatomical properties: PTV size, PTV shape and distance from critical structures. Methods: Ten framed SRS cases treated at Winship Cancer Institute of Emory University were analyzed to extract data on PTV size, sphericity (shape), and distance from critical structures such as the brainstem and optic chiasm. The cases consisted of five pairs. Each pair consisted of two cases with a similar diagnosis (such as pituitary adenoma or arteriovenous malformation) that were treated with different techniques: DCA or IMRS. A Naive Bayes Classifier was trained on this data to establish the conditions under which each treatment modality was used. This model was validated by classifying ten other randomly-selected cases into DCA or IMRS classes, calculating the probability of each technique, and comparing results to the treated technique. Results: Of the ten cases used to validate the model, nine had their technique predicted correctly. The three cases treated with IMRS were all identified as such. Their probabilities of being treated with IMRS ranged between 59% and 100%. Six of the seven cases treated with DCA were correctly classified. These probabilities ranged between 51% and 95%. One case treated with DCA was incorrectly predicted to be an IMRS plan. The model’s confidence in this case was 91%. Conclusion: These findings indicate that a predictive planning tool based on simple patient anatomical properties can predict the SRS technique used for treatment. The algorithm operated with 90% accuracy. With further validation on larger patient populations, this tool may be used clinically to guide planners in choosing an appropriate treatment technique. The prediction algorithm could also be adapted to guide selection of treatment parameters such as treatment modality and number of fields for radiotherapy across anatomical sites.
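    For concreteness: sphericity is conventionally defined as Ψ = π^(1/3) (6V)^(2/3) / A (equal to 1.0 for a perfect sphere), and a Gaussian naive Bayes over the three anatomical features can be sketched as below. The training values are invented, and the abstract does not specify which naive Bayes variant the authors used.

```python
# Sketch: sphericity measure plus a minimal Gaussian naive Bayes over
# (PTV volume, sphericity, distance to nearest critical structure).
import math

def sphericity(volume, surface_area):
    """Psi = pi^(1/3) * (6V)^(2/3) / A; equals 1.0 for a perfect sphere."""
    return math.pi ** (1 / 3) * (6 * volume) ** (2 / 3) / surface_area

class GaussianNB:
    """Minimal Gaussian naive Bayes: per-class feature means and variances."""

    def fit(self, X, y):
        self.stats = {}
        for label in set(y):
            rows = [x for x, l in zip(X, y) if l == label]
            n = len(rows)
            means = [sum(col) / n for col in zip(*rows)]
            variances = [max(sum((v - m) ** 2 for v in col) / n, 1e-9)
                         for col, m in zip(zip(*rows), means)]
            self.stats[label] = (n / len(y), means, variances)
        return self

    def predict(self, x):
        def log_posterior(label):
            prior, means, variances = self.stats[label]
            return math.log(prior) + sum(
                -0.5 * math.log(2 * math.pi * v) - (xi - m) ** 2 / (2 * v)
                for xi, m, v in zip(x, means, variances))
        return max(self.stats, key=log_posterior)

# Hypothetical cases: (PTV volume in cc, sphericity, distance in cm)
train_X = [(1.0, 0.90, 2.0), (1.2, 0.85, 2.5), (4.0, 0.60, 0.5), (4.2, 0.55, 0.6)]
train_y = ["DCA", "DCA", "IMRS", "IMRS"]
model = GaussianNB().fit(train_X, train_y)
```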

  8. SU-D-BRB-01: A Predictive Planning Tool for Stereotactic Radiosurgery

    International Nuclear Information System (INIS)

    Palefsky, S; Roper, J; Elder, E; Dhabaan, A

    2015-01-01

    Purpose: To demonstrate the feasibility of a predictive planning tool which provides SRS planning guidance based on simple patient anatomical properties: PTV size, PTV shape and distance from critical structures. Methods: Ten framed SRS cases treated at Winship Cancer Institute of Emory University were analyzed to extract data on PTV size, sphericity (shape), and distance from critical structures such as the brainstem and optic chiasm. The cases consisted of five pairs. Each pair consisted of two cases with a similar diagnosis (such as pituitary adenoma or arteriovenous malformation) that were treated with different techniques: DCA or IMRS. A Naive Bayes Classifier was trained on this data to establish the conditions under which each treatment modality was used. This model was validated by classifying ten other randomly-selected cases into DCA or IMRS classes, calculating the probability of each technique, and comparing results to the treated technique. Results: Of the ten cases used to validate the model, nine had their technique predicted correctly. The three cases treated with IMRS were all identified as such. Their probabilities of being treated with IMRS ranged between 59% and 100%. Six of the seven cases treated with DCA were correctly classified. These probabilities ranged between 51% and 95%. One case treated with DCA was incorrectly predicted to be an IMRS plan. The model’s confidence in this case was 91%. Conclusion: These findings indicate that a predictive planning tool based on simple patient anatomical properties can predict the SRS technique used for treatment. The algorithm operated with 90% accuracy. With further validation on larger patient populations, this tool may be used clinically to guide planners in choosing an appropriate treatment technique. The prediction algorithm could also be adapted to guide selection of treatment parameters such as treatment modality and number of fields for radiotherapy across anatomical sites.

  9. Predictive analytics tools to adjust and monitor performance metrics for the ATLAS Production System

    CERN Document Server

    Barreiro Megino, Fernando Harald; The ATLAS collaboration

    2017-01-01

    Information such as an estimate of processing time or the possibility of a system outage (abnormal behaviour) helps in monitoring system performance and predicting its next state. The current cyber-infrastructure presents computing conditions in which contention for resources among high-priority data analyses happens routinely, which might lead to significant workload and data handling interruptions. The inability to monitor and predict the behaviour of the analysis process (its duration) and the system's state itself motivated the design of the built-in situational awareness analytic tools.

  10. German mires - Utilisation and protection

    International Nuclear Information System (INIS)

    Roderfeld, H.

    1996-01-01

    Mires in Germany are mainly used for agriculture. Peat mining is important regionally, but forest utilisation less so. Twenty years ago in the former West Germany, the first steps from peatland utilisation to peatland protection were taken. Bog protection programmes were developed first. Nowadays research directed to fen protection has begun, prompted by the decreasing importance of agriculture in Central Europe and an increasing environmental awareness. The situation regarding mire protection in Germany is presented for each Federal State individually. A rough estimate suggests 45 000 ha of protected bogs and 25 000 ha of protected fens. These areas include natural and semi-natural mires as well as rewetted mires. (30 refs.)

  11. Development of nonlinear acoustic propagation analysis tool toward realization of loud noise environment prediction in aeronautics

    Energy Technology Data Exchange (ETDEWEB)

    Kanamori, Masashi, E-mail: kanamori.masashi@jaxa.jp; Takahashi, Takashi, E-mail: takahashi.takashi@jaxa.jp; Aoyama, Takashi, E-mail: aoyama.takashi@jaxa.jp [Japan Aerospace Exploration Agency, 7-44-1, Jindaijihigashi-machi, Chofu, Tokyo (Japan)

    2015-10-28

    This paper introduces a tool for predicting the propagation of loud noise, with application to aeronautics in mind. The tool, named SPnoise, is based on the HOWARD approach, which can express the almost exact multidimensionality of the diffraction effect at the cost of neglecting back scattering. The paper addresses, in particular, the prediction of the effect of atmospheric turbulence on sonic boom, one of the important issues in aeronautics. Thanks to simple and efficient modeling of the atmospheric turbulence, SPnoise successfully re-creates the features of this effect, which often emerge in the region just behind the front and rear shock waves in the sonic boom signature.

  12. Providing access to risk prediction tools via the HL7 XML-formatted risk web service.

    Science.gov (United States)

    Chipman, Jonathan; Drohan, Brian; Blackford, Amanda; Parmigiani, Giovanni; Hughes, Kevin; Bosinoff, Phil

    2013-07-01

    Cancer risk prediction tools provide valuable information to clinicians but remain computationally challenging. Many clinics find that CaGene or HughesRiskApps fit their needs for easy- and ready-to-use software to obtain cancer risks; however, these resources may not fit all clinics' needs. The HughesRiskApps Group and BayesMendel Lab therefore developed a web service, called "Risk Service", which may be integrated into any client software to quickly obtain standardized and up-to-date risk predictions for BayesMendel tools (BRCAPRO, MMRpro, PancPRO, and MelaPRO), the Tyrer-Cuzick IBIS Breast Cancer Risk Evaluation Tool, and the Colorectal Cancer Risk Assessment Tool. Software clients that can convert their local structured data into the HL7 XML-formatted family and clinical patient history (Pedigree model) may integrate with the Risk Service. The Risk Service uses Apache Tomcat and Apache Axis2 technologies to provide an all-Java web service. The software client sends HL7 XML information containing anonymized family and clinical history to a Dana-Farber Cancer Institute (DFCI) server, where it is parsed, interpreted, and processed by multiple risk tools. The Risk Service then formats the results into an HL7-style message and returns the risk predictions to the originating software client. Upon consent, users may allow DFCI to maintain the data for future research. The Risk Service implementation is exemplified through HughesRiskApps. The Risk Service broadens the availability of valuable, up-to-date cancer risk tools and allows clinics and researchers to integrate risk prediction tools into their own software interface designed for their needs. Each software package can collect risk data using its own interface, and display the results using its own interface, while using a central, up-to-date risk calculator. This allows users to choose from multiple interfaces while always getting the latest risk calculations. Consenting users contribute their data for future research.
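    As a purely illustrative sketch of the client side of such an exchange, the snippet below serialises an anonymised toy pedigree to XML with the standard library. The element and attribute names are placeholders and do not follow the actual HL7 Pedigree schema that the Risk Service expects.

```python
# Toy client-side payload builder; element/attribute names are invented
# placeholders, not the real HL7 XML Pedigree model.
import xml.etree.ElementTree as ET

def build_pedigree(patient_id, relatives):
    """Serialise (relation, cancer, age-at-diagnosis) tuples to an XML string."""
    root = ET.Element("pedigree", id=patient_id)
    for relation, cancer, age in relatives:
        ET.SubElement(root, "relative", relation=relation,
                      cancer=cancer, ageAtDiagnosis=str(age))
    return ET.tostring(root, encoding="unicode")
```

    A real client would post such a payload to the service endpoint and parse the returned HL7-style message; those details are defined by the Risk Service itself.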

  13. Multirule Based Diagnostic Approach for the Fog Predictions Using WRF Modelling Tool

    Directory of Open Access Journals (Sweden)

    Swagata Payra

    2014-01-01

    Full Text Available The prediction of fog onset remains difficult despite progress in numerical weather prediction. It is a complex process that requires adequate representation of local perturbations in weather prediction models. It mainly depends upon microphysical and mesoscale processes that act within the boundary layer. This study utilizes a multirule based diagnostic (MRD) approach using postprocessing of the model simulations for fog predictions. The empiricism involved in this approach mainly bridges the gap between mesoscale and microscale variables, which are related to the mechanism of fog formation. Fog occurrence is a common phenomenon during the winter season over Delhi, India, with the passage of western disturbances across the northwestern part of the country accompanied with a significant amount of moisture. This study implements the above approach for predicting the occurrence of fog and its onset time over Delhi. For this purpose, a high-resolution weather research and forecasting (WRF) model is used for fog simulations. The study involves model validation and postprocessing of the model simulations for the MRD approach and its subsequent application to fog predictions. Through this approach, the model identified foggy and nonfoggy days successfully 94% of the time. Further, the onset of fog events is well captured within an accuracy of 30–90 minutes. This study demonstrates that the multirule based postprocessing approach is a useful and highly promising tool for improving fog predictions.
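    A multirule diagnostic of this kind declares fog only when every rule applied to the post-processed model output passes. The sketch below is a hedged illustration: the variable names and thresholds are placeholders, not the paper's calibrated values.

```python
# Illustrative multirule fog diagnostic over post-processed model fields.
# Thresholds are placeholder assumptions, not the study's calibrated rules.

FOG_RULES = [
    ("relative_humidity", lambda v: v >= 95.0),   # near-saturated surface air (%)
    ("wind_speed",        lambda v: v <= 3.0),    # light winds (m/s)
    ("temp_dewpoint_gap", lambda v: v <= 1.0),    # small T - Td spread (deg C)
]

def fog_diagnosed(fields):
    """Fog is diagnosed only when all rules pass for the given model fields."""
    return all(rule(fields[name]) for name, rule in FOG_RULES)
```

    Scanning such a diagnostic along a forecast time series also yields an onset time: the first output hour at which the rules all pass.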

  14. Tools for Predicting Optical Damage on Inertial Confinement Fusion-Class Laser Systems

    International Nuclear Information System (INIS)

    Nostrand, M.C.; Carr, C.W.; Liao, Z.M.; Honig, J.; Spaeth, M.L.; Manes, K.R.; Johnson, M.A.; Adams, J.J.; Cross, D.A.; Negres, R.A.; Widmayer, C.C.; Williams, W.H.; Matthews, M.J.; Jancaitis, K.S.; Kegelmeyer, L.M.

    2010-01-01

    Operating a fusion-class laser to its full potential requires a balance of operating constraints. On the one hand, the total laser energy delivered must be high enough to give an acceptable probability for ignition success. On the other hand, the laser-induced optical damage levels must be low enough to be acceptably handled with the available infrastructure and budget for optics recycle. Our research goal was to develop the models, database structures, and algorithmic tools (which we collectively refer to as "Loop Tools") needed to successfully maintain this balance. Predictive models are needed to plan for and manage the impact of shot campaigns from proposal, to shot, and beyond, covering a time span of years. The cost of a proposed shot campaign must be determined from these models, and governance boards must decide, based on predictions, whether to incorporate a given campaign into the facility shot plan based upon available resources. Predictive models are often built on damage "rules" derived from small-beam damage tests on small optics. These off-line studies vary the energy, pulse shape and wavelength in order to understand how these variables influence the initiation of damage sites and how initiated damage sites can grow upon further exposure to UV light. It is essential to test these damage "rules" on full-scale optics exposed to the complex conditions of an integrated ICF-class laser system. Furthermore, monitoring damage of optics on an ICF-class laser system can help refine damage rules and aid in the development of new rules. Finally, we need to develop the algorithms and database management tools for implementing these rules in the Loop Tools. The following highlights progress in the development of the Loop Tools and their implementation.

  15. Tools for Predicting Optical Damage on Inertial Confinement Fusion-Class Laser Systems

    Energy Technology Data Exchange (ETDEWEB)

    Nostrand, M C; Carr, C W; Liao, Z M; Honig, J; Spaeth, M L; Manes, K R; Johnson, M A; Adams, J J; Cross, D A; Negres, R A; Widmayer, C C; Williams, W H; Matthews, M J; Jancaitis, K S; Kegelmeyer, L M

    2010-12-20

    Operating a fusion-class laser to its full potential requires a balance of operating constraints. On the one hand, the total laser energy delivered must be high enough to give an acceptable probability for ignition success. On the other hand, the laser-induced optical damage levels must be low enough to be acceptably handled with the available infrastructure and budget for optics recycle. Our research goal was to develop the models, database structures, and algorithmic tools (which we collectively refer to as "Loop Tools") needed to successfully maintain this balance. Predictive models are needed to plan for and manage the impact of shot campaigns from proposal, to shot, and beyond, covering a time span of years. The cost of a proposed shot campaign must be determined from these models, and governance boards must decide, based on predictions, whether to incorporate a given campaign into the facility shot plan based upon available resources. Predictive models are often built on damage "rules" derived from small-beam damage tests on small optics. These off-line studies vary the energy, pulse shape and wavelength in order to understand how these variables influence the initiation of damage sites and how initiated damage sites can grow upon further exposure to UV light. It is essential to test these damage "rules" on full-scale optics exposed to the complex conditions of an integrated ICF-class laser system. Furthermore, monitoring damage of optics on an ICF-class laser system can help refine damage rules and aid in the development of new rules. Finally, we need to develop the algorithms and database management tools for implementing these rules in the Loop Tools. The following highlights progress in the development of the Loop Tools and their implementation.

  16. Human Splicing Finder: an online bioinformatics tool to predict splicing signals

    OpenAIRE

    Desmet, Francois-Olivier; Hamroun, Dalil; Lalande, Marine; Collod-Beroud, Gwenaelle; Claustres, Mireille; Beroud, Christophe

    2009-01-01

    Thousands of mutations are identified yearly. Although many directly affect protein expression, an increasing proportion of mutations is now believed to influence mRNA splicing. They mostly affect existing splice sites, but synonymous, non-synonymous or nonsense mutations can also create or disrupt splice sites or auxiliary cis-splicing sequences. To facilitate the analysis of the different mutations, we designed Human Splicing Finder (HSF), a tool to predict the effec...

  17. NetH2pan: A Computational Tool to Guide MHC peptide prediction on Murine Tumors

    DEFF Research Database (Denmark)

    DeVette, Christa I; Andreatta, Massimo; Bardet, Wilfried

    2018-01-01

    With the advancement of personalized cancer immunotherapies, new tools are needed to identify tumor antigens and evaluate T-cell responses in model systems, specifically those that exhibit clinically relevant tumor progression. Key transgenic mouse models of breast cancer are generated and mainta......With the advancement of personalized cancer immunotherapies, new tools are needed to identify tumor antigens and evaluate T-cell responses in model systems, specifically those that exhibit clinically relevant tumor progression. Key transgenic mouse models of breast cancer are generated...... for evaluating antigen specificity in the murine FVB strain. Our study provides the first detailed molecular and immunoproteomic characterization of the FVB H-2q MHC Class I alleles, including >8500 unique peptide ligands, a multi-allele murine MHC peptide prediction tool, and in vivo validation of these data...

  18. Comparison of various tool wear prediction methods during end milling of metal matrix composite

    Science.gov (United States)

    Wiciak, Martyna; Twardowski, Paweł; Wojciechowski, Szymon

    2018-02-01

    In this paper, the problem of tool wear prediction during milling of the hard-to-cut metal matrix composite Duralcan™ is presented. The research involved measuring the acceleration of vibrations during milling under constant cutting conditions and evaluating the flank wear. Subsequently, the vibrations were analysed in the time and frequency domains, and the obtained measures were correlated with the tool wear values. The validation of tool wear diagnosis against selected diagnostic measures was carried out using one-variable and two-variable regression models, as well as artificial neural networks (ANN). A comparative analysis of the results obtained with these methods is presented.
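    The one-variable regression step amounts to an ordinary least-squares fit of flank wear against a single vibration-derived feature. A minimal sketch, with synthetic feature/wear pairs rather than the paper's measurements:

```python
# Ordinary least-squares fit of flank wear VB against one vibration feature.
# The sample values below are synthetic, for illustration only.

def fit_line(xs, ys):
    """Return (slope, intercept) of the least-squares line through the data."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

xs = [1.0, 2.0, 3.0, 4.0]        # e.g. RMS acceleration (m/s^2), synthetic
ys = [0.05, 0.10, 0.15, 0.20]    # flank wear VB (mm), synthetic
slope, intercept = fit_line(xs, ys)
```

    The two-variable models and the ANN in the study generalise this same idea to more predictors and to nonlinear mappings.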

  19. Tools for Consumer Rights Protection in the Prediction of Electronic Virtual Market and Technological Changes

    Directory of Open Access Journals (Sweden)

    Mikuláš Gangur

    2014-05-01

    Full Text Available Electronic virtual markets can serve as an alternative tool for collecting information that is spread among numerous experts. From the operators' point of view, this is the principal market functionality; the main interest of the market participants, on the other hand, is profit. What participants expect from the market is the highest possible liquidity and the opportunity for unrestricted trading. Both the operator and the market participants can be considered consumers of this particular market, with respect to the accuracy of its outputs as well as its liquidity. Both of the above mentioned groups of consumers (the operators and the participants) expect protection of their specific consumer rights, i.e. securing the two above mentioned functionalities of the market. These functionalities of the electronic market are, however, influenced by many factors, among others by participants' activity. The article deals with motivation tools that may improve the quality of the prediction market. In an electronic virtual prediction market there may be situations in which the commonly used tools for increasing business activity described in the published literature are not significantly effective. For such situations we suggest a new type of motivation incentive consisting in penalizing the individual market participants whose funds are not placed in the market. The functionality of the proposed motivation incentive is demonstrated on data gained from an actively operated electronic virtual prediction market.
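    The proposed incentive, penalising funds left idle rather than placed in the market, can be sketched as below. The penalty rate and the simple cash/invested accounting are illustrative assumptions, not the article's actual mechanism.

```python
# Sketch of an idle-funds penalty for a prediction market participant.
# Rate and accounting are illustrative assumptions.

def apply_idle_penalty(cash, invested, rate=0.01):
    """Charge `rate` of uninvested cash per period; invested funds are untouched."""
    return cash * (1 - rate), invested
```

    Applied each settlement period, such a charge makes holding idle cash strictly worse than trading, nudging participants toward the activity (and thus liquidity) the market operator needs.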

  20. Bigger Data, Collaborative Tools and the Future of Predictive Drug Discovery

    Science.gov (United States)

    Clark, Alex M.; Swamidass, S. Joshua; Litterman, Nadia; Williams, Antony J.

    2014-01-01

    Over the past decade we have seen a growth in the provision of chemistry data and cheminformatics tools as either free websites or software as a service (SaaS) commercial offerings. These have transformed how we find molecule-related data and use such tools in our research. There have also been efforts to improve collaboration between researchers, either openly or through secure transactions using commercial tools. A major challenge in the future will be how such databases and software approaches handle larger amounts of data as they accumulate from high-throughput screening, while still enabling the user to draw insights, make predictions and move projects forward. We discuss how information from some drug discovery datasets can be made more accessible and how privacy of data should not overwhelm the desire to share it at an appropriate time with collaborators. We also discuss additional software tools that could be made available and provide our thoughts on the future of predictive drug discovery in this age of big data. We use some examples from our own research on neglected diseases, collaborations, mobile apps and algorithm development to illustrate these ideas. PMID:24943138

  1. iPat: intelligent prediction and association tool for genomic research.

    Science.gov (United States)

    Chen, Chunpeng James; Zhang, Zhiwu

    2018-06-01

    The ultimate goal of genomic research is to effectively predict phenotypes from genotypes so that medical management can improve human health and molecular breeding can increase agricultural production. Genomic prediction or selection (GS) plays a complementary role to genome-wide association studies (GWAS), which is the primary method to identify genes underlying phenotypes. Unfortunately, most computing tools cannot perform data analyses for both GWAS and GS. Furthermore, the majority of these tools are executed through a command-line interface (CLI), which requires programming skills. Non-programmers struggle to use them efficiently because of the steep learning curves and zero tolerance for data formats and mistakes when inputting keywords and parameters. To address these problems, this study developed a software package, named the Intelligent Prediction and Association Tool (iPat), with a user-friendly graphical user interface. With iPat, GWAS or GS can be performed using a pointing device to simply drag and/or click on graphical elements to specify input data files, choose input parameters and select analytical models. Models available to users include those implemented in third party CLI packages such as GAPIT, PLINK, FarmCPU, BLINK, rrBLUP and BGLR. Users can choose any data format and conduct analyses with any of these packages. File conversions are automatically conducted for specified input data and selected packages. A GWAS-assisted genomic prediction method was implemented to perform genomic prediction using any GWAS method such as FarmCPU. iPat was written in Java for adaptation to multiple operating systems including Windows, Mac and Linux. The iPat executable file, user manual, tutorials and example datasets are freely available at http://zzlab.net/iPat. zhiwu.zhang@wsu.edu.

  2. Risk determination after an acute myocardial infarction: review of 3 clinical risk prediction tools.

    Science.gov (United States)

    Scruth, Elizabeth Ann; Page, Karen; Cheng, Eugene; Campbell, Michelle; Worrall-Carter, Linda

    2012-01-01

    The objective of the study was to provide comprehensive information for the clinical nurse specialist (CNS) on commonly used clinical prediction (risk assessment) tools used to estimate the risk of a secondary cardiac or noncardiac event and mortality in patients undergoing primary percutaneous coronary intervention (PCI) for ST-elevation myocardial infarction (STEMI). The evolution and widespread adoption of primary PCI represent major advances in the treatment of acute myocardial infarction, specifically STEMI. The American College of Cardiology and the American Heart Association have recommended early risk stratification for patients presenting with acute coronary syndromes, using several clinical risk scores to identify patients' mortality and secondary event risk after PCI. Clinical nurse specialists are integral to any performance improvement strategy. Their knowledge and understanding of clinical prediction tools is essential for carrying out important assessments, identifying and managing risk in patients who have sustained a STEMI, and enhancing discharge education, including counseling on medications and lifestyle changes. Over the past 2 decades, risk scores have been developed from clinical trials to facilitate risk assessment. There are several risk scores that can be used to determine in-hospital and short-term survival. This article critiques the most common tools: the Thrombolysis in Myocardial Infarction risk score, the Global Registry of Acute Coronary Events risk score, and the Controlled Abciximab and Device Investigation to Lower Late Angioplasty Complications risk score. The importance of incorporating risk assessment tools into clinical prediction models to guide the therapeutic management of patients cannot be overstated. The ability to forecast secondary risk after a STEMI helps determine which patients require the most aggressive treatment and monitoring postintervention.

  3. Academic Training Lecture Regular Programme: Predictive Monte Carlo tools for LHC physics (1/3)

    CERN Multimedia

    2012-01-01

    Predictive Monte Carlo tools for LHC physics (1/3), by Fabio Maltoni (Université Catholique de Louvain (BE)).   Wednesday, May 2, 2012 from 11:00 to 12:00 (Europe/Zurich) at CERN ( 503-1-001 - Council Chamber ) Simulations of events taking place at the LHC play a key role in all experimental analyses. Starting from the basic concepts of QCD, we first review how accurate predictions can be obtained via fixed-order calculations at higher orders. Parton showers and event generation are then introduced as a means to achieve fully exclusive predictions. Finally, the recent merging and matching techniques between fixed-order and fully exclusive simulations are presented, as well as their implementations via the MLM/CKKW and MC@NLO/POWHEG methods. Organised by Mario Campanelli. More information here.

  4. Biodiversity in environmental assessment-current practice and tools for prediction

    International Nuclear Information System (INIS)

    Gontier, Mikael; Balfors, Berit; Moertberg, Ulla

    2006-01-01

    Habitat loss and fragmentation are major threats to biodiversity. Environmental impact assessment and strategic environmental assessment are essential instruments used in physical planning to address such problems. Yet there are no well-developed methods for quantifying and predicting impacts of fragmentation on biodiversity. In this study, a literature review was conducted on GIS-based ecological models that have potential as prediction tools for biodiversity assessment. Further, a review of environmental impact statements for road and railway projects from four European countries was performed, to study how impact prediction concerning biodiversity issues was addressed. The results of the study showed the existing gap between research in GIS-based ecological modelling and current practice in biodiversity assessment within environmental assessment

  5. The predictive and external validity of the STarT Back Tool in Danish primary care.

    Science.gov (United States)

    Morsø, Lars; Kent, Peter; Albert, Hanne B; Hill, Jonathan C; Kongsted, Alice; Manniche, Claus

    2013-08-01

    The STarT Back Tool (SBT) was recently translated into Danish and its concurrent validity described. This study tested the predictive validity of the Danish SBT. Danish primary care patients (n = 344) were compared to a UK cohort. SBT subgroup validity for predicting high activity limitation at 3 months' follow-up was assessed using descriptive proportions, relative risks, AUC and odds ratios. The SBT had a statistically similar predictive ability in Danish primary care as in UK primary care. Unadjusted relative risks for poor clinical outcome on activity limitation in the Danish cohort were 2.4 (1.7-3.4) for the medium-risk subgroup and 2.8 (1.8-3.8) for the high-risk subgroup, versus 3.1 (2.5-3.9) and 4.5 (3.6-5.6) in the UK cohort. Adjusting for confounders appeared to explain the lower predictive ability of the Danish high-risk group. The Danish SBT distinguished between low- and medium-risk subgroups with a predictive ability similar to that of the UK SBT. That distinction is useful for informing patients about their expected prognosis and may help guide clinicians' choice of treatment. However, cross-cultural differences in the SBT psychosocial subscale may reduce the predictive ability of the high-risk subgroup in Danish primary care.
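Unadjusted relative risks with 95% confidence intervals, as reported above, follow directly from a 2×2 table. A minimal sketch using the standard log-scale (Katz) interval; the counts in the usage example are hypothetical, not the study's data.

```python
import math

def relative_risk(a, b, c, d):
    """Relative risk of poor outcome: exposed a/(a+b) vs reference c/(c+d),
    with a 95% CI computed on the log scale (Katz method).
    a, b = outcomes/non-outcomes in the risk subgroup;
    c, d = outcomes/non-outcomes in the reference subgroup."""
    rr = (a / (a + b)) / (c / (c + d))
    se = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

if __name__ == "__main__":
    # Hypothetical counts: 30/100 poor outcomes in a medium-risk subgroup
    # vs 10/100 in the low-risk reference subgroup.
    print(relative_risk(30, 70, 10, 90))
```

The same function reproduces interval estimates like "2.4 (1.7-3.4)" once the subgroup counts are known.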

  6. Utility of Eating Assessment Tool-10 in Predicting Aspiration in Patients with Unilateral Vocal Fold Paralysis.

    Science.gov (United States)

    Zuniga, Steven A; Ebersole, Barbara; Jamal, Nausheen

    2018-03-01

    Objective Examine the incidence of penetration/aspiration in patients with unilateral vocal fold immobility and investigate its relationship with self-reported perception of dysphagia. Study Design Case series with chart review. Setting Academic cancer center. Subjects and Methods Adult patients with unilateral vocal fold immobility diagnosed between 2014 and 2016 were reviewed. Patients were stratified into an aspiration group and a nonaspiration group using objective findings on flexible endoscopic evaluation of swallowing, as scored using Rosenbek's Penetration Aspiration Scale. Objective findings were compared to patient perception of dysphagia. Bivariate linear correlation analysis was performed to evaluate the correlation between Eating Assessment Tool-10 scores and the presence of aspiration. Tests of diagnostic accuracy were calculated to investigate the predictive value of Eating Assessment Tool-10 scores >9 for aspiration risk. Results Of the 35 patients with new-onset unilateral vocal fold immobility who were evaluated, 25.7% (9/35) demonstrated tracheal aspiration. Mean ± SD Eating Assessment Tool-10 scores were 19.2 ± 13.7 for aspirators and 7.0 ± 7.8 for nonaspirators (P = .016). A statistically significant correlation was demonstrated between increasing Eating Assessment Tool-10 scores and Penetration Aspiration Scale scores (r = 0.511, P = .002). Diagnostic accuracy analysis for aspiration risk in patients with an Eating Assessment Tool-10 score >9 revealed a sensitivity of 77.8% and a specificity of 73.1%. Conclusion Patient perception of swallowing difficulty may have utility in predicting aspiration risk. An EAT-10 score >9 in patients with unilateral vocal fold immobility may portend up to a 5 times greater risk of aspiration. Routine swallow testing to assess for penetration/aspiration may be indicated in patients with unilateral vocal fold immobility.
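The reported sensitivity (77.8%) and specificity (73.1%) are consistent with a 2×2 table of 7 true positives, 2 false negatives, 7 false positives and 19 true negatives among the 35 patients (a reconstruction from the abstract's percentages, not counts stated in the abstract):

```python
def sens_spec(tp, fn, fp, tn):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

if __name__ == "__main__":
    # Counts reconstructed to match the abstract: 9 aspirators of 35 patients,
    # EAT-10 cut-off of >9 flags 7 of the 9 aspirators and 7 of 26 nonaspirators.
    sens, spec = sens_spec(tp=7, fn=2, fp=7, tn=19)
    print(f"sensitivity={sens:.1%} specificity={spec:.1%}")
```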

  7. Predicting complication risk in spine surgery: a prospective analysis of a novel risk assessment tool.

    Science.gov (United States)

    Veeravagu, Anand; Li, Amy; Swinney, Christian; Tian, Lu; Moraff, Adrienne; Azad, Tej D; Cheng, Ivan; Alamin, Todd; Hu, Serena S; Anderson, Robert L; Shuer, Lawrence; Desai, Atman; Park, Jon; Olshen, Richard A; Ratliff, John K

    2017-07-01

    OBJECTIVE The ability to assess the risk of adverse events based on known patient factors and comorbidities would provide more effective preoperative risk stratification. Present risk assessment in spine surgery is limited. An adverse event prediction tool was developed to predict the risk of complications after spine surgery and tested on a prospective patient cohort. METHODS The spinal Risk Assessment Tool (RAT), a novel instrument for the assessment of risk for patients undergoing spine surgery that was developed based on an administrative claims database, was prospectively applied to 246 patients undergoing 257 spinal procedures over a 3-month period. Prospectively collected data were used to compare the RAT to the Charlson Comorbidity Index (CCI) and the American College of Surgeons National Surgery Quality Improvement Program (ACS NSQIP) Surgical Risk Calculator. The study end point was the occurrence and type of complication after spine surgery. RESULTS The authors identified 69 patients (73 procedures) who experienced a complication over the prospective study period. Cardiac complications were most common (10.2%). Receiver operating characteristic (ROC) curves were calculated to compare complication outcomes using the different assessment tools. Area under the curve (AUC) analysis showed comparable predictive accuracy between the RAT and the ACS NSQIP calculator (0.670 [95% CI 0.60-0.74] in RAT, 0.669 [95% CI 0.60-0.74] in NSQIP). The CCI was not accurate in predicting complication occurrence (0.55 [95% CI 0.48-0.62]). The RAT produced mean probabilities of 34.6% for patients who had a complication and 24% for patients who did not (p = 0.0003). The generated predicted values were stratified into low, medium, and high rates. For the RAT, the predicted complication rate was 10.1% in the low-risk group (observed rate 12.8%), 21.9% in the medium-risk group (observed 31.8%), and 49.7% in the high-risk group (observed 41.2%).
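The AUC values compared above can be computed without any curve fitting: the empirical AUC equals the probability that a randomly chosen complication case receives a higher predicted risk than a randomly chosen non-case (the normalized Mann-Whitney U statistic). A self-contained sketch with hypothetical scores:

```python
def auc(pos_scores, neg_scores):
    """Empirical AUC: P(score_pos > score_neg), counting ties as 0.5.
    pos_scores = predicted risks of patients who had a complication,
    neg_scores = predicted risks of patients who did not."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

if __name__ == "__main__":
    # Hypothetical predicted risks for 3 complication cases and 3 non-cases.
    print(auc([0.4, 0.6, 0.7], [0.2, 0.3, 0.5]))
```

An AUC near 0.67, as reported for the RAT and the NSQIP calculator, means roughly two out of three such case/non-case pairs are ranked correctly.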

  8. Perioperative Respiratory Adverse Events in Pediatric Ambulatory Anesthesia: Development and Validation of a Risk Prediction Tool.

    Science.gov (United States)

    Subramanyam, Rajeev; Yeramaneni, Samrat; Hossain, Mohamed Monir; Anneken, Amy M; Varughese, Anna M

    2016-05-01

    Perioperative respiratory adverse events (PRAEs) are the most common cause of serious adverse events in children receiving anesthesia. The primary aim of this study was to develop and validate a risk prediction tool for the occurrence of PRAE from the onset of anesthesia induction until discharge from the postanesthesia care unit in children younger than 18 years undergoing elective ambulatory anesthesia for surgery and radiology. The incidence of PRAE was studied. We analyzed data from 19,059 patients from our department's quality improvement database. The predictor variables were age, sex, ASA physical status, morbid obesity, preexisting pulmonary disorder, preexisting neurologic disorder, and location of ambulatory anesthesia (surgery or radiology). Composite PRAE was defined as the presence of any 1 of the following events: intraoperative bronchospasm, intraoperative laryngospasm, postoperative apnea, postoperative laryngospasm, postoperative bronchospasm, or postoperative prolonged oxygen requirement. Development and validation of the risk prediction tool for PRAE were performed using a split sampling technique to divide the database into 2 independent cohorts, based on the year when the patient received ambulatory anesthesia for surgery and radiology, using logistic regression. A risk score was developed based on the regression coefficients from the validation tool. The performance of the risk prediction tool was assessed by using tests of discrimination and calibration. The overall incidence of composite PRAE was 2.8%. The derivation cohort included 8904 patients, and the validation cohort included 10,155 patients. The risk of PRAE was 3.9% in the development cohort and 1.8% in the validation cohort. Age ≤ 3 years (versus >3 years), ASA physical status II or III (versus ASA physical status I), morbid obesity, preexisting pulmonary disorder, and surgery (versus radiology) significantly predicted the occurrence of PRAE in a multivariable logistic regression.
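A common way to derive an integer risk score from logistic-regression coefficients, as described above, is to divide each coefficient by the smallest one and round. A sketch under stated assumptions: the coefficients and patient below are hypothetical illustrations, not values from the paper.

```python
def make_points(coefs):
    """Scale logistic-regression coefficients to integer risk points,
    relative to the smallest coefficient in absolute value."""
    base = min(abs(c) for c in coefs.values())
    return {name: round(c / base) for name, c in coefs.items()}

def total_score(points, patient):
    """Sum the points for the predictors present in this patient."""
    return sum(points[f] for f, present in patient.items() if present)

if __name__ == "__main__":
    # Hypothetical coefficients for the predictors named in the abstract.
    coefs = {"age<=3y": 0.55, "ASA II-III": 0.80, "morbid obesity": 1.10,
             "pulmonary disorder": 1.35, "surgery (vs radiology)": 0.55}
    points = make_points(coefs)
    patient = {"age<=3y": True, "ASA II-III": True, "morbid obesity": False,
               "pulmonary disorder": True, "surgery (vs radiology)": False}
    print(points, total_score(points, patient))
```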

  9. Development and Application of Predictive Tools for MHD Stability Limits in Tokamaks

    Energy Technology Data Exchange (ETDEWEB)

    Brennan, Dylan [Princeton Univ., NJ (United States)]; Miller, G. P. [Univ. of Tulsa, Tulsa, OK (United States)]

    2016-10-03

    This is a project to develop and apply analytic and computational tools to answer physics questions relevant to the onset of non-ideal magnetohydrodynamic (MHD) instabilities in toroidal magnetic confinement plasmas. The focused goal of the research is to develop predictive tools for these instabilities, including an inner layer solution algorithm, a resistive wall with control coils, and energetic particle effects. The production phase compares studies of instabilities in such systems using analytic techniques, PEST-III and NIMROD. Two important physics puzzles are targeted as guiding thrusts for the analyses. The first is to form an accurate description of the physics determining whether the resistive wall mode or a tearing mode will appear first as β is increased at low rotation and low error fields in DIII-D. The second is to understand the physical mechanism behind recent NIMROD results indicating strong damping and stabilization from energetic particle effects on linear resistive modes. The work seeks to develop a highly relevant predictive tool for ITER, advance the theoretical description of this physics in general, and analyze these instabilities in experiments such as ASDEX Upgrade, DIII-D, JET, JT-60U and NSTX. The awardee on this grant is the University of Tulsa. The research efforts are supervised principally by Dr. Brennan. Support is included for two graduate students, and a strong collaboration with Dr. John M. Finn of LANL. The work includes several ongoing collaborations with General Atomics, PPPL, and the NIMROD team, among others.

  10. Development and Application of Predictive Tools for MHD Stability Limits in Tokamaks

    International Nuclear Information System (INIS)

    Brennan, Dylan; Miller, G. P.

    2016-01-01

    This is a project to develop and apply analytic and computational tools to answer physics questions relevant to the onset of non-ideal magnetohydrodynamic (MHD) instabilities in toroidal magnetic confinement plasmas. The focused goal of the research is to develop predictive tools for these instabilities, including an inner layer solution algorithm, a resistive wall with control coils, and energetic particle effects. The production phase compares studies of instabilities in such systems using analytic techniques, PEST-III and NIMROD. Two important physics puzzles are targeted as guiding thrusts for the analyses. The first is to form an accurate description of the physics determining whether the resistive wall mode or a tearing mode will appear first as β is increased at low rotation and low error fields in DIII-D. The second is to understand the physical mechanism behind recent NIMROD results indicating strong damping and stabilization from energetic particle effects on linear resistive modes. The work seeks to develop a highly relevant predictive tool for ITER, advance the theoretical description of this physics in general, and analyze these instabilities in experiments such as ASDEX Upgrade, DIII-D, JET, JT-60U and NSTX. The awardee on this grant is the University of Tulsa. The research efforts are supervised principally by Dr. Brennan. Support is included for two graduate students, and a strong collaboration with Dr. John M. Finn of LANL. The work includes several ongoing collaborations with General Atomics, PPPL, and the NIMROD team, among others.

  11. A Multi-Center Prospective Derivation and Validation of a Clinical Prediction Tool for Severe Clostridium difficile Infection.

    LENUS (Irish Health Repository)

    Na, Xi

    2015-04-23

    Prediction of severe clinical outcomes in Clostridium difficile infection (CDI) is important to inform management decisions for optimum patient care. Currently, treatment recommendations for CDI vary based on disease severity but validated methods to predict severe disease are lacking. The aim of the study was to derive and validate a clinical prediction tool for severe outcomes in CDI.

  12. Structure Based Thermostability Prediction Models for Protein Single Point Mutations with Machine Learning Tools.

    Directory of Open Access Journals (Sweden)

    Lei Jia

    Full Text Available The thermostability of protein point mutations is a common concern in protein engineering. An application that predicts the thermostability of mutants can help guide the decision-making process in protein design via mutagenesis. An in silico point mutation scanning method is frequently used to find "hot spots" in proteins for focused mutagenesis. ProTherm (http://gibk26.bio.kyutech.ac.jp/jouhou/Protherm/protherm.html) is a public database that contains experimentally measured thermostability data for thousands of protein mutants. Two data sets based on two differently measured thermostability properties of protein single point mutations, namely the unfolding free energy change (ddG) and the melting temperature change (dTm), were obtained from this database. Folding free energy changes calculated with Rosetta, structural information about the point mutations, and amino acid physical properties were used to build thermostability prediction models with informatics modeling tools. Five supervised machine learning methods (support vector machine, random forests, artificial neural network, naïve Bayes classifier, K nearest neighbor) and partial least squares regression were used to build the prediction models. Binary and ternary classification as well as regression models were built and evaluated. Data set redundancy and balancing, the reverse mutations technique, feature selection, and comparison to other published methods are discussed. The Rosetta-calculated folding free energy change ranked as the most influential feature in all prediction models. Other descriptors also made significant contributions to the accuracy of the prediction models.
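One of the five classifiers mentioned, K nearest neighbor, fits in a few lines. A toy sketch only: the two features (hydrophobicity change, volume change) and the training points are hypothetical stand-ins, not the paper's descriptors or data.

```python
def knn_classify(train, query, k=3):
    """Majority vote among the k nearest neighbours by Euclidean distance.
    train: list of (feature_vector, label) pairs."""
    dist = lambda u, v: sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5
    nearest = sorted(train, key=lambda item: dist(item[0], query))[:k]
    labels = [lab for _, lab in nearest]
    return max(set(labels), key=labels.count)

if __name__ == "__main__":
    # Hypothetical (hydrophobicity change, volume change) -> ddG sign label.
    train = [((0.5, 10.0), "destabilizing"), ((0.4, 8.0), "destabilizing"),
             ((-0.3, -5.0), "stabilizing"), ((-0.6, -9.0), "stabilizing"),
             ((0.1, 2.0), "destabilizing")]
    print(knn_classify(train, (-0.4, -6.0)))
```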

  13. Development of a CME-associated geomagnetic storm intensity prediction tool

    Science.gov (United States)

    Wu, C. C.; DeHart, J. M.

    2015-12-01

    From 1995 to 2012, the Wind spacecraft recorded 168 magnetic cloud (MC) events. Among those events, 79 were found to have upstream shock waves, and their source locations on the Sun were identified. Using a recipe based on the initial turning direction of the interplanetary magnetic field (IMF) Bz after the shock (Wu et al., 1996, GRL), the north-south polarity of 66 (83.5%) of the 79 events was accurately predicted. These events were tested and further analyzed, reaffirming that the Bz initial turning direction was accurate. The results also indicate that the 37 of the 79 MCs originating from the north (of the Sun) averaged a Dst_min of -119 nT, whereas the 42 MCs originating from the south averaged -89 nT. To make this research available to others, a website was built that incorporates various tools and pictures to predict the intensity of geomagnetic storms. The tool is capable of predicting geomagnetic storms across the full range of Dst_min (from no storm to gigantic storms). This work was supported by the Naval Research Lab HBCU/MI Internship program and the Chief of Naval Research.
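Binning a predicted Dst_min into storm categories, as the website described above does, can be sketched with threshold logic. The thresholds below follow the conventional Dst storm classification (moderate below -50 nT, intense below -100 nT); the exact bins used by the tool in the abstract are not stated, so these are assumptions.

```python
def storm_class(dst_min):
    """Classify a predicted Dst_min (nT) into a conventional storm category.
    Bins are illustrative of common usage, not the website's exact ranges."""
    if dst_min > -30:
        return "no storm"
    if dst_min > -50:
        return "weak"
    if dst_min > -100:
        return "moderate"
    if dst_min > -250:
        return "intense"
    return "gigantic"

if __name__ == "__main__":
    # The two mean values reported in the abstract.
    print(storm_class(-119), storm_class(-89))
```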

  14. Calibration of Multiple In Silico Tools for Predicting Pathogenicity of Mismatch Repair Gene Missense Substitutions

    Science.gov (United States)

    Thompson, Bryony A.; Greenblatt, Marc S.; Vallee, Maxime P.; Herkert, Johanna C.; Tessereau, Chloe; Young, Erin L.; Adzhubey, Ivan A.; Li, Biao; Bell, Russell; Feng, Bingjian; Mooney, Sean D.; Radivojac, Predrag; Sunyaev, Shamil R.; Frebourg, Thierry; Hofstra, Robert M.W.; Sijmons, Rolf H.; Boucher, Ken; Thomas, Alun; Goldgar, David E.; Spurdle, Amanda B.; Tavtigian, Sean V.

    2015-01-01

    Classification of rare missense substitutions observed during genetic testing for patient management is a considerable problem in clinical genetics. The Bayesian integrated evaluation of unclassified variants is a solution originally developed for BRCA1/2. Here, we take a step toward an analogous system for the mismatch repair (MMR) genes (MLH1, MSH2, MSH6, and PMS2) that confer colon cancer susceptibility in Lynch syndrome by calibrating in silico tools to estimate prior probabilities of pathogenicity for MMR gene missense substitutions. A qualitative five-class classification system was developed and applied to 143 MMR missense variants. This identified 74 missense substitutions suitable for calibration. These substitutions were scored using six different in silico tools (Align-Grantham Variation Grantham Deviation, multivariate analysis of protein polymorphisms [MAPP], MutPred, PolyPhen-2.1, Sorting Intolerant From Tolerant, and Xvar), using curated MMR multiple sequence alignments where possible. The output from each tool was calibrated by regression against the classifications of the 74 missense substitutions; these calibrated outputs are interpretable as prior probabilities of pathogenicity. MAPP was the most accurate tool and MAPP + PolyPhen-2.1 provided the best combined model (R2 = 0.62 and area under receiver operating characteristic = 0.93). The MAPP + PolyPhen-2.1 output is sufficiently predictive to feed as a continuous variable into the quantitative Bayesian integrated evaluation for clinical classification of MMR gene missense substitutions. PMID:22949387
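The calibration step described above, regressing a tool's raw output against known classifications so the output reads as a prior probability, can be sketched as a least-squares line clipped to [0, 1]. A simplified stand-in for the paper's regression, with hypothetical scores and probabilities:

```python
def calibrate(scores, probs):
    """Fit a least-squares line mapping raw in-silico scores to
    probabilities of pathogenicity; return a clipped predictor."""
    n = len(scores)
    mx, my = sum(scores) / n, sum(probs) / n
    sxx = sum((x - mx) ** 2 for x in scores)
    slope = sum((x - mx) * (y - my) for x, y in zip(scores, probs)) / sxx
    intercept = my - slope * mx
    return lambda s: min(1.0, max(0.0, intercept + slope * s))

if __name__ == "__main__":
    # Hypothetical tool scores for calibration variants, with probabilities
    # derived from their qualitative classes.
    f = calibrate([0, 1, 2, 3, 4], [0.05, 0.2, 0.5, 0.8, 0.95])
    print(f(2), f(5))
```

Clipping keeps extrapolated outputs interpretable as probabilities; the published work reports goodness of fit (R2, AUC) for the calibrated models.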

  15. Automated tool for virtual screening and pharmacology-based pathway prediction and analysis

    Directory of Open Access Journals (Sweden)

    Sugandh Kumar

    2017-10-01

    Full Text Available Virtual screening is an effective tool for lead identification in drug discovery. However, only a limited number of crystal structures are available compared to the number of biological sequences, which makes structure-based drug discovery (SBDD) a difficult choice. The current tool is an attempt to automate protein structure modelling and virtual screening, followed by pharmacology-based prediction and analysis. Starting from sequence(s), this tool automates protein structure modelling, binding site identification, automated docking, ligand preparation, post-docking analysis and identification of hits in the biological pathways that can be modulated by a group of ligands. This automation helps in characterizing ligand selectivity and the action of ligands on a complex biological molecular network as well as on individual receptors. A judicious combination of ligands binding different receptors can be used to inhibit selected biological pathways in a disease. The tool also allows the user to systematically investigate network-dependent effects of a drug or drug candidate.

  16. A numerical tool for reproducing driver behaviour: experiments and predictive simulations.

    Science.gov (United States)

    Casucci, M; Marchitto, M; Cacciabue, P C

    2010-03-01

    This paper presents the simulation tool called SDDRIVE (Simple Simulation of Driver performance), which is the numerical computerised implementation of the theoretical architecture describing Driver-Vehicle-Environment (DVE) interactions, contained in Cacciabue and Carsten [Cacciabue, P.C., Carsten, O. A simple model of driver behaviour to sustain design and safety assessment of automated systems in automotive environments, 2010]. Following a brief description of the basic algorithms that simulate the performance of drivers, the paper presents and discusses a set of experiments carried out in a Virtual Reality full scale simulator for validating the simulation. Then the predictive potentiality of the tool is shown by discussing two case studies of DVE interactions, performed in the presence of different driver attitudes in similar traffic conditions.

  17. An Open-Source Web-Based Tool for Resource-Agnostic Interactive Translation Prediction

    Directory of Open Access Journals (Sweden)

    Daniel Torregrosa

    2014-09-01

    Full Text Available We present a web-based open-source tool for interactive translation prediction (ITP) and describe its underlying architecture. ITP systems assist human translators by making context-based computer-generated suggestions as they type. Most of the ITP systems in the literature are strongly coupled with a statistical machine translation system that is conveniently adapted to provide the suggestions. Our system, however, follows a resource-agnostic approach and suggestions are obtained from any unmodified black-box bilingual resource. This paper reviews our ITP method and describes the architecture of Forecat, a web tool, partly based on the recent technology of web components, that eases the use of our ITP approach in any web application requiring this kind of translation assistance. We also evaluate the performance of our method when using an unmodified Moses-based statistical machine translation system as the bilingual resource.
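The resource-agnostic idea, drawing suggestions from any bilingual resource treated as a black box, can be illustrated in miniature. A toy sketch, not Forecat's actual algorithm: the "resource" is a phrase dictionary, and suggestions are translations compatible with what the translator has already typed.

```python
def suggest(resource, source_phrase, typed_prefix):
    """Return target-side suggestions from a black-box bilingual resource
    (here a dict mapping a source phrase to candidate translations) that
    extend the prefix the translator has typed so far."""
    candidates = resource.get(source_phrase, [])
    return [t for t in candidates
            if t.startswith(typed_prefix) and t != typed_prefix]

if __name__ == "__main__":
    resource = {"casa": ["house", "home", "household"]}  # toy bilingual resource
    print(suggest(resource, "casa", "hou"))
```

Because the resource is only queried, not modified, it could equally be a Moses-based system's phrase table or any other bilingual lookup, which is the point of the resource-agnostic design.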

  18. Evaluation of an Automated Analysis Tool for Prostate Cancer Prediction Using Multiparametric Magnetic Resonance Imaging.

    Directory of Open Access Journals (Sweden)

    Matthias C Roethke

    Full Text Available To evaluate the diagnostic performance of an automated analysis tool for the assessment of prostate cancer based on multiparametric magnetic resonance imaging (mpMRI) of the prostate. A fully automated analysis tool was used for a retrospective analysis of mpMRI sets (T2-weighted, T1-weighted dynamic contrast-enhanced, and diffusion-weighted sequences). The software provided a malignancy prediction value for each image pixel, defined as the Malignancy Attention Index (MAI), which can be depicted as a colour-map overlay on the original images. The malignancy maps were compared to histopathology derived from a combination of MRI-targeted and systematic transperineal MRI/TRUS-fusion biopsies. In total, mpMRI data of 45 patients were evaluated. With a sensitivity of 85.7% (95% CI 65.4-95.0), a specificity of 87.5% (95% CI 69.0-95.7) and a diagnostic accuracy of 86.7% (95% CI 73.8-93.8) for the detection of prostate cancer, the automated analysis results corresponded well with the diagnostic accuracies reported for human readers based on the PI-RADS system in the current literature. The study revealed comparable diagnostic accuracies for the detection of prostate cancer between a user-independent, MAI-based automated analysis tool and PI-RADS-scoring-based human reader analysis of mpMRI. Thus, the analysis tool could serve as a detection support system for less experienced readers. The results also suggest the potential of MAI-based analysis for advanced lesion assessment, such as cancer extent and staging prediction.

  19. In silico site-directed mutagenesis informs species-specific predictions of chemical susceptibility derived from the Sequence Alignment to Predict Across Species Susceptibility (SeqAPASS) tool

    Science.gov (United States)

    The Sequence Alignment to Predict Across Species Susceptibility (SeqAPASS) tool was developed to address needs for rapid, cost effective methods of species extrapolation of chemical susceptibility. Specifically, the SeqAPASS tool compares the primary sequence (Level 1), functiona...

  20. TargetRNA: a tool for predicting targets of small RNA action in bacteria

    OpenAIRE

    Tjaden, Brian

    2008-01-01

    Many small RNA (sRNA) genes in bacteria act as posttranscriptional regulators of target messenger RNAs. Here, we present TargetRNA, a web tool for predicting mRNA targets of sRNA action in bacteria. TargetRNA takes as input a genomic sequence that may correspond to an sRNA gene. TargetRNA then uses a dynamic programming algorithm to search each annotated message in a specified genome for mRNAs that evince basepair-binding potential to the input sRNA sequence. Based on the calculated basepair-...
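The dynamic programming search described above, scanning an mRNA for base-pairing potential with the input sRNA, can be sketched as a local-alignment-style DP that rewards complementary bases. A simplified stand-in, not TargetRNA's actual scoring model:

```python
# Allowed RNA base pairs, including the G:U wobble pair.
PAIR = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"), ("G", "U"), ("U", "G")}

def best_duplex_score(srna, mrna, match=1, gap=-2):
    """Best local basepairing score between an sRNA and an mRNA window:
    a Smith-Waterman-style DP where 'match' rewards a complementary pair
    and 'gap' penalizes bulges and non-pairing positions."""
    best = 0
    rows, cols = len(srna) + 1, len(mrna) + 1
    score = [[0] * cols for _ in range(rows)]
    for i in range(1, rows):
        for j in range(1, cols):
            pair = match if (srna[i - 1], mrna[j - 1]) in PAIR else gap
            score[i][j] = max(0, score[i - 1][j - 1] + pair,
                              score[i - 1][j] + gap, score[i][j - 1] + gap)
            best = max(best, score[i][j])
    return best

if __name__ == "__main__":
    # 'ACUG' pairs position-by-position with 'UGAC' (A:U, C:G, U:A, G:C).
    print(best_duplex_score("ACUG", "UGAC"))
```

TargetRNA additionally restricts the search to annotated messages and ranks targets statistically; the DP above only illustrates the basepair-potential core.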

  1. Risk analysis for dengue suitability in Africa using the ArcGIS predictive analysis tools (PA tools).

    Science.gov (United States)

    Attaway, David F; Jacobsen, Kathryn H; Falconer, Allan; Manca, Germana; Waters, Nigel M

    2016-06-01

    Risk maps identifying suitable locations for infection transmission are important for public health planning. Data on dengue infection rates are not readily available in most places where the disease is known to occur. A newly available add-in to Esri's ArcGIS software package, the ArcGIS Predictive Analysis Toolset (PA Tools), was used to identify locations within Africa with environmental characteristics likely to be suitable for transmission of dengue virus. A more accurate, robust, and localized (1 km × 1 km) dengue risk map for Africa was created based on bioclimatic layers, elevation data, high-resolution population data, and other environmental factors that a search of the peer-reviewed literature showed to be associated with dengue risk. Variables related to temperature, precipitation, elevation, and population density were identified as good predictors of dengue suitability. Areas of high dengue suitability occur primarily within West Africa and parts of Central Africa and East Africa, but even in these regions the suitability is not homogenous. This risk mapping technique for an infection transmitted by Aedes mosquitoes draws on entomological, epidemiological, and geographic data. The method could be applied to other infectious diseases (such as Zika) in order to provide new insights for public health officials and others making decisions about where to increase disease surveillance activities and implement infection prevention and control efforts. The ability to map threats to human and animal health is important for tracking vectorborne and other emerging infectious diseases and modeling the likely impacts of climate change. Copyright © 2016 Elsevier B.V. All rights reserved.
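    Suitability mapping of this kind reduces to combining raster layers cell by cell. A toy weighted-overlay sketch in numpy (the thresholds, weights and synthetic rasters below are invented for illustration and are not the study's actual criteria):

```python
import numpy as np

# Toy 1 km-style rasters: mean temperature (deg C), annual rain (mm),
# elevation (m), population density (people/km^2) -- synthetic values.
rng = np.random.default_rng(1)
temp = rng.uniform(10, 35, size=(4, 4))
rain = rng.uniform(0, 3000, size=(4, 4))
elev = rng.uniform(0, 3000, size=(4, 4))
pop = rng.uniform(0, 5000, size=(4, 4))

# Each criterion contributes a 0/1 layer; suitability is their weighted sum,
# yielding a value in [0, 1] per grid cell.
suitability = (0.4 * ((temp > 18) & (temp < 34))
               + 0.3 * (rain > 500)
               + 0.2 * (elev < 2000)
               + 0.1 * (pop > 100))
```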

  2. Tools and techniques for ageing predictions in nuclear reactors through condition monitoring

    International Nuclear Information System (INIS)

    Verma, R.M.P.

    1994-01-01

    Operating nuclear reactors beyond their design life is gaining importance because of huge replacement and decommissioning costs. However, experience shows that nuclear plant safety and reliability may decline in the later years of plant life due to ageing degradation. Ageing of nuclear plant components, structures and systems, if unmitigated, reduces the safety margins provided in the design and thus increases risks to public health and safety. These safety margins must be monitored throughout plant service life, including any extended life. Condition monitoring of nuclear reactor components, equipment and systems can be performed to study the effects of ageing, the status of safety margins, and the effect of corrective and mitigating actions taken. The tools and techniques of condition monitoring are also important in failure trending, predictive maintenance, evaluation of scheduled maintenance, mitigation of ageing, life extension and reliability studies. (author). 1 fig., 1 annexure

  3. Nonlinear Prediction As A Tool For Determining Parameters For Phase Space Reconstruction In Meteorology

    Science.gov (United States)

    Miksovsky, J.; Raidl, A.

    Time-delay phase space reconstruction is one of the useful tools of nonlinear time series analysis, enabling a number of applications. Its utilization requires the value of the time delay to be known, as well as the value of the embedding dimension. There are several methods for estimating both of these parameters. Typically, the time delay is computed first, followed by the embedding dimension. Our approach is slightly different: we reconstructed the phase space for various combinations of the mentioned parameters and used it for prediction by means of the nearest neighbours in the phase space. Then some measure of the prediction's success was computed (e.g., correlation or RMSE). The position of its global maximum (minimum) should indicate a suitable combination of time delay and embedding dimension. Several meteorological (particularly climatological) time series were used for the computations. We have also created an MS-Windows-based program implementing this approach; its basic features will be presented as well.
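    A minimal numpy sketch of the procedure described above (delay embedding, nearest-neighbour one-step prediction, and an RMSE scan over candidate parameter pairs; the toy series, neighbour-exclusion window and scanned grid are illustrative choices, not the authors' settings):

```python
import numpy as np

def embed(x, m, tau):
    """Time-delay embedding: row t is (x[t], x[t+tau], ..., x[t+(m-1)*tau])."""
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(m)])

def nn_forecast_rmse(x, m, tau, n_test=50, exclude=25):
    """RMSE of one-step forecasts from the nearest neighbour in phase space.

    `exclude` is a Theiler-style window that keeps temporally close (and
    thus trivially similar) vectors out of the neighbour search."""
    V = embed(x, m, tau)
    errs = []
    for t in range(len(V) - n_test, len(V) - 1):
        past = V[: t - exclude]
        nn = int(np.argmin(np.linalg.norm(past - V[t], axis=1)))
        errs.append(V[nn + 1, -1] - V[t + 1, -1])
    return float(np.sqrt(np.mean(np.square(errs))))

# Scan (tau, m) combinations on a toy series; the global RMSE minimum
# indicates a suitable parameter pair.
x = np.sin(0.3 * np.arange(500))
scores = {(tau, m): nn_forecast_rmse(x, m, tau)
          for tau in (1, 2, 4, 8) for m in (2, 3, 4)}
best = min(scores, key=scores.get)
```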

  4. Prediction of ttt curves of cold working tool steels using support vector machine model

    Science.gov (United States)

    Pillai, Nandakumar; Karthikeyan, R., Dr.

    2018-04-01

    Cold working tool steels are high-carbon steels with metallic alloying additions that impart higher hardenability, abrasion resistance and less distortion in quenching. The microstructural changes occurring in tool steel during heat treatment are of great importance, as the final properties of the steel depend upon the changes that occur during the process. In order to obtain the desired performance, the alloy constituents and their proportions play a vital role, as the steel transformation itself is complex in nature and depends very much upon time and temperature. Proper treatment can deliver satisfactory results; at the same time, process deviation can completely spoil the results. Knowing the time-temperature transformation (TTT) behaviour of the phases is therefore critical, and it varies for each steel type depending upon its constituents and their proportion range. To obtain adequate post-heat-treatment properties, the percentage of retained austenite should be low and the metallic carbides obtained should be fine in nature. A support vector machine is a computational model which can learn from observed data and use it to predict or solve problems using a mathematical model. A back-propagation feedback network will be created and trained for further solutions. Points on the known TTT transformation curves are used to plot the curves for different materials. These data will be trained on to predict TTT curves for other steels having similar alloying constituents but with different proportion ranges. The proposed methodology can be used for the prediction of TTT curves for cold working steels and for the prediction of phases for different heat treatment methods.
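    Fitting a smooth curve through known TTT points and interpolating for related alloys is the core regression task here. As a dependency-free illustration, a radial-basis-function kernel ridge regressor (a close cousin of support vector regression, used here as a stand-in; the time-temperature points below are invented, not measured TTT data):

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_kernel_ridge(X, y, gamma=1.0, lam=1e-6):
    """Solve (K + lam*I) alpha = y; predict with K(query, X) @ alpha."""
    K = rbf_kernel(X, X, gamma)
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
    return lambda Xq: rbf_kernel(Xq, X, gamma) @ alpha

# Hypothetical (log10 seconds, temperature degC) points along a C-shaped
# transformation-start curve -- invented numbers for illustration only.
logt = np.array([[0.5], [1.0], [1.5], [2.0], [2.5], [3.0]])
temp = np.array([650.0, 600.0, 570.0, 560.0, 575.0, 610.0])
predict = fit_kernel_ridge(logt, temp, gamma=2.0)
```

    A trained model of this shape can then be queried at intermediate times to draw the full curve for a steel of similar composition.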

  5. XBeach-G: a tool for predicting gravel barrier response to extreme storm conditions

    Science.gov (United States)

    Masselink, Gerd; Poate, Tim; McCall, Robert; Roelvink, Dano; Russell, Paul; Davidson, Mark

    2014-05-01

    Gravel beaches protect low-lying back-barrier regions from flooding during storm events, and their importance to society is widely acknowledged. Unfortunately, breaching and extensive storm damage have occurred at many gravel sites, and this is likely to increase as a result of sea-level rise and enhanced storminess due to climate change. Limited scientific guidance is currently available to provide beach managers with operational management tools to predict the response of gravel beaches to storms. The New Understanding and Prediction of Storm Impacts on Gravel beaches (NUPSIG) project aims to improve our understanding of storm impacts on gravel coastal environments and to develop a predictive capability by modelling these impacts. The NUPSIG project uses a 5-pronged approach to address its aim: (1) analyse hydrodynamic data collected during a prototype laboratory experiment on a gravel beach; (2) collect hydrodynamic field data on a gravel beach under a range of conditions, including storm waves with wave heights up to 3 m; (3) measure swash dynamics and beach response on 10 gravel beaches during extreme wave conditions with wave heights in excess of 3 m; (4) use the data collected under (1)-(3) to develop and validate a numerical model of the hydrodynamics and morphological response of gravel beaches under storm conditions; and (5) develop a tool for end-users, based on the model formulated under (4), for predicting the storm response of gravel beaches and barriers. The aim of this presentation is to present the key results of the NUPSIG project and introduce the end-user tool for predicting storm response on gravel beaches. The model is based on the numerical model XBeach, and different forcing scenarios (waves and tides), barrier configurations (dimensions) and sediment characteristics are easily uploaded for model simulations using a Graphical User Interface (GUI). The model can be used to determine the vulnerability of gravel barriers to storm events, but can also be

  6. Limitations of polyethylene glycol-induced precipitation as predictive tool for protein solubility during formulation development.

    Science.gov (United States)

    Hofmann, Melanie; Winzer, Matthias; Weber, Christian; Gieseler, Henning

    2018-05-01

    Polyethylene glycol (PEG)-induced protein precipitation is often used to extrapolate apparent protein solubility for specific formulation compositions. The procedure has been used in several fields of application, such as protein crystal growth and protein formulation development; nevertheless, most studies have focused on its applicability to protein crystal growth. In contrast, this study focuses on the applicability of PEG-induced precipitation during high-concentration protein formulation development. In this study, the solubility of three different model proteins was investigated over a broad pH range. Solubility values predicted by PEG-induced precipitation were compared to the real solubility behaviour determined by either turbidity or content measurements. The solubility predicted by PEG-induced precipitation was confirmed for an Fc fusion protein and a monoclonal antibody. In contrast, PEG-induced precipitation failed to predict the solubility of a single-domain antibody construct. The applicability of PEG-induced precipitation as an indicator of protein solubility during formulation development was thus found not to be valid for one of the three model molecules. Under certain conditions, PEG-induced protein precipitation is not valid for the prediction of real protein solubility behaviour. The procedure should be used carefully as a tool for formulation development, and the results obtained should be validated by additional investigations. © 2017 Royal Pharmaceutical Society.
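    The extrapolation underlying this method is typically a linear fit of log-solubility against PEG concentration, extrapolated back to 0% PEG. A minimal sketch with synthetic numbers (the data points below are invented for illustration, not the study's measurements):

```python
import numpy as np

# % PEG (w/v) at the precipitation midpoint and the apparent solubility
# (mg/mL) measured at each point -- synthetic example values that halve
# for every additional 3% PEG.
peg = np.array([6.0, 9.0, 12.0, 15.0])
sol = np.array([80.0, 40.0, 20.0, 10.0])

# Linear fit in log space, extrapolated to 0% PEG.
slope, intercept = np.polyfit(peg, np.log10(sol), 1)
s0 = 10 ** intercept  # extrapolated apparent solubility at 0% PEG (~320 mg/mL)
```

    The study's point is precisely that this straight-line extrapolation can fail for some molecules (here, a single-domain antibody construct), so the result should be cross-checked against direct solubility measurements.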

  7. Predictive tools for the evaluation of microbial effects on drugs during gastrointestinal passage.

    Science.gov (United States)

    Pieper, Ines A; Bertau, Martin

    2010-06-01

    Predicting drug metabolism after oral administration is highly complex, yet indispensable. Hitherto, drug metabolism research has mainly focused on hepatic processes. In the intestine, drug molecules encounter the metabolic activity of microorganisms prior to absorption through the gut wall. Drug biotransformation by the gastrointestinal microflora has the potential to evoke serious problems because the metabolites formed may cause unexpected and undesired side effects in patients. Hence, in the course of drug development, the question has to be addressed whether microbially formed metabolites are physiologically active, pharmaceutically active or even toxic. In order to provide answers to these questions and to keep the number of laboratory tests needed low, predictive tools - in vitro as well as in silico - are invaluable. This review gives an outline of the current state of the art in the field of predicting drug biotransformation by the gastrointestinal microflora on several levels of modelling. It provides a comprehensive review of the literature with a thorough discussion of the assets and drawbacks of the different modelling approaches. The impact of gastrointestinal drug biotransformation on patients' health will grow with the increasing complexity of drug entities. Predicting the metabolic fates of drugs by combining in vitro and in silico models provides invaluable information which will be particularly suitable for reducing in vivo studies.

  8. Evaluation of the efficacy of six nutritional screening tools to predict malnutrition in the elderly.

    Science.gov (United States)

    Poulia, Kalliopi-Anna; Yannakoulia, Mary; Karageorgou, Dimitra; Gamaletsou, Maria; Panagiotakos, Demosthenes B; Sipsas, Nikolaos V; Zampelas, Antonis

    2012-06-01

    Malnutrition in the elderly is a multifactorial problem that is more prevalent in hospitals and care homes. The absence of a gold standard for evaluating nutritional risk led us to evaluate the efficacy of six nutritional screening tools used in the elderly. Two hundred forty-eight elderly patients (129 men, 119 women, aged 75.2 ± 8.5 years) were examined. Nutritional screening was performed on admission using the following tools: Nutritional Risk Index (NRI), Geriatric Nutritional Risk Index (GNRI), Subjective Global Assessment (SGA), Mini Nutritional Assessment - Screening Form (MNA-SF), Malnutrition Universal Screening Tool (MUST) and Nutritional Risk Screening 2002 (NRS 2002). A combined index for malnutrition was also calculated. Nutritional risk and/or malnutrition varied greatly, ranging from 47.2 to 97.6%, depending on the nutritional screening tool used. MUST was the most valid screening tool (validity coefficient = 0.766, 95% CI: 0.690-0.841), while SGA was in the best agreement with the combined index (κ = 0.707, p = 0.000). Although NRS 2002 was the highest in sensitivity (99.4%), it was the lowest in specificity (6.1%) and positive predictive value (68.2%). MUST seems to be the most valid tool for evaluating the risk of malnutrition in the elderly upon admission to the hospital. NRS 2002 was found to overestimate nutritional risk in the elderly. Copyright © 2011 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.
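    The validity and agreement statistics quoted above all fall out of a 2x2 cross-classification of each screening tool against the reference. A minimal sketch of sensitivity, specificity, positive predictive value and Cohen's kappa from the four cell counts (the counts below are made up for illustration):

```python
def screening_stats(tp, fp, fn, tn):
    """Validity and agreement measures from a 2x2 screening table."""
    n = tp + fp + fn + tn
    sens = tp / (tp + fn)                    # malnourished correctly flagged
    spec = tn / (tn + fp)                    # well-nourished correctly cleared
    ppv = tp / (tp + fp)                     # flagged patients truly at risk
    po = (tp + tn) / n                       # observed agreement
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2  # chance agreement
    kappa = (po - pe) / (1 - pe)             # Cohen's kappa
    return sens, spec, ppv, kappa

# Hypothetical counts for one tool against the combined index
sens, spec, ppv, kappa = screening_stats(tp=80, fp=10, fn=20, tn=90)
```

    The NRS 2002 pattern in the abstract (very high sensitivity, very low specificity) corresponds to a table dominated by false positives.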

  9. Novel inter and intra prediction tools under consideration for the emerging AV1 video codec

    Science.gov (United States)

    Joshi, Urvang; Mukherjee, Debargha; Han, Jingning; Chen, Yue; Parker, Sarah; Su, Hui; Chiang, Angie; Xu, Yaowu; Liu, Zoe; Wang, Yunqing; Bankoski, Jim; Wang, Chen; Keyder, Emil

    2017-09-01

    Google started the WebM Project in 2010 to develop open-source, royalty-free video codecs designed specifically for media on the Web. The second-generation codec released by the WebM project, VP9, is currently served by YouTube and enjoys billions of views per day. Realizing the need for even greater compression efficiency to cope with the growing demand for video on the web, the WebM team embarked on an ambitious project to develop a next-edition codec, AV1, within a consortium of major tech companies called the Alliance for Open Media, that achieves at least a generational improvement in coding efficiency over VP9. In this paper, we focus primarily on new tools in AV1 that improve the prediction of pixel blocks before transforms, quantization and entropy coding are invoked. Specifically, we describe tools and coding modes that improve intra, inter and combined inter-intra prediction. Results are presented on standard test sets.
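    Intra prediction fills a block from previously decoded neighbours so that only the residual needs transforming and coding. A toy DC predictor (one of the simplest intra modes, shown purely for illustration; this is not AV1's actual implementation):

```python
import numpy as np

def dc_predict(above, left, size):
    """DC intra mode: predict every pixel of a size x size block as the
    mean of the reconstructed row above and column left of the block."""
    dc = (above.sum() + left.sum()) / (len(above) + len(left))
    return np.full((size, size), dc)

block = np.full((4, 4), 128.0)   # flat source block
above = np.full(4, 128.0)        # reconstructed neighbouring pixels
left = np.full(4, 128.0)
residual = block - dc_predict(above, left, 4)
# A flat region is predicted exactly, so the residual is all zeros and
# costs almost nothing to code.
```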

  10. DR2DI: a powerful computational tool for predicting novel drug-disease associations

    Science.gov (United States)

    Lu, Lu; Yu, Hua

    2018-05-01

    Finding new related candidate diseases for known drugs provides an effective route to fast and low-risk drug development. However, experimental identification of drug-disease associations is expensive and time-consuming. This motivates the development of in silico computational methods that can infer true drug-disease pairs with high confidence. In this study, we present a novel and powerful computational tool, DR2DI, for accurately uncovering potential associations between drugs and diseases using high-dimensional and heterogeneous omics data as information sources. Based on a unified and extended similarity kernel framework, DR2DI infers unknown relationships between drugs and diseases using a Regularized Kernel Classifier. Importantly, DR2DI employs a semi-supervised and global learning algorithm which can be applied to uncover the diseases (drugs) associated with both known and novel drugs (diseases). In silico global validation experiments showed that DR2DI significantly outperforms two recent approaches for predicting drug-disease associations. Detailed case studies further demonstrated that the therapeutic indications and side effects of drugs predicted by DR2DI can be validated by existing database records and literature, suggesting that DR2DI can serve as a useful bioinformatic tool for identifying potential drug-disease associations and guiding drug repositioning. Our software and comparison codes are freely available at https://github.com/huayu1111/DR2DI.
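    A regularized kernel classifier of this kind amounts to regularized least squares over a precomputed similarity kernel. A toy sketch (the kernel, labels and regularization constant below are made up; this is not DR2DI's actual code):

```python
import numpy as np

def rks_fit_predict(K, Y, lam=1.0):
    """Regularized least squares over a precomputed similarity kernel K:
    scores F = K (K + lam*I)^-1 Y. Rows of Y are known association labels;
    rows of F are smoothed association scores usable for ranking."""
    n = K.shape[0]
    return K @ np.linalg.solve(K + lam * np.eye(n), Y)

# With an identity kernel and lam = 1, every score is simply Y / 2,
# which makes the shrinkage effect of the regularizer easy to see.
K = np.eye(3)
Y = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
F = rks_fit_predict(K, Y, lam=1.0)
```

    With a real similarity kernel, off-diagonal entries let label information flow between similar drugs, which is what allows scoring of novel drugs with no known associations.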

  11. Controller Strategies for Automation Tool Use under Varying Levels of Trajectory Prediction Uncertainty

    Science.gov (United States)

    Morey, Susan; Prevot, Thomas; Mercer, Joey; Martin, Lynne; Bienert, Nancy; Cabrall, Christopher; Hunt, Sarah; Homola, Jeffrey; Kraut, Joshua

    2013-01-01

    A human-in-the-loop simulation was conducted to examine the effects of varying levels of trajectory prediction uncertainty on air traffic controller workload and performance, as well as how strategies and the use of decision support tools change in response. This paper focuses on the strategies employed by two controllers from separate teams who worked in parallel but independently under identical conditions (airspace, arrival traffic, tools) with the goal of ensuring schedule conformance and safe separation for a dense arrival flow in en route airspace. Despite differences in strategy and methods, both controllers achieved high levels of schedule conformance and safe separation. Overall, results show that trajectory uncertainties introduced by wind and aircraft performance prediction errors do not affect the controllers' ability to manage traffic. Controller strategies were fairly robust to changes in error, though strategies were affected by the amount of delay to absorb (scheduled time of arrival minus estimated time of arrival). Using the results and observations, this paper proposes an ability to dynamically customize the display of information including delay time based on observed error to better accommodate different strategies and objectives.

  12. The East London glaucoma prediction score: web-based validation of glaucoma risk screening tool

    Science.gov (United States)

    Stephen, Cook; Benjamin, Longo-Mbenza

    2013-01-01

    AIM It is difficult for optometrists and general practitioners to know which patients are at risk of glaucoma. The East London glaucoma prediction score (ELGPS) is a web-based risk calculator developed to determine glaucoma risk at the time of screening. Multiple risk factors that are available in a low-tech environment are assessed to provide a risk assessment. This is extremely useful in settings where access to specialist care is difficult. Use of the calculator is educational. It is a free web-based service. Data capture is user specific. METHOD The scoring system is a web-based questionnaire that captures and subsequently calculates the relative risk of the presence of glaucoma at the time of screening. Three categories of patient are described: unlikely to have glaucoma; glaucoma suspect; and glaucoma. A case review methodology of patients with a known diagnosis is employed to validate the calculator's risk assessment. RESULTS Data from the records of 400 patients with an established diagnosis have been captured and used to validate the screening tool. The website reports that the calculated diagnosis correlates with the actual diagnosis 82% of the time. Biostatistical analysis showed: sensitivity = 88%; positive predictive value = 97%; specificity = 75%. CONCLUSION Analysis of the first 400 patients validates the web-based screening tool as a good method of screening the at-risk population. The validation is ongoing. The web-based format will allow more widespread recruitment across different geographic, population and personnel variables. PMID:23550097

  14. TSSPlant: a new tool for prediction of plant Pol II promoters

    KAUST Repository

    Shahmuradov, Ilham A.

    2017-01-13

    Our current knowledge of eukaryotic promoters indicates a complex architecture that is often composed of numerous functional motifs. Most known promoters include multiple, and in some cases mutually exclusive, transcription start sites (TSSs). Moreover, TSS selection depends on cell/tissue type, developmental stage and environmental conditions. Such complex promoter structures make their computational identification notoriously difficult. Here, we present TSSPlant, a novel tool that predicts both TATA and TATA-less promoters in sequences from a wide spectrum of plant genomes. The tool was developed using large promoter collections from ppdb and PlantProm DB. It utilizes eighteen significant compositional and signal features of plant promoter sequences selected in this study, which feed an artificial neural network-based model trained by the backpropagation algorithm. TSSPlant achieves significantly higher accuracy than the next best promoter prediction program for both TATA promoters (MCC≃0.84 and F1-score≃0.91 versus MCC≃0.51 and F1-score≃0.71) and TATA-less promoters (MCC≃0.80 and F1-score≃0.89 versus MCC≃0.29 and F1-score≃0.50). TSSPlant is available for download as a standalone program at http://www.cbrc.kaust.edu.sa/download/.
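    The training loop behind such a model is plain backpropagation. A minimal numpy sketch with one hidden layer of sigmoid units (the toy data, four features instead of eighteen, and all hyperparameters are stand-ins for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 4))                          # toy feature matrix
y = (X[:, 0] + X[:, 1] > 0).astype(float)[:, None]    # toy promoter labels

sig = lambda z: 1.0 / (1.0 + np.exp(-z))
W1 = rng.normal(scale=0.5, size=(4, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)

losses = []
for _ in range(500):
    H = sig(X @ W1 + b1)                 # forward pass: hidden layer
    P = sig(H @ W2 + b2)                 # forward pass: output
    losses.append(float(np.mean((P - y) ** 2)))
    dP = (P - y) * P * (1 - P)           # backward pass (MSE loss)
    dH = (dP @ W2.T) * H * (1 - H)
    W2 -= 0.5 * H.T @ dP / len(X); b2 -= 0.5 * dP.mean(0)   # gradient steps
    W1 -= 0.5 * X.T @ dH / len(X); b1 -= 0.5 * dH.mean(0)
```

    After training, the output unit's activation for a candidate sequence's feature vector plays the role of the promoter score that is then thresholded.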

  16. On-Line Flutter Prediction Tool for Wind Tunnel Flutter Testing using Parameter Varying Estimation Methodology, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — ZONA Technology, Inc. (ZONA) proposes to develop an on-line flutter prediction tool for wind tunnel model using the parameter varying estimation (PVE) technique to...

  17. TACO: a general-purpose tool for predicting cell-type-specific transcription factor dimers.

    Science.gov (United States)

    Jankowski, Aleksander; Prabhakar, Shyam; Tiuryn, Jerzy

    2014-03-19

    Cooperative binding of transcription factor (TF) dimers to DNA is increasingly recognized as a major contributor to binding specificity. However, it is likely that the set of known TF dimers is highly incomplete, given that they were discovered using ad hoc approaches, or through computational analyses of limited datasets. Here, we present TACO (Transcription factor Association from Complex Overrepresentation), a general-purpose standalone software tool that takes as input any genome-wide set of regulatory elements and predicts cell-type-specific TF dimers based on enrichment of motif complexes. TACO is the first tool that can accommodate motif complexes composed of overlapping motifs, a characteristic feature of many known TF dimers. Our method comprehensively outperforms existing tools when benchmarked on a reference set of 29 known dimers. We demonstrate the utility and consistency of TACO by applying it to 152 DNase-seq datasets and 94 ChIP-seq datasets. Based on these results, we uncover a general principle governing the structure of TF-TF-DNA ternary complexes, namely that the flexibility of the complex is correlated with, and most likely a consequence of, inter-motif spacing.
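    Enrichment of a motif complex in one set of regulatory elements relative to a background is commonly scored with a hypergeometric tail probability (shown here generically; TACO's exact statistic may differ):

```python
from math import comb

def hypergeom_enrichment_p(k, n_hits, n_draws, n_total):
    """P(X >= k) when drawing n_draws regions without replacement from
    n_total regions, of which n_hits carry the motif complex."""
    p = 0
    for x in range(k, min(n_hits, n_draws) + 1):
        p += comb(n_hits, x) * comb(n_total - n_hits, n_draws - x)
    return p / comb(n_total, n_draws)

# Seeing the complex in all 4 sampled regions, from a 10-region universe
# containing 5 complex-bearing regions: p = C(5,4)*C(5,0)/C(10,4) = 1/42
p = hypergeom_enrichment_p(4, 5, 4, 10)
```

    Ranking candidate motif pairs by such a p-value (with multiple-testing correction) is the general shape of overrepresentation-based dimer discovery.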

  18. PBPK Modeling - A Predictive, Eco-Friendly, Bio-Waiver Tool for Drug Research.

    Science.gov (United States)

    De, Baishakhi; Bhandari, Koushik; Mukherjee, Ranjan; Katakam, Prakash; Adiki, Shanta K; Gundamaraju, Rohit; Mitra, Analava

    2017-01-01

    The world has witnessed growing complexities in the disease scenario, influenced by drastic changes in the host-pathogen-environment triadic relation. Pharmaceutical R&D is in constant search of novel therapeutic entities to hasten the transition of drug molecules from lab bench to patient bedside. Extensive animal studies and human pharmacokinetics are still the "gold standard" in investigational new drug research and bio-equivalence studies. Apart from the cost, time and ethical issues of animal experimentation, burning questions arise relating to ecological disturbances, environmental hazards and biodiversity. Grave concerns arise when the adverse environmental outcomes of continued studies on one particular disease give rise to several other pathogenic agents, finally complicating the total scenario. Thus pharma R&D faces a challenge to develop bio-waiver protocols. Lead optimization, selection of drug candidates with favorable pharmacokinetics and pharmacodynamics, and toxicity assessment are vital steps in drug development. Simulation tools like Gastro Plus™, PK Sim® and SimCyp find applications for this purpose. Advanced technologies like organ-on-a-chip or human-on-a-chip, where a 3D representation of human organs and systems can mimic the related processes and activities, thereby linking them to major features of human biology, can be successfully incorporated into the drug development toolbox. PBPK modelling provides a state-of-the-art alternative to animal experimentation. PBPK models can successfully bypass bio-equivalence studies and predict bioavailability and drug interactions, and in combination with in vitro-in vivo correlation they can be extrapolated to humans, thus serving as a bio-waiver. PBPK can serve as an eco-friendly, bio-waiver predictive tool in drug development. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
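    Full PBPK models chain many tissue compartments, each described by a mass-balance equation. As a deliberately minimal building-block sketch, a one-compartment oral-absorption model (the Bateman function; every parameter value below is hypothetical):

```python
import math

def conc_oral_1cpt(t, dose=100.0, F=0.8, ka=1.2, ke=0.2, V=40.0):
    """Plasma concentration at time t (h) after an oral dose, for a
    one-compartment model with first-order absorption rate ka (1/h),
    elimination rate ke (1/h), bioavailability F and volume V (L)."""
    return (F * dose * ka) / (V * (ka - ke)) * (math.exp(-ke * t) - math.exp(-ka * t))

# Concentration starts at zero, rises to a peak, then declines.
curve = [conc_oral_1cpt(t) for t in range(0, 25)]
```

    A PBPK model generalizes this by coupling one such balance per organ via blood flows and partition coefficients, which is what lets predictions be scaled across species and to humans.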

  19. Identification of New Tools to Predict Surgical Performance of Novices using a Plastic Surgery Simulator.

    Science.gov (United States)

    Kazan, Roy; Viezel-Mathieu, Alex; Cyr, Shantale; Hemmerling, Thomas M; Lin, Samuel J; Gilardino, Mirko S

    2018-04-09

    To identify new tools capable of predicting the surgical performance of novices on an augmentation mammoplasty simulator. The pace of technical skills acquisition varies between residents and may necessitate more time than that allotted by residency training before competence is reached. Identifying applicants with superior innate technical abilities might shorten learning curves and the time needed to reach competence. The objective of this study is to identify new tools that could predict the surgical performance of novices on a mammoplasty simulator. We recruited 14 medical students and recorded their performance in two skill games, Mikado and Perplexus Epic, and in two video games, Star Wars Racer (Sony PlayStation 3) and Super Monkey Ball 2 (Nintendo Wii). Then, each participant performed an augmentation mammoplasty procedure on a Mammoplasty Part-task Trainer, which allows simulation of the essential steps of the procedure. The average age of the participants was 25.4 years. Correlation studies showed significant associations between the Perplexus Epic, Star Wars Racer and Super Monkey Ball scores and the modified OSATS score, with rs = 0.8491 (p 41 (p = 0.005), and rs = 0.7309 (p < 0.003), but not with the Mikado score, rs = -0.0255 (p = 0.9). Linear regressions were strongest for the Perplexus Epic and Super Monkey Ball scores, with coefficients of determination of 0.59 and 0.55, respectively. A combined score (Perplexus/Super Monkey Ball) was computed and showed a significant correlation with the modified OSATS score, with rs = 0.8107 (p < 0.001) and R2 = 0.75, respectively. This study identified a combination of skill games that correlated with better performance of novices on a surgical simulator. With refinement, such tools could serve to help screen plastic surgery applicants and identify those with higher predictors of surgical performance. Copyright © 2018 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
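    The rs values above are Spearman rank correlations, which are simply Pearson correlations of the ranks. A from-scratch sketch (tie handling omitted for brevity; the sample data are made up):

```python
def spearman(x, y):
    """Spearman rank correlation (assumes no tied values)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = float(rank)
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Perfectly monotone data give 1.0; swapping pairs lowers the coefficient.
print(spearman([1, 2, 3, 4, 5], [10, 20, 30, 40, 50]))  # 1.0
```

    Because it works on ranks, Spearman correlation is the appropriate choice for game scores whose scales are ordinal rather than linear.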

  20. Predictive Engineering Tools for Injection-Molded Long-Carbon-Thermoplastic Composites: Weight and Cost Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Ba Nghiep [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fifield, Leonard S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gandhi, Umesh N. [Toyota Research Inst. North America, Ann Arbor, MI (United States); Mori, Steven [MAGNA Exteriors and Interiors Corporation, Aurora, ON (Canada); Wollan, Eric J. [PlastiComp, Inc., Winona, MN (United States)

    2016-08-01

    This project proposed to integrate, optimize and validate the fiber orientation and length distribution models previously developed and implemented in the Autodesk Simulation Moldflow Insight (ASMI) package for injection-molded long-carbon-fiber thermoplastic composites into a cohesive prediction capability. The current effort focused on rendering the developed models more robust and efficient for automotive industry part design to enable weight savings and cost reduction. The project goal has been achieved by optimizing the developed models, improving and integrating their implementations in ASMI, and validating them for a complex 3D LCF thermoplastic automotive part (Figure 1). Both PP and PA66 were used as resin matrices. After validating ASMI predictions of fiber orientation and fiber length for this complex part against the corresponding measured data, PNNL, in collaboration with Toyota and Magna, developed a method using the predictive engineering tool to assess LCF/PA66 complex part design in terms of stiffness performance. Structural three-point bending analyses of the complex part and of similar parts in steel were then performed for this purpose, and the team then demonstrated the use of stiffness-based complex part design assessment to evaluate weight savings relative to the body system target (≥ 35%) set in Table 2 of DE-FOA-0000648 (AOI #1). In addition, starting from the part-to-part analysis, the PE tools enabled an estimate of the weight reduction achievable for the vehicle body system using 50 wt% LCF/PA66 parts relative to the current steel system. From this analysis, an estimate of the manufacturing cost, including the material cost, for making the equivalent part in steel was also determined and compared to the costs of making the LCF/PA66 part to determine the cost per “saved” pound.

  1. MP3: a software tool for the prediction of pathogenic proteins in genomic and metagenomic data.

    Science.gov (United States)

    Gupta, Ankit; Kapil, Rohan; Dhakan, Darshan B; Sharma, Vineet K

    2014-01-01

The identification of virulent proteins in any de novo sequenced genome is useful in estimating its pathogenic potential and understanding the mechanism of pathogenesis. Similarly, the identification of such proteins could be valuable in comparing the metagenomes of healthy and diseased individuals and estimating the proportion of pathogenic species. The common challenge in both tasks is the identification of virulent proteins, since a significant proportion of genomic and metagenomic proteins are novel and as yet unannotated. The currently available tools for identifying virulent proteins provide limited accuracy and cannot be used on large datasets. Therefore, we have developed MP3, a standalone tool and web server for the prediction of pathogenic proteins in both genomic and metagenomic datasets. MP3 uses an integrated Support Vector Machine (SVM) and Hidden Markov Model (HMM) approach to carry out fast, sensitive and accurate prediction of pathogenic proteins. It displayed Sensitivity, Specificity, MCC and accuracy values of 92%, 100%, 0.92 and 96%, respectively, on a blind dataset constructed using complete proteins. On the two metagenomic blind datasets (Blind A: 51-100 amino acids and Blind B: 30-50 amino acids), it displayed Sensitivity, Specificity, MCC and accuracy values of 82.39%, 97.86%, 0.80 and 89.32% for Blind A and 71.60%, 94.48%, 0.67 and 81.86% for Blind B, respectively. In addition, the performance of MP3 was validated on selected bacterial genomic and real metagenomic datasets. To our knowledge, MP3 is the only program that specializes in fast and accurate identification of partial pathogenic proteins predicted from short (100-150 bp) metagenomic reads and also performs exceptionally well on complete protein sequences. MP3 is publicly available at http://metagenomics.iiserb.ac.in/mp3/index.php.
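
The reported performance figures follow from standard confusion-matrix formulas. The counts below are illustrative only, chosen so the resulting values mirror the 92%/100%/0.92/96% reported for the complete-protein blind set; they are not the MP3 evaluation data.

```python
import math

def binary_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, MCC and accuracy from confusion-matrix counts."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    acc = (tp + tn) / (tp + fp + tn + fn)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom if denom else 0.0
    return sens, spec, mcc, acc

# Illustrative counts: 92 of 100 positives recovered, no false positives
# among 100 negatives.
sens, spec, mcc, acc = binary_metrics(tp=92, fp=0, tn=100, fn=8)
print(f"Sens={sens:.2f} Spec={spec:.2f} MCC={mcc:.2f} Acc={acc:.2f}")
```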

  2. Screening Tool for Early Postnatal Prediction of Retinopathy of Prematurity in Preterm Newborns (STEP-ROP).

    Science.gov (United States)

    Ricard, Caroline A; Dammann, Christiane E L; Dammann, Olaf

    2017-01-01

Retinopathy of prematurity (ROP) is a disorder of the preterm newborn characterized by neurovascular disruption in the immature retina that may cause visual impairment and blindness. Our objective was to develop a clinical screening tool for early postnatal prediction of ROP in preterm newborns based on risk information available within the first 48 h of postnatal life. Using data submitted to the Vermont Oxford Network (VON) between 1995 and 2015, we created logistic regression models based on infants born at <28 completed weeks gestational age. We developed a model with 60% of the data and identified birth weight, gestational age, respiratory distress syndrome, non-Hispanic ethnicity, and multiple gestation as predictors of ROP. We tested the model in the remaining 40%, performed tenfold cross-validation, and tested the score in ELGAN study data. Of the 1,052 newborns in the VON database, 627 had a recorded ROP status. Forty percent had no ROP, 40% had mild ROP (stages 1 and 2), and 20% had severe ROP (stages 3-5). We created a weighted score to predict any ROP based on the multivariable regression model. A cutoff score of 5 had the best sensitivity (95%, 95% CI 93-97) while maintaining a strong positive predictive value (PPV; 63%, 95% CI 57-68). When applied to the ELGAN data, sensitivity was lower (72%, 95% CI 69-75), but PPV was higher (80%, 95% CI 77-83). STEP-ROP is a promising screening tool. It is easy to calculate, does not rely on extensive postnatal data collection, and can be calculated early after birth. Early ROP screening may help physicians limit patient exposure to additional risk factors and may be useful for risk stratification in clinical trials aimed at reducing ROP. © 2017 S. Karger AG, Basel.
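
A weighted screening score with a cutoff, as described above, is evaluated by its sensitivity and positive predictive value. A minimal sketch with made-up scores and outcomes (not the VON data):

```python
def screen(scores, labels, cutoff):
    """Sensitivity and PPV of flagging score >= cutoff as at-risk."""
    tp = sum(1 for s, y in zip(scores, labels) if s >= cutoff and y)
    fn = sum(1 for s, y in zip(scores, labels) if s < cutoff and y)
    fp = sum(1 for s, y in zip(scores, labels) if s >= cutoff and not y)
    sens = tp / (tp + fn) if tp + fn else 0.0
    ppv = tp / (tp + fp) if tp + fp else 0.0
    return sens, ppv

# Toy data: weighted risk scores and ROP outcomes (1 = any ROP).
scores = [2, 4, 5, 6, 7, 8, 3, 5, 9, 1]
labels = [0, 0, 1, 1, 1, 1, 0, 0, 1, 0]
print(screen(scores, labels, cutoff=5))
```

Sweeping the cutoff and repeating this calculation is how the "best" cutoff of 5 would be chosen in practice.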

  3. Experimental and Mathematical Modeling for Prediction of Tool Wear on the Machining of Aluminium 6061 Alloy by High Speed Steel Tools

    Directory of Open Access Journals (Sweden)

    Okokpujie Imhade Princess

    2017-12-01

Full Text Available In recent machining operations, tool life is one of the most demanding concerns in the production process, especially in the automotive industry. The aim of this paper is to study tool wear on HSS tools in end milling of aluminium 6061 alloy. Experiments were carried out to investigate tool wear as a function of the machining parameters and to develop a mathematical model using response surface methodology. The machining parameters selected for the experiment were spindle speed (N), feed rate (f), axial depth of cut (a) and radial depth of cut (r). The experiment was designed using a central composite design (CCD) in which 31 samples were run on a SIEG 3/10/0010 CNC end milling machine. After each experiment the cutting tool wear was measured using a scanning electron microscope (SEM). The optimum machining parameter combination of spindle speed 2500 rpm, feed rate 200 mm/min, axial depth of cut 20 mm, and radial depth of cut 1.0 mm was found to achieve the minimum tool wear of 0.213 mm. The mathematical model predicted the tool wear with 99.7% accuracy, which is within the acceptable range for tool wear prediction.

  4. Experimental and Mathematical Modeling for Prediction of Tool Wear on the Machining of Aluminium 6061 Alloy by High Speed Steel Tools

    Science.gov (United States)

    Okokpujie, Imhade Princess; Ikumapayi, Omolayo M.; Okonkwo, Ugochukwu C.; Salawu, Enesi Y.; Afolalu, Sunday A.; Dirisu, Joseph O.; Nwoke, Obinna N.; Ajayi, Oluseyi O.

    2017-12-01

In recent machining operations, tool life is one of the most demanding concerns in the production process, especially in the automotive industry. The aim of this paper is to study tool wear on HSS tools in end milling of aluminium 6061 alloy. Experiments were carried out to investigate tool wear as a function of the machining parameters and to develop a mathematical model using response surface methodology. The machining parameters selected for the experiment were spindle speed (N), feed rate (f), axial depth of cut (a) and radial depth of cut (r). The experiment was designed using a central composite design (CCD) in which 31 samples were run on a SIEG 3/10/0010 CNC end milling machine. After each experiment the cutting tool wear was measured using a scanning electron microscope (SEM). The optimum machining parameter combination of spindle speed 2500 rpm, feed rate 200 mm/min, axial depth of cut 20 mm, and radial depth of cut 1.0 mm was found to achieve the minimum tool wear of 0.213 mm. The mathematical model predicted the tool wear with 99.7% accuracy, which is within the acceptable range for tool wear prediction.
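
The response-surface approach described above fits a second-order polynomial to the measured wear by least squares. A minimal sketch with synthetic data and only two of the four factors; the coefficients and noise level are invented, not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data standing in for measured tool wear (mm) at spindle speed N (rpm)
# and feed rate f (mm/min); the real study used four factors in a CCD design.
N = rng.uniform(1000, 3000, 31)
f = rng.uniform(100, 300, 31)
wear = 0.5 - 1e-4 * N + 1e-3 * f + 2e-8 * N**2 + rng.normal(0, 0.005, 31)

# Second-order response surface: w = b0 + b1*N + b2*f + b3*N^2 + b4*f^2 + b5*N*f
X = np.column_stack([np.ones_like(N), N, f, N**2, f**2, N * f])
coef, *_ = np.linalg.lstsq(X, wear, rcond=None)
pred = X @ coef
r2 = 1 - np.sum((wear - pred) ** 2) / np.sum((wear - wear.mean()) ** 2)
print(f"R^2 = {r2:.3f}")
```

Minimizing the fitted polynomial over the design region then yields the optimum parameter combination.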

  5. Sediment predictions in Wadi Al-Naft using soil water assessment tool

    Directory of Open Access Journals (Sweden)

    Alwan Imzahim Abdulkareem

    2018-01-01

Full Text Available Sediment production is the amount of sediment per unit area transported out of a basin by water over a specified period of time. The main aim of the present study is to predict the sediment yield of the Wadi Al-Naft watershed, with an area of 8,820 km2, located in the north-east of Diyala Governorate in Iraq, using the Soil and Water Assessment Tool (SWAT), and to predict the impact of land management. The input data, including the land use, soil type, and soil texture maps, were obtained from Landsat-8 satellite imagery. A Digital Elevation Model (DEM) with 14 × 14 m resolution was used to delineate the watershed with the aid of the model. Three Landsat images were used to cover the study area; they were mosaic-processed and the study area was masked out from the mosaic image. The study area was registered in ArcGIS 10.2, and the soil hydrologic groups were digitized with the aid of the Soil-Plant-Air-Water (SPAW) model developed by the USDA Agricultural Research Service, using soil texture and organic matter data from the Food and Agriculture Organization (FAO) together with available water content, saturated hydraulic conductivity, and bulk density. The average sediment depth and the maximum upland sediment for the simulation period (2010-2020) were predicted to be 1.7 mm and 12.57 Mg/ha, respectively.

  6. A Tool for Predicting Regulatory Approval After Phase II Testing of New Oncology Compounds.

    Science.gov (United States)

    DiMasi, J A; Hermann, J C; Twyman, K; Kondru, R K; Stergiopoulos, S; Getz, K A; Rackoff, W

    2015-11-01

    We developed an algorithm (ANDI) for predicting regulatory marketing approval for new cancer drugs after phase II testing has been conducted, with the objective of providing a tool to improve drug portfolio decision-making. We examined 98 oncology drugs from the top 50 pharmaceutical companies (2006 sales) that first entered clinical development from 1999 to 2007, had been taken to at least phase II development, and had a known final outcome (research abandonment or regulatory marketing approval). Data on safety, efficacy, operational, market, and company characteristics were obtained from public sources. Logistic regression and machine-learning methods were used to provide an unbiased approach to assess overall predictability and to identify the most important individual predictors. We found that a simple four-factor model (activity, number of patients in the pivotal phase II trial, phase II duration, and a prevalence-related measure) had high sensitivity and specificity for predicting regulatory marketing approval. © 2015 American Society for Clinical Pharmacology and Therapeutics.
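
A four-factor logistic model of the kind described can be sketched as follows; the data and coefficients are synthetic stand-ins, not the ANDI training set:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins for the four factors (activity, pivotal phase II patient
# count, phase II duration, prevalence-related measure), standardized;
# y = 1 means regulatory approval.
X = rng.normal(size=(200, 4))
true_w = np.array([1.5, 0.8, -0.6, 0.4])
y = (X @ true_w + rng.normal(0, 0.5, 200) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Plain gradient descent on the logistic log-loss.
w, b = np.zeros(4), 0.0
for _ in range(2000):
    p = sigmoid(X @ w + b)
    w -= 0.1 * X.T @ (p - y) / len(y)
    b -= 0.1 * np.mean(p - y)

acc = np.mean((sigmoid(X @ w + b) > 0.5) == y)
print(f"training accuracy = {acc:.2f}")
```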

  7. Human Splicing Finder: an online bioinformatics tool to predict splicing signals.

    Science.gov (United States)

    Desmet, François-Olivier; Hamroun, Dalil; Lalande, Marine; Collod-Béroud, Gwenaëlle; Claustres, Mireille; Béroud, Christophe

    2009-05-01

    Thousands of mutations are identified yearly. Although many directly affect protein expression, an increasing proportion of mutations is now believed to influence mRNA splicing. They mostly affect existing splice sites, but synonymous, non-synonymous or nonsense mutations can also create or disrupt splice sites or auxiliary cis-splicing sequences. To facilitate the analysis of the different mutations, we designed Human Splicing Finder (HSF), a tool to predict the effects of mutations on splicing signals or to identify splicing motifs in any human sequence. It contains all available matrices for auxiliary sequence prediction as well as new ones for binding sites of the 9G8 and Tra2-beta Serine-Arginine proteins and the hnRNP A1 ribonucleoprotein. We also developed new Position Weight Matrices to assess the strength of 5' and 3' splice sites and branch points. We evaluated HSF efficiency using a set of 83 intronic and 35 exonic mutations known to result in splicing defects. We showed that the mutation effect was correctly predicted in almost all cases. HSF could thus represent a valuable resource for research, diagnostic and therapeutic (e.g. therapeutic exon skipping) purposes as well as for global studies, such as the GEN2PHEN European Project or the Human Variome Project.
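
Position Weight Matrices of the kind HSF uses for 5' and 3' splice sites score a candidate sequence by summing per-position log-odds against background base frequencies. A toy example with an invented 4-base matrix; real splice-site matrices are longer and are built from counts of aligned known sites:

```python
import math

# Toy position weight matrix for a 4-base motif (one dict per position).
# These frequencies are invented for illustration only.
pwm = [
    {"A": 0.7, "C": 0.1, "G": 0.1, "T": 0.1},
    {"A": 0.1, "C": 0.1, "G": 0.7, "T": 0.1},
    {"A": 0.1, "C": 0.1, "G": 0.7, "T": 0.1},
    {"A": 0.1, "C": 0.1, "G": 0.1, "T": 0.7},
]
background = 0.25  # uniform background base frequency

def pwm_score(seq):
    """Log-odds score of seq against the matrix (higher = stronger site)."""
    return sum(math.log2(pos[base] / background) for pos, base in zip(pwm, seq))

print(pwm_score("AGGT"))  # consensus: positive score
print(pwm_score("CCCC"))  # poor match: negative score
```

Comparing the score of a mutant sequence against the wild type is the basic operation behind predicting whether a mutation weakens or creates a splice site.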

  8. Advanced Online Survival Analysis Tool for Predictive Modelling in Clinical Data Science.

    Science.gov (United States)

    Montes-Torres, Julio; Subirats, José Luis; Ribelles, Nuria; Urda, Daniel; Franco, Leonardo; Alba, Emilio; Jerez, José Manuel

    2016-01-01

One of the prevailing applications of machine learning is the use of predictive modelling in clinical survival analysis. In this work, we present our view of the current situation of computer tools for survival analysis, stressing the need to transfer the latest results in the field of machine learning to biomedical researchers. We propose a web-based software for survival analysis called OSA (Online Survival Analysis), which has been developed as an open-access and user-friendly option to obtain discrete-time, predictive survival models at the individual level using machine learning techniques, and to perform standard survival analysis. OSA employs an Artificial Neural Network (ANN) based method to produce the predictive survival models. Additionally, the software can easily generate survival and hazard curves with multiple options to personalise the plots, obtain contingency tables from the uploaded data to perform different tests, and fit a Cox regression model from a number of predictor variables. In the Materials and Methods section, we depict the general architecture of the application and introduce the mathematical background of each of the implemented methods. The study concludes with examples of use showing the results obtained with public datasets.

  9. CaFE: a tool for binding affinity prediction using end-point free energy methods.

    Science.gov (United States)

    Liu, Hui; Hou, Tingjun

    2016-07-15

Accurate prediction of binding free energy is of particular importance to computational biology and structure-based drug design. Among the methods for binding affinity prediction, the end-point approaches, such as MM/PBSA and LIE, have been widely used because they achieve a good balance between prediction accuracy and computational cost. Here we present an easy-to-use pipeline tool named Calculation of Free Energy (CaFE) to conduct MM/PBSA and LIE calculations. Powered by the VMD and NAMD programs, CaFE is able to handle numerous static coordinate and molecular dynamics trajectory file formats generated by different molecular simulation packages and supports various force field parameters. CaFE is a VMD plugin written in Tcl and its usage is platform-independent. Source code and documentation are freely available under the GNU General Public License via GitHub at https://github.com/huiliucode/cafe_plugin. Contact: tingjunhou@zju.edu.cn. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  10. Optical coherence tomography: a potential tool to predict premature rupture of fetal membranes.

    Science.gov (United States)

    Micili, Serap C; Valter, Markus; Oflaz, Hakan; Ozogul, Candan; Linder, Peter; Föckler, Nicole; Artmann, Gerhard M; Digel, Ilya; Artmann, Aysegul T

    2013-04-01

A fundamental question addressed in this study was the feasibility of preterm birth prediction based on a noncontact investigation of fetal membranes in situ. Although the phenomena of preterm birth and premature rupture of the fetal membranes are well known, there are currently no diagnostic tools for their prediction. The aim of this study was to assess whether optical coherence tomography could be used for clinical investigations of high-risk pregnancies. The thickness of fetal membranes was measured in parallel by optical coherence tomography and histological techniques for the following types of birth: normal births, preterm births without premature rupture, and births at full term with premature rupture of membranes. Our study revealed that membrane thickness correlates with the birth type. Membranes from normal births were statistically significantly thicker than those belonging to the other two groups. Thus, in spite of the almost equal duration of gestation in normal births and births at full term with premature rupture, the corresponding membrane thicknesses differed. This difference is possibly related to previously reported water accumulation in the membranes. The optical coherence tomography results were encouraging, suggesting that this technology could be used in the future to predict and distinguish between different kinds of births.

  11. Soil and Water Assessment Tool model predictions of annual maximum pesticide concentrations in high vulnerability watersheds.

    Science.gov (United States)

    Winchell, Michael F; Peranginangin, Natalia; Srinivasan, Raghavan; Chen, Wenlin

    2018-05-01

Recent national regulatory assessments of potential pesticide exposure of threatened and endangered species in aquatic habitats have led to increased need for watershed-scale predictions of pesticide concentrations in flowing water bodies. This study was conducted to assess the ability of the uncalibrated Soil and Water Assessment Tool (SWAT) to predict annual maximum pesticide concentrations in the flowing water bodies of highly vulnerable small- to medium-sized watersheds. The SWAT was applied to 27 watersheds, largely within the midwest corn belt of the United States, ranging from 20 to 386 km2, and evaluated using consistent input data sets and an uncalibrated parameterization approach. The watersheds were selected from the Atrazine Ecological Exposure Monitoring Program and the Heidelberg Tributary Loading Program, both of which contain high temporal resolution atrazine sampling data from watersheds with exceptionally high vulnerability to atrazine exposure. The model performance was assessed based upon predictions of annual maximum atrazine concentrations in 1-d and 60-d durations, predictions critical in pesticide-threatened and endangered species risk assessments when evaluating potential acute and chronic exposure to aquatic organisms. The simulation results showed that for nearly half of the watersheds simulated, the uncalibrated SWAT model was able to predict annual maximum pesticide concentrations within a narrow range of uncertainty resulting from atrazine application timing patterns. An uncalibrated model's predictive performance is essential for the assessment of pesticide exposure in flowing water bodies, the majority of which have insufficient monitoring data for direct calibration, even in data-rich countries. In situations in which SWAT over- or underpredicted the annual maximum concentrations, the magnitude of the over- or underprediction was commonly less than a factor of 2, indicating that the model and uncalibrated parameterization
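
The "within a factor of 2" criterion used above is a simple multiplicative bound on model error. A sketch with hypothetical predicted/observed pairs (not the study's monitoring data):

```python
def within_factor(pred, obs, factor=2.0):
    """True if the prediction is within the given multiplicative factor of obs."""
    return obs / factor <= pred <= obs * factor

# Hypothetical annual-maximum concentrations: (predicted, observed) pairs.
pairs = [(3.1, 2.0), (0.8, 1.5), (10.0, 4.0), (2.2, 2.0)]
hits = sum(within_factor(p, o) for p, o in pairs)
print(f"{hits}/{len(pairs)} within a factor of 2")
```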

  12. Pediatric Eating Assessment Tool-10 as an indicator to predict aspiration in children with esophageal atresia.

    Science.gov (United States)

    Soyer, Tutku; Yalcin, Sule; Arslan, Selen Serel; Demir, Numan; Tanyel, Feridun Cahit

    2017-10-01

Airway aspiration is a common problem in children with esophageal atresia (EA). The Pediatric Eating Assessment Tool-10 (pEAT-10) is a self-administered questionnaire to evaluate dysphagia symptoms in children. A prospective study was performed to evaluate the validity of pEAT-10 for predicting aspiration in children with EA. Patients with EA were evaluated for age, sex, type of atresia, presence of associated anomalies, type of esophageal repair, time of definitive treatment, and the beginning of oral feeding. The penetration-aspiration score (PAS) was evaluated with videofluoroscopy (VFS), and parents were surveyed for pEAT-10, dysphagia score (DS) and functional oral intake scale (FOIS). PAS scores greater than 7 were considered to indicate risk of aspiration, and pEAT-10 values greater than 3 were considered abnormal. Higher DS scores indicate dysphagia, whereas higher FOIS scores indicate better feeding ability. Forty patients were included. Children with PAS greater than 7 constituted the PAS+ group, and those with lower scores the PAS- group. Demographic features and results of surgical treatment showed no difference between groups (p>0.05). The median PAS, pEAT-10 and DS scores were significantly higher in the PAS+ group than in the PAS- group (p<0.05). The sensitivity and specificity of pEAT-10 to predict aspiration were 88% and 77%, and the positive and negative predictive values were 22% and 11%, respectively. Type-C cases had better pEAT-10 and FOIS scores than type-A cases, and both scores were more reliable after primary repair than after delayed repair (p<0.05). Among the postoperative complications, only leakage had an impact on DS, pEAT-10, PAS and FOIS scores (p<0.05). The pEAT-10 is a valid, simple and reliable tool to predict aspiration in children. Patients with higher pEAT-10 scores should undergo detailed evaluation of deglutitive function and assessment of aspiration risk to improve safer feeding strategies. Level II (Development of

  13. Model predictive control as a tool for improving the process operation of MSW combustion plants

    International Nuclear Information System (INIS)

    Leskens, M.; Kessel, L.B.M. van; Bosgra, O.H.

    2005-01-01

    In this paper a feasibility study is presented on the application of the advanced control strategy called model predictive control (MPC) as a tool for obtaining improved process operation performance for municipal solid waste (MSW) combustion plants. The paper starts with a discussion of the operational objectives and control of such plants, from which a motivation follows for applying MPC to them. This is followed by a discussion on the basic idea behind this advanced control strategy. After that, an MPC-based combustion control system is proposed aimed at tackling a typical MSW combustion control problem and, using this proposed control system, an assessment is made of the improvement in performance that an MPC-based MSW combustion control system can provide in comparison to conventional MSW combustion control systems. This assessment is based on simulations using an experimentally obtained process and disturbance model of a real-life large-scale MSW combustion plant
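
The receding-horizon idea behind MPC (optimize a control sequence over a short horizon against a process model, apply only the first move, then re-plan from the new measurement) can be sketched for a toy scalar plant. The plant and all numbers below are invented; the paper's controller uses an experimentally identified multivariable process and disturbance model of a real MSW combustion plant.

```python
import itertools

# Toy scalar linear plant x+ = a*x + b*u tracking a setpoint (e.g. a steam
# production target). Model and numbers are invented for illustration.
a, b = 0.9, 0.5
ref, horizon = 1.0, 3
candidates = [-1.0, -0.5, 0.0, 0.5, 1.0]   # admissible control moves

def best_move(x):
    """Pick the first input of the candidate sequence minimizing tracking cost."""
    def cost(seq):
        total, xk = 0.0, x
        for u in seq:
            xk = a * xk + b * u
            total += (xk - ref) ** 2 + 0.01 * u ** 2
        return total
    return min(itertools.product(candidates, repeat=horizon), key=cost)[0]

x, history = 0.0, []
for _ in range(10):
    u = best_move(x)          # optimize over the horizon...
    x = a * x + b * u         # ...apply only the first move, then re-plan
    history.append(x)
print(f"final state = {history[-1]:.3f}")
```

Real MPC solves the horizon problem with a quadratic program and handles input/output constraints, but the re-planning loop is the same.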

  14. A model of integration among prediction tools: applied study to road freight transportation

    Directory of Open Access Journals (Sweden)

    Henrique Dias Blois

Full Text Available Abstract This study developed a scenario analysis model integrating decision-making tools for investment: prospective scenarios (Grumbach Method) and system dynamics (hard modeling), combined with multivariate analysis of expert opinion. Designed through scenario analysis and simulation, the model shows which events most strongly affect the object of study and highlights the actions that could redirect the future of the analyzed system. Moreover, predictions can be developed from the generated scenarios. The model was validated empirically with road freight transport data from the state of Rio Grande do Sul, Brazil. The results showed that the model contributes to investment analysis because it identifies the probabilities of events that impact decision making and identifies priorities for action, reducing uncertainty about the future. It also allows an interdisciplinary discussion that correlates different areas of knowledge, which is fundamental when greater consistency in scenario creation is desired.

  15. A low-dimensional tool for predicting force decomposition coefficients for varying inflow conditions

    KAUST Repository

    Ghommem, Mehdi; Akhtar, Imran; Hajj, M. R.

    2013-01-01

    We develop a low-dimensional tool to predict the effects of unsteadiness in the inflow on force coefficients acting on a circular cylinder using proper orthogonal decomposition (POD) modes from steady flow simulations. The approach is based on combining POD and linear stochastic estimator (LSE) techniques. We use POD to derive a reduced-order model (ROM) to reconstruct the velocity field. To overcome the difficulty of developing a ROM using Poisson's equation, we relate the pressure field to the velocity field through a mapping function based on LSE. The use of this approach to derive force decomposition coefficients (FDCs) under unsteady mean flow from basis functions of the steady flow is illustrated. For both steady and unsteady cases, the final outcome is a representation of the lift and drag coefficients in terms of velocity and pressure temporal coefficients. Such a representation could serve as the basis for implementing control strategies or conducting uncertainty quantification. Copyright © 2013 Inderscience Enterprises Ltd.
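
POD modes of a snapshot ensemble are commonly computed as the left singular vectors of the mean-subtracted snapshot matrix, with the squared singular values ranking the energy each mode captures. A minimal sketch on synthetic snapshots (not the cylinder-flow simulation data):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic snapshot matrix: each column is a "velocity field" at one instant.
# Two coherent structures plus noise stand in for flow-simulation data.
n_points, n_snaps = 200, 50
t = np.linspace(0, 2 * np.pi, n_snaps)
x = np.linspace(0, 1, n_points)[:, None]
snapshots = (np.sin(2 * np.pi * x) * np.cos(t)
             + 0.3 * np.sin(4 * np.pi * x) * np.sin(2 * t)
             + 0.01 * rng.normal(size=(n_points, n_snaps)))

# POD modes = left singular vectors of the mean-subtracted snapshots;
# squared singular values give the energy captured by each mode.
mean = snapshots.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(snapshots - mean, full_matrices=False)
energy = s**2 / np.sum(s**2)
print(f"first two modes capture {energy[:2].sum():.1%} of the energy")
```

Truncating to the few highest-energy modes yields the reduced-order basis on which a ROM like the one described is built.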

  16. Research-Based Monitoring, Prediction, and Analysis Tools of the Spacecraft Charging Environment for Spacecraft Users

    Science.gov (United States)

    Zheng, Yihua; Kuznetsova, Maria M.; Pulkkinen, Antti A.; Maddox, Marlo M.; Mays, Mona Leila

    2015-01-01

The Space Weather Research Center (http://swrc.gsfc.nasa.gov) at NASA Goddard, part of the Community Coordinated Modeling Center (http://ccmc.gsfc.nasa.gov), is committed to providing research-based forecasts and notifications to address NASA's space weather needs, in addition to its critical role in space weather education. It provides a host of services including spacecraft anomaly resolution, historical impact analysis, real-time monitoring and forecasting, tailored space weather alerts and products, and weekly summaries and reports. In this paper, we focus on how (near) real-time data (both in space and on the ground), in combination with modeling capabilities and an innovative dissemination system called the integrated Space Weather Analysis system (http://iswa.gsfc.nasa.gov), enable monitoring, analyzing, and predicting the spacecraft charging environment for spacecraft users. Relevant tools and resources are discussed.

  17. Predicting risk and outcomes for frail older adults: an umbrella review of frailty screening tools

    Science.gov (United States)

    Apóstolo, João; Cooke, Richard; Bobrowicz-Campos, Elzbieta; Santana, Silvina; Marcucci, Maura; Cano, Antonio; Vollenbroek-Hutten, Miriam; Germini, Federico; Holland, Carol

    2017-01-01

Background: A scoping search identified systematic reviews on the diagnostic accuracy and predictive ability of frailty measures in older adults. In most cases, research was confined to specific assessment measures related to a specific clinical model. Objectives: To summarize the best available evidence from systematic reviews in relation to reliability, validity, diagnostic accuracy and predictive ability of frailty measures in older adults. Inclusion criteria - Population: Older adults aged 60 years or older recruited from community, primary care, long-term residential care and hospitals. Index test: Available frailty measures in older adults. Reference test: Cardiovascular Health Study phenotype model, the Canadian Study of Health and Aging cumulative deficit model, Comprehensive Geriatric Assessment or other reference tests. Diagnosis of interest: Frailty, defined as an age-related state of decreased physiological reserve characterized by an increased risk of poor clinical outcomes. Types of studies: Quantitative systematic reviews. Search strategy: A three-step search strategy was utilized to find systematic reviews, available in English, published between January 2001 and October 2015. Methodological quality: Assessed by two independent reviewers using the Joanna Briggs Institute critical appraisal checklist for systematic reviews and research synthesis. Data extraction: Two independent reviewers extracted data using the standardized data extraction tool designed for umbrella reviews. Data synthesis: Data were presented in narrative form only, due to the heterogeneity of the included reviews. Results: Five reviews with a total of 227,381 participants were included in this umbrella review. Two reviews focused on reliability, validity and diagnostic accuracy; two examined predictive ability for adverse health outcomes; and one investigated validity, diagnostic accuracy and predictive ability. In total, 26 questionnaires and brief assessments and eight frailty

  18. Acceptability of the Predicting Abusive Head Trauma (PredAHT) clinical prediction tool: A qualitative study with child protection professionals.

    Science.gov (United States)

    Cowley, Laura E; Maguire, Sabine; Farewell, Daniel M; Quinn-Scoggins, Harriet D; Flynn, Matthew O; Kemp, Alison M

    2018-05-09

The validated Predicting Abusive Head Trauma (PredAHT) tool estimates the probability of abusive head trauma (AHT) based on combinations of six clinical features: head/neck bruising; apnea; seizures; rib fractures; long-bone fractures; and retinal hemorrhages. We aimed to determine the acceptability of PredAHT to child protection professionals. We conducted qualitative semi-structured interviews with 56 participants: clinicians (25), child protection social workers (10), legal practitioners (9, including 4 judges), police officers (8), and pathologists (4), purposively sampled across the southwest United Kingdom. Interviews were recorded, transcribed and imported into NVivo for thematic analysis (38% double-coded). We explored participants' evaluations of PredAHT, their opinions about the optimal way to present the calculated probabilities, and their interpretation of probabilities in the context of suspected AHT. Clinicians, child protection social workers and police thought PredAHT would be beneficial as an objective adjunct to their professional judgment, giving them greater confidence in their decisions. Lawyers and pathologists appreciated its value for prompting multidisciplinary investigations, but were uncertain of its usefulness in court. Perceived disadvantages included possible over-reliance and false reassurance from a low score. Interpretations of which percentages equate to a 'low', 'medium' or 'high' likelihood of AHT varied; participants preferred a precise percentage probability over these general terms. Participants would use PredAHT with provisos: if they received multi-agency training to define accepted risk thresholds for consistent interpretation, with knowledge of its development, and if it was accepted by colleagues. PredAHT may therefore increase professionals' confidence in their decision-making when investigating suspected AHT, but may be of less value in court. Copyright © 2018 Elsevier Ltd. All rights reserved.

  19. External validation of approaches to prediction of falls during hospital rehabilitation stays and development of a new simpler tool

    Directory of Open Access Journals (Sweden)

    Angela Vratsistas-Curto

    2017-12-01

Full Text Available Objectives: To test the external validity of 4 approaches to fall prediction in a rehabilitation setting (Predict_FIRST, the Ontario Modified STRATIFY (OMS), physiotherapists' judgement of fall risk (PT_Risk), and falls in the past year (Past_Falls)), and to develop and test the validity of a simpler tool for fall prediction in rehabilitation (Predict_CM2). Participants: A total of 300 consecutively admitted rehabilitation inpatients. Methods: Prospective inception cohort study. Falls during the rehabilitation stay were monitored, and potential predictors were extracted from medical records. Results: Forty-one patients (14%) fell during their rehabilitation stay. The external validity, area under the receiver operating characteristic curve (AUC), for predicting future fallers was 0.71 (95% confidence interval (CI): 0.61–0.81) for OMS (Total_Score); 0.66 (95% CI: 0.57–0.74) for Predict_FIRST; 0.65 (95% CI: 0.57–0.73) for PT_Risk; and 0.52 (95% CI: 0.46–0.60) for Past_Falls. A simple 3-item tool (Predict_CM2) was developed from the most predictive individual items (impaired mobility/transfer ability, impaired cognition, and male sex). The accuracy of Predict_CM2 was 0.73 (95% CI: 0.66–0.81), comparable to OMS (Total_Score) (p = 0.52), significantly better than Predict_FIRST (p = 0.04) and Past_Falls (p < 0.001), and approaching significantly better than PT_Risk (p = 0.09). Conclusion: Predict_CM2 is a simpler screening tool with accuracy for predicting fallers in rehabilitation similar to OMS (Total_Score) and better than Predict_FIRST or Past_Falls. External validation of Predict_CM2 is required.
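
The AUC values reported above can be computed directly as the Mann-Whitney probability that a randomly chosen faller outscores a randomly chosen non-faller. A sketch with toy 0-3 scores of the Predict_CM2 kind (one point per item); the data are invented:

```python
from itertools import product

def auc(scores, labels):
    """AUC as the probability a random positive outscores a random negative
    (Mann-Whitney U statistic), counting ties as half."""
    pos = [s for s, y in zip(scores, labels) if y]
    neg = [s for s, y in zip(scores, labels) if not y]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p, n in product(pos, neg))
    return wins / (len(pos) * len(neg))

# Toy 0-3 screening scores against observed falls (1 = fell).
scores = [3, 2, 2, 1, 0, 1, 0, 2]
labels = [1, 1, 0, 0, 0, 1, 0, 0]
print(f"AUC = {auc(scores, labels):.2f}")
```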

  20. Application of the Streamflow Prediction Tool to Estimate Sediment Dredging Volumes in Texas Coastal Waterways

    Science.gov (United States)

    Yeates, E.; Dreaper, G.; Afshari, S.; Tavakoly, A. A.

    2017-12-01

    Over the past six fiscal years, the United States Army Corps of Engineers (USACE) has contracted an average of about a billion dollars per year for navigation channel dredging. To execute these funds effectively, USACE Districts must determine which navigation channels need to be dredged in a given year. Improving this prioritization process results in more efficient waterway maintenance. This study uses the Streamflow Prediction Tool, a runoff routing model based on global weather forecast ensembles, to estimate dredged volumes. This study establishes regional linear relationships between cumulative flow and dredged volumes over a long-term simulation covering 30 years (1985-2015), using drainage area and shoaling parameters. The study framework integrates the National Hydrography Dataset (NHDPlus Dataset) with parameters from the Corps Shoaling Analysis Tool (CSAT) and dredging record data from USACE District records. Results in the test cases of the Houston Ship Channel and the Sabine and Port Arthur Harbor waterways in Texas indicate positive correlation between the simulated streamflows and actual dredging records.
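
    The regional linear relationship between cumulative flow and dredged volume used in this study can be sketched as an ordinary least-squares line; the variable names, units, and data points below are hypothetical illustrations, not values from the study.

```python
def fit_line(x, y):
    """Ordinary least-squares fit y ≈ a + b*x (closed-form slope/intercept)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

# Hypothetical data: cumulative annual flow (km^3) vs dredged volume (1000 m^3)
flow = [10.0, 14.0, 9.0, 18.0, 12.0]
volume = [210.0, 295.0, 190.0, 380.0, 255.0]
a, b = fit_line(flow, volume)
predicted_volume = a + b * 15.0  # estimated dredging need for a 15 km^3 year
```

    A fitted line of this form lets a District rank waterways by expected shoaling from simulated streamflow alone.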

  1. Computational tools for genome-wide miRNA prediction and study

    KAUST Repository

    Malas, T.B.; Ravasi, Timothy

    2012-01-01

MicroRNAs (miRNAs) are single-stranded non-coding RNAs, usually 22 nucleotides in length, that play an important post-transcriptional regulatory role in many organisms. MicroRNAs bind via a seed sequence to the 3′-untranslated region (UTR) of the target messenger RNA (mRNA), inducing degradation or inhibition of translation and resulting in a reduction in the protein level. This regulatory mechanism is central to many biological processes, and its perturbation could lead to diseases such as cancer. Given the biological importance of miRNAs, there is a great need to identify and study their targets and functions. However, miRNAs are very difficult to clone in the lab, and this has hindered the identification of novel miRNAs. Next-generation sequencing coupled with new computational tools has recently evolved to help researchers efficiently identify large numbers of novel miRNAs. In this review, we describe recent miRNA prediction tools and discuss their properties, advantages and disadvantages. Malas and Ravasi.

  2. Ecotoxicity on a stick: A novel analytical tool for predicting the ecotoxicity of petroleum contaminated samples

    International Nuclear Information System (INIS)

    Parkerton, T.F.; Stone, M.A.

    1995-01-01

Hydrocarbons generally elicit toxicity via a nonpolar narcotic mechanism. Recent research suggests that chemicals acting by this mode invoke ecotoxicity when the molar concentration in organism lipids exceeds a critical threshold. Since the ecotoxicity of nonpolar narcotic mixtures appears to be additive, the ecotoxicity of hydrocarbon mixtures thus depends upon: (1) the partitioning of the individual hydrocarbons comprising the mixture from the environment to lipids and (2) the total molar sum of the constituent hydrocarbons in lipids. These insights have led previous investigators to advance the concept of biomimetic extraction as a novel tool for assessing potential narcosis-type or baseline ecotoxicity in aqueous samples. Drawing from this earlier work, the authors have developed a method to quantify Bioavailable Petroleum Hydrocarbons (BPHs) in hydrocarbon-contaminated aqueous and soil/sediment samples. A sample is equilibrated with a solid-phase microextraction (SPME) fiber that serves as a surrogate for organism lipids. The total moles of hydrocarbons that partition to the SPME fiber are then quantified using a simple GC/FID procedure. Research conducted to support the development and initial validation of this method will be presented. Results suggest that BPH analyses provide a promising, cost-effective approach for predicting the ecotoxicity of environmental samples contaminated with hydrocarbon mixtures. Consequently, BPH analyses may provide a valuable analytical screening tool for ecotoxicity assessment in product and effluent testing, environmental monitoring and site remediation applications.

  3. Tool-life prediction under multi-cycle loading during metal forming: a feasibility study

    Directory of Open Access Journals (Sweden)

    Hu Yiran

    2015-01-01

Full Text Available In the present research, the friction and wear behaviour of a hard coating were studied using ball-on-disc tests to simulate the wear process of coated tools in sheet metal forming. The evolution of the friction coefficient followed a typical dual-plateau pattern, i.e. at the initial stage of sliding the friction coefficient was relatively low, followed by a sharp increase due to the breakdown of the coating after a certain number of cyclic dynamic loadings. This phenomenon was caused by the interactive response between friction and wear in a coating tribo-system, which is often neglected by metal forming researchers: constant friction coefficient values are normally used in finite element (FE) simulations to represent the complex tribological nature of the contact interfaces. Meanwhile, most current FE simulations consider single-cycle loading processes, whereas many metal-forming operations are conducted in the form of multi-cycle loading. Therefore, a novel friction/wear interactive friction model was developed to characterise, simultaneously, the evolution of the friction coefficient and the remaining thickness of the coating layer, enabling the wear life of coated tooling to be predicted. The friction model was then implemented into the FE simulation of a sheet metal forming process for a feasibility study.

  4. Tool life prediction under multi-cycle loading conditions: A feasibility study

    Directory of Open Access Journals (Sweden)

    Yuan Xi

    2015-01-01

Full Text Available In the present research, the friction and wear behaviour of a hard coating were studied using ball-on-disc tests to simulate the wear process of coated tools in sheet metal forming. The evolution of the friction coefficient followed a typical dual-plateau pattern, i.e. at the initial stage of sliding the friction coefficient was relatively low, followed by a sharp increase due to the breakdown of the coating after a certain number of cyclic dynamic loadings. This phenomenon was caused by the interactive response between friction and wear in a coating tribo-system, which has not been addressed so far by metal forming researchers: constant friction coefficient values are normally used in finite element (FE) simulations to represent the complex tribological nature of the contact interfaces. Meanwhile, most current FE simulations consider a single loading cycle, whereas most sheet metal forming operations are multi-cycle. Therefore, a novel friction/wear interactive friction model was developed to characterise, simultaneously, the evolution of the friction coefficient and the remaining thickness of the coating layer, enabling the wear life of coated tooling to be predicted. The friction model was then implemented into the FE simulation of a sheet metal forming process for a feasibility study.
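
    The friction/wear interaction described in these two records can be sketched as a minimal cycle-by-cycle model in which the friction coefficient stays on the low plateau until the coating is worn through; the wear rate, thickness, and coefficient values below are illustrative assumptions, not the calibrated model from the papers.

```python
def simulate_friction(cycles, thickness0=2.0, wear_per_cycle=0.01,
                      mu_coated=0.15, mu_breakdown=0.6):
    """Track (friction coefficient, remaining coating thickness) per cycle.

    Friction stays low while the coating survives, then jumps once the
    coating thickness is worn to zero (breakdown) - the dual-plateau pattern.
    """
    history = []
    thickness = thickness0
    for _ in range(cycles):
        mu = mu_coated if thickness > 0 else mu_breakdown
        history.append((mu, max(thickness, 0.0)))
        thickness -= wear_per_cycle
    return history

trace = simulate_friction(300)
# first cycle at which friction leaves the low plateau = predicted wear life
breakdown_cycle = next(i for i, (mu, _) in enumerate(trace) if mu > 0.15)
```

    A per-cycle model of this shape is what lets an FE simulation replace a constant friction coefficient with one that evolves over repeated forming cycles.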

  5. Computational tools for genome-wide miRNA prediction and study

    KAUST Repository

    Malas, T.B.

    2012-11-02

MicroRNAs (miRNAs) are single-stranded non-coding RNAs, usually 22 nucleotides in length, that play an important post-transcriptional regulatory role in many organisms. MicroRNAs bind via a seed sequence to the 3′-untranslated region (UTR) of the target messenger RNA (mRNA), inducing degradation or inhibition of translation and resulting in a reduction in the protein level. This regulatory mechanism is central to many biological processes, and its perturbation could lead to diseases such as cancer. Given the biological importance of miRNAs, there is a great need to identify and study their targets and functions. However, miRNAs are very difficult to clone in the lab, and this has hindered the identification of novel miRNAs. Next-generation sequencing coupled with new computational tools has recently evolved to help researchers efficiently identify large numbers of novel miRNAs. In this review, we describe recent miRNA prediction tools and discuss their properties, advantages and disadvantages. Malas and Ravasi.

  6. ProBiS tools (algorithm, database, and web servers) for predicting and modeling of biologically interesting proteins.

    Science.gov (United States)

    Konc, Janez; Janežič, Dušanka

    2017-09-01

ProBiS (Protein Binding Sites) Tools consist of an algorithm, a database, and web servers for the prediction of binding sites and protein ligands based on the detection of structurally similar binding sites in the Protein Data Bank. In this article, we review the operations that ProBiS Tools perform, comment on the evolution of the tools, and give some implementation details. We also review some of their applications to biologically interesting proteins. ProBiS Tools are freely available at http://probis.cmm.ki.si and http://probis.nih.gov. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. IPMP Global Fit - A one-step direct data analysis tool for predictive microbiology.

    Science.gov (United States)

    Huang, Lihan

    2017-12-04

The objective of this work is to develop and validate a unified optimization algorithm for performing one-step global regression analysis of isothermal growth and survival curves for determination of kinetic parameters in predictive microbiology. The algorithm is incorporated with user-friendly graphical user interfaces (GUIs) to develop a data analysis tool, the USDA IPMP-Global Fit. The GUIs are designed to guide users through the data analysis process and the selection of initial parameters for different combinations of mathematical models. The software performs one-step kinetic analysis to directly construct tertiary models by minimizing the global error between the experimental observations and the mathematical models. The current version of the software is specifically designed for constructing tertiary models with time and temperature as the independent variables. The software was tested with a total of 9 different combinations of primary and secondary models for growth and survival of various microorganisms. The results of data analysis show that this software provides accurate estimates of kinetic parameters. In addition, it can be used to improve the experimental design and data collection for more accurate estimation of kinetic parameters. IPMP-Global Fit can be used in combination with the regular USDA-IPMP for solving inverse problems and developing tertiary models in predictive microbiology. Published by Elsevier B.V.
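
    One-step global regression fits the primary and secondary models to all isothermal curves at once by minimizing a single pooled error. A minimal sketch, assuming a log-linear survival primary model log10 N(t) = c0 − k(T)·t with a linear secondary model k(T) = a + b·T; this choice makes the global problem linear and solvable in closed form via normal equations. The model forms and symbols are illustrative assumptions, not the software's actual model set.

```python
def global_fit(data):
    """One-step global least-squares fit of log10 N = c0 - (a + b*T) * t.

    data: list of (T, t, log10N) observations pooled across all temperatures.
    Returns (c0, a, b) minimizing the pooled squared error (normal equations).
    """
    # design matrix columns: [1, -t, -t*T]; solve X^T X beta = X^T y
    X = [[1.0, -t, -t * T] for T, t, _ in data]
    y = [obs for _, _, obs in data]
    A = [[sum(row[i] * row[j] for row in X) for j in range(3)] for i in range(3)]
    v = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(3)]
    for i in range(3):  # Gaussian elimination with partial pivoting
        p = max(range(i, 3), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        v[i], v[p] = v[p], v[i]
        for r in range(i + 1, 3):
            f = A[r][i] / A[i][i]
            for c in range(i, 3):
                A[r][c] -= f * A[i][c]
            v[r] -= f * v[i]
    beta = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):  # back substitution
        beta[i] = (v[i] - sum(A[i][c] * beta[c] for c in range(i + 1, 3))) / A[i][i]
    return tuple(beta)

# synthetic isothermal curves at two temperatures, generated from known parameters
data = [(T, float(t), 8.0 - (0.05 + 0.002 * T) * t)
        for T in (50.0, 60.0) for t in range(6)]
c0, a, b = global_fit(data)  # recovers the generating parameters
```

    Fitting all curves in one step, rather than fitting each isothermal curve separately and then regressing the rates, is what the abstract means by directly constructing a tertiary model.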

  8. Reducing Risky Security Behaviours: Utilising Affective Feedback to Educate Users

    Directory of Open Access Journals (Sweden)

    Lynsay A. Shepherd

    2014-11-01

    Full Text Available Despite the number of tools created to help end-users reduce risky security behaviours, users are still falling victim to online attacks. This paper proposes a browser extension utilising affective feedback to provide warnings on detection of risky behaviour. The paper provides an overview of behaviour considered to be risky, explaining potential threats users may face online. Existing tools developed to reduce risky security behaviours in end-users have been compared, discussing the success rates of various methodologies. Ongoing research is described which attempts to educate users regarding the risks and consequences of poor security behaviour by providing the appropriate feedback on the automatic recognition of risky behaviour. The paper concludes that a solution utilising a browser extension is a suitable method of monitoring potentially risky security behaviour. Ultimately, future work seeks to implement an affective feedback mechanism within the browser extension with the aim of improving security awareness.

  9. Same admissions tools, different outcomes: a critical perspective on predictive validity in three undergraduate medical schools.

    Science.gov (United States)

    Edwards, Daniel; Friedman, Tim; Pearce, Jacob

    2013-12-27

Admission to medical school is one of the most highly competitive entry points in higher education. Considerable investment is made by universities to develop selection processes that aim to identify the most appropriate candidates for their medical programs. This paper explores data from three undergraduate medical schools to offer a critical perspective of predictive validity in medical admissions. This study examined 650 undergraduate medical students from three Australian universities as they progressed through the initial years of medical school (accounting for approximately 25 per cent of all commencing undergraduate medical students in Australia in 2006 and 2007). Admissions criteria (aptitude test score based on UMAT, school result and interview score) were correlated with GPA over four years of study. Standard regression of each of the three admissions variables on GPA, for each institution at each year level, was also conducted. Overall, the data showed positive correlations between performance in medical school, school achievement and UMAT, but not interview. However, substantial differences between schools, across year levels, and within sections of UMAT were exposed. Despite this, each admission variable was shown to add towards explaining course performance, net of other variables. The findings suggest the strength of multiple admissions tools in predicting outcomes of medical students. However, they also highlight the large differences in outcomes achieved by different schools, thus emphasising the pitfalls of generalising results from predictive validity studies without recognising the diverse ways in which they are designed and the variation in the institutional contexts in which they are administered. The assumption that high positive correlations are desirable (or even expected) in these studies is also problematised.

  10. Landscape capability models as a tool to predict fine-scale forest bird occupancy and abundance

    Science.gov (United States)

    Loman, Zachary G.; DeLuca, William; Harrison, Daniel J.; Loftin, Cynthia S.; Rolek, Brian W.; Wood, Petra B.

    2018-01-01

Context: Species-specific models of landscape capability (LC) can inform landscape conservation design. Landscape capability is “the ability of the landscape to provide the environment […] and the local resources […] needed for survival and reproduction […] in sufficient quantity, quality and accessibility to meet the life history requirements of individuals and local populations.” Landscape capability incorporates species’ life histories, ecologies, and distributions to model habitat for current and future landscapes and climates as a proactive strategy for conservation planning. Objectives: We tested the ability of a set of LC models to explain variation in point occupancy and abundance for seven bird species representative of spruce-fir, mixed conifer-hardwood, and riparian and wooded wetland macrohabitats. Methods: We compiled point count data sets used for biological inventory, species monitoring, and field studies across the northeastern United States to create an independent validation data set. Our validation explicitly accounted for underestimation in validation data using joint distance and time removal sampling. Results: Blackpoll warbler (Setophaga striata), wood thrush (Hylocichla mustelina), and Louisiana (Parkesia motacilla) and northern waterthrush (P. noveboracensis) models were validated as predicting variation in abundance, although this varied from not biologically meaningful (1%) to strongly meaningful (59%). We verified all seven species models [including ovenbird (Seiurus aurocapilla), blackburnian (Setophaga fusca) and cerulean warbler (Setophaga cerulea)], as all were positively related to occupancy data. Conclusions: LC models represent a useful tool for conservation planning owing to their predictive ability over a regional extent. As improved remote-sensed data become available, LC layers are updated, which will improve predictions.

  11. SRUNs - sustainable resource utilisation networks for regions

    International Nuclear Information System (INIS)

    Niemetz, N.

    2015-01-01

Nowadays it cannot be denied that fossil resources will approach or pass their maximum global production rate within the 21st century. In addition to these resource constraints, climate change has to be considered in parallel, requiring a drastic reduction in carbon emissions. These two trends clearly show that a fundamental shift is needed within the next decades, from fossil towards renewable resources. This transition gives rise to a change in supply chains: while fossil fuels are typically exploited from point sources, nearly all renewable resources depend, either directly or indirectly, on solar radiation, and area is required for their provision. This poses a new challenge for the political, economic and social actors who decide about land use. Within this thesis a conceptual framework of so-called SRUNs – sustainable resource utilisation networks for regions – is developed. Regions have a responsibility to provide goods and services for society within sustainable networks, and they bring the spatial dimension into consideration as well. The way these networks are constructed is described in detail, covering spatial planning, the stakeholder process, drivers and barriers, as well as the elements and features of SRUNs. Using Process Network Synthesis (PNS) as an optimisation tool, the economic optimum of a network can be found and different scenarios compared. To show the ecological pressure of an established network, an evaluation with the Sustainable Process Index (SPI) is carried out. Both computer tools are described and their application is shown in several case studies, which demonstrate the versatility of the methods in practical implementation and application. Decision support tools offer regional actors the possibility to analyse their region and to get a feeling for SRUNs. These tools provide an insight into the changes which are needed to manage the shift towards a low-carbon and sustainable society. (author)

  12. Benefits for wind energy in electricity markets from using short term wind power prediction tools: a simulation study

    International Nuclear Information System (INIS)

    Usaola, J.; Ravelo, O.; Gonzalez, G.; Soto, F.; Davila, M.C.; Diaz-Guerra, B.

    2004-01-01

One of the characteristics of wind energy, from the grid point of view, is its non-dispatchability, i.e. generation cannot be ordered, hence integration in electrical networks may be difficult. Short-term wind power prediction tools could make this integration easier, either through their use by the grid System Operator, or by promoting the participation of wind farms in the electricity markets, with prediction tools used to make their bids in the market. In this paper, the importance of a short-term wind power prediction tool for the participation of wind energy systems in electricity markets is studied. Simulations, according to the current Spanish market rules, have been performed on the production of different wind farms, with different degrees of accuracy in the prediction tool. It may be concluded that income from participation in electricity markets is increased by using a short-term wind power prediction tool of average accuracy: improved forecasts both marginally increase income and reduce the impact on system operation. (author)

13. Utilisation of factorial experiments for the UV/H2O2 process in a batch ...

    African Journals Online (AJOL)

    drinie

    2001-10-04

    Oct 4, 2001 ... The predictions given by the factorial experiments model were confirmed by the ... studies have given the effect of initial H2O2 concentration, initial concentration of the ... This mathematical model may be utilised to explain.

  14. Validity of a simple Internet-based outcome-prediction tool in patients with total hip replacement: a pilot study.

    Science.gov (United States)

    Stöckli, Cornel; Theiler, Robert; Sidelnikov, Eduard; Balsiger, Maria; Ferrari, Stephen M; Buchzig, Beatus; Uehlinger, Kurt; Riniker, Christoph; Bischoff-Ferrari, Heike A

    2014-04-01

    We developed a user-friendly Internet-based tool for patients undergoing total hip replacement (THR) due to osteoarthritis to predict their pain and function after surgery. In the first step, the key questions were identified by statistical modelling in a data set of 375 patients undergoing THR. Based on multiple regression, we identified the two most predictive WOMAC questions for pain and the three most predictive WOMAC questions for functional outcome, while controlling for comorbidity, body mass index, age, gender and specific comorbidities relevant to the outcome. In the second step, a pilot study was performed to validate the resulting tool against the full WOMAC questionnaire among 108 patients undergoing THR. The mean difference between observed (WOMAC) and model-predicted value was -1.1 points (95% confidence interval, CI -3.8, 1.5) for pain and -2.5 points (95% CI -5.3, 0.3) for function. The model-predicted value was within 20% of the observed value in 48% of cases for pain and in 57% of cases for function. The tool demonstrated moderate validity, but performed weakly for patients with extreme levels of pain and extreme functional limitations at 3 months post surgery. This may have been partly due to early complications after surgery. However, the outcome-prediction tool may be useful in helping patients to become better informed about the realistic outcome of their THR.
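
    The validation statistics reported here (the mean difference between model-predicted and observed WOMAC values, and the share of predictions within 20% of the observed value) can be sketched as follows; the sample scores are hypothetical.

```python
def agreement(observed, predicted):
    """Mean (predicted - observed) difference and share within 20% of observed."""
    n = len(observed)
    mean_diff = sum(p - o for o, p in zip(observed, predicted)) / n
    within_20 = sum(1 for o, p in zip(observed, predicted)
                    if abs(p - o) <= 0.2 * abs(o)) / n
    return mean_diff, within_20

# hypothetical observed vs model-predicted WOMAC scores
mean_diff, within_20 = agreement([10.0, 20.0, 50.0], [11.0, 30.0, 49.0])
```

    A mean difference near zero indicates low bias, while the within-20% share captures how often an individual patient's prediction is usefully close.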

  15. The car parking used as control tool of individual motor traffic. Good practices of european towns; Le stationnement utilise comme outil de regulation des deplacements individuels motorises. Bonnes pratiques de villes europeennes

    Energy Technology Data Exchange (ETDEWEB)

    Cahn, M; Vallar, J P

    2001-07-01

This study aims to identify and present significant actions of European towns in the domain of local parking policy as a control tool for motor traffic. Some cases are presented to illustrate the study, and six axes of action have been identified: parking restriction measures to protect the town center and encourage people to use other transport systems; urban area regulations; initiatives in small towns; parking tariffs; assistance to disabled persons; and actions realized in outlying areas. (A.L.B.)

  16. Evaluation of professional practices and quality management: use of some tools in nuclear medicine; Evaluation des pratiques professionnelles et demarche qualite: analyse de quelques exemples d'outils utilisables en medecine nucleaire

    Energy Technology Data Exchange (ETDEWEB)

    Israel, J.M. [GIE scintigraphie de la Plaine-de-France, 93 - Tremblay-en-France (France)

    2009-02-15

The drive for continuous quality improvement is an obligation in all French health structures, and departments of nuclear medicine have to take part in it, particularly through the evaluation of professional practices (E.P.P.). This work aims to show how simple and classical tools (process analysis, Ishikawa diagrams, the '5M' method, logigrams, EFQM or Shortell methods) can be used to drive an E.P.P. according to the rules proposed by the 'Haute Autorite de sante'. We try to show how these tools can be used to analyse and solve practical as well as managerial problems. For example, we set out a process analysis of nuclear medicine acts at several levels of discrimination. Lastly, we discuss some classical problems in the application of the E.P.P. The aim of this article is not to offer ready-made solutions, but only to show what is possible with these user-friendly tools. (author)

  17. Effectiveness of Cooperative Learning Instructional Tools With Predict-Observe-Explain Strategy on the Topic of Cuboid and Cube Volume

    Science.gov (United States)

    Nurhuda; Lukito, A.; Masriyah

    2018-01-01

This study aims to develop instructional tools and implement them to assess their effectiveness. The method used in this research referred to Designing Effective Instruction. Experimental research with a two-group pretest-posttest design was conducted. The instructional tools developed support a cooperative learning model with a predict-observe-explain strategy on the topic of cuboid and cube volume, and consist of lesson plans, POE tasks, and tests. The instructional tools were of good quality by the criteria of validity, practicality, and effectiveness, and were very effective for teaching the volume of cuboids and cubes. The cooperative instructional tools with the predict-observe-explain (POE) strategy were of good quality because teachers could easily implement the learning steps, students could easily understand the material, and students’ learning outcomes reached classical completeness. Learning with these instructional tools was effective because the learning activities were appropriate and students were very active. Students’ learning outcomes reached classical completeness and were better than with conventional learning. This study produced a good instructional tool that was effective in learning. Therefore, these instructional tools can be used as an alternative for teaching the volume of cuboids and cubes.

  18. Tool Sequence Trends in Minimally Invasive Surgery: Statistical Analysis and Implications for Predictive Control of Multifunction Instruments

    Directory of Open Access Journals (Sweden)

    Carl A. Nelson

    2012-01-01

Full Text Available This paper presents an analysis of 67 minimally invasive surgical procedures covering 11 different procedure types to determine patterns of tool use. A new graph-theoretic approach was taken to organize and analyze the data. Through grouping surgeries by type, trends of common tool changes were identified. Using the concept of signal/noise ratio, these trends were found to be statistically strong. The tool-use trends were used to generate tool placement patterns for modular (multi-tool, cartridge-type) surgical tool systems, and the same 67 surgeries were numerically simulated to determine the optimality of these tool arrangements. The results indicate that aggregated tool-use data (by procedure type) can be employed to predict tool-use sequences with good accuracy, and also indicate the potential for artificial intelligence as a means of preoperative and/or intraoperative planning. Furthermore, this suggests that the use of multifunction surgical tools can be optimized to streamline surgical workflow.
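
    Aggregating tool-change sequences by procedure type and predicting the next tool from observed transition frequencies can be sketched as a first-order Markov model over the tool-change graph; the tool names and sequences below are hypothetical.

```python
from collections import Counter, defaultdict

def build_transitions(sequences):
    """Count tool-to-tool transitions across a set of procedures."""
    transitions = defaultdict(Counter)
    for seq in sequences:
        for current, following in zip(seq, seq[1:]):
            transitions[current][following] += 1
    return transitions

def predict_next(transitions, current_tool):
    """Most frequently observed successor of the current tool, or None."""
    if not transitions[current_tool]:
        return None
    return transitions[current_tool].most_common(1)[0][0]

# hypothetical tool-use sequences for one procedure type
procedures = [
    ["grasper", "scissors", "grasper", "clip", "scissors"],
    ["grasper", "scissors", "clip", "scissors"],
]
transitions = build_transitions(procedures)
```

    Preloading a multi-tool cartridge in the order of the most likely transitions is one way such aggregated statistics could streamline workflow.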

  19. Analysis of the uranium price predicted to 24 months, implementing neural networks and the Monte Carlo method like predictive tools

    International Nuclear Information System (INIS)

    Esquivel E, J.; Ramirez S, J. R.; Palacios H, J. C.

    2011-11-01

The present work shows predicted prices of uranium, obtained using a neural network. Predicting the financial indexes of an energy resource allows budgetary measures to be established, as well as the costs of the resource over the medium term. Uranium is one of the main energy-generating fuels and, as such, its price figures prominently in financial analyses; predictive methods are therefore used to obtain an outline of its expected financial behaviour over a given period. In this study, two methodologies are used for the prediction of the uranium price: the Monte Carlo method and neural networks. These methods allow the monthly price indexes to be predicted for a two-year period, starting from the second bimester of 2011. For the prediction, uranium prices recorded since 2005 are used. (Author)
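
    The Monte Carlo side of such a study can be sketched as repeated simulation of monthly price paths; the multiplicative random-walk form, drift, and volatility below are illustrative assumptions, not the paper's calibration to the 2005-2011 price record.

```python
import random

def simulate_paths(price0, months=24, n_paths=1000, drift=0.002,
                   volatility=0.05, seed=42):
    """Monte Carlo monthly price paths with multiplicative random-walk steps."""
    rng = random.Random(seed)
    paths = []
    for _ in range(n_paths):
        price = price0
        path = []
        for _ in range(months):
            price *= 1.0 + drift + volatility * rng.gauss(0.0, 1.0)
            path.append(price)
        paths.append(path)
    return paths

def monthly_mean(paths):
    """Average simulated price for each future month across all paths."""
    months = len(paths[0])
    return [sum(path[m] for path in paths) / len(paths) for m in range(months)]

# hypothetical starting price, 24-month horizon as in the study
forecast = monthly_mean(simulate_paths(60.0, n_paths=200))
```

    The per-month averages (or percentiles) over many simulated paths are what would be reported as the 24-month price outlook.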

  20. Augmenting Predictive Modeling Tools with Clinical Insights for Care Coordination Program Design and Implementation.

    Science.gov (United States)

Johnson, Tracy L; Brewer, Daniel; Estacio, Raymond; Vlasimsky, Tara; Durfee, Michael J; Thompson, Kathy R; Everhart, Rachel M; Rinehart, Deborah J; Batal, Holly

    2015-01-01

    The Center for Medicare and Medicaid Innovation (CMMI) awarded Denver Health's (DH) integrated, safety net health care system $19.8 million to implement a "population health" approach into the delivery of primary care. This major practice transformation builds on the Patient Centered Medical Home (PCMH) and Wagner's Chronic Care Model (CCM) to achieve the "Triple Aim": improved health for populations, care to individuals, and lower per capita costs. This paper presents a case study of how DH integrated published predictive models and front-line clinical judgment to implement a clinically actionable, risk stratification of patients. This population segmentation approach was used to deploy enhanced care team staff resources and to tailor care-management services to patient need, especially for patients at high risk of avoidable hospitalization. Developing, implementing, and gaining clinical acceptance of the Health Information Technology (HIT) solution for patient risk stratification was a major grant objective. In addition to describing the Information Technology (IT) solution itself, we focus on the leadership and organizational processes that facilitated its multidisciplinary development and ongoing iterative refinement, including the following: team composition, target population definition, algorithm rule development, performance assessment, and clinical-workflow optimization. We provide examples of how dynamic business intelligence tools facilitated clinical accessibility for program design decisions by enabling real-time data views from a population perspective down to patient-specific variables. We conclude that population segmentation approaches that integrate clinical perspectives with predictive modeling results can better identify high opportunity patients amenable to medical home-based, enhanced care team interventions.

1. Thermogravimetric analysis coupled with chemometrics as a powerful predictive tool for β-thalassemia screening.

    Science.gov (United States)

    Risoluti, Roberta; Materazzi, Stefano; Sorrentino, Francesco; Maffei, Laura; Caprari, Patrizia

    2016-10-01

β-Thalassemia is a genetic hemoglobin disorder characterized by absent or reduced synthesis of the β-globin chain, one of the constituents of the adult hemoglobin tetramer. In this study the possibility of using thermogravimetric analysis (TGA) followed by chemometrics as a new approach for β-thalassemia detection is proposed. Blood samples from patients with β-thalassemia were analyzed with the TG7 thermobalance and the resulting curves were compared to those typical of healthy individuals. Principal Component Analysis (PCA) was used to evaluate the correlation between the hematological parameters and the thermogravimetric results. The thermogravimetric profiles of blood samples from β-thalassemia patients were clearly distinct from those of healthy individuals as a result of the different quantities of water content and corpuscular fraction. The hematological overview showed significant decreases in the values of red blood cell indices and an increase in red cell distribution width in thalassemia subjects when compared with healthy subjects. A predictive model for β-thalassemia diagnosis based on Partial Least Squares Discriminant Analysis (PLS-DA) was implemented and validated. This model permitted the discrimination of anemic patients from healthy individuals and was able to detect thalassemia in clinically heterogeneous patients, such as those with δβ-thalassemia and β-thalassemia combined with Hb Lepore. TGA and chemometrics are capable of predicting β-thalassemia syndromes using only a few microliters of blood, without any pretreatment, and within an hour of analysis time. A fast and cost-effective diagnostic tool for β-thalassemia screening is proposed. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. Creating pharmacy staffing-to-demand models: predictive tools used at two institutions.

    Science.gov (United States)

    Krogh, Paul; Ernster, Jason; Knoer, Scott

    2012-09-15

    The creation and implementation of data-driven staffing-to-demand models at two institutions are described. Predictive workload tools provide a guideline for pharmacy managers to adjust staffing needs based on hospital volume metrics. At Abbott Northwestern Hospital, management worked with the department's staff and labor management committee to clearly outline the productivity monitoring system and the process for reducing hours. Reference charts describing the process for reducing hours and a form to track the hours of involuntary reductions for each employee were created to further enhance communication, explain the rationale behind the new process, and promote transparency. The University of Minnesota Medical Center-Fairview found a strong correlation between measured pharmacy workload and an adjusted census formula. If the daily census and admission report indicate that the adjusted census will provide enough workload for the fully staffed department, no further action is needed. If the census report indicates the adjusted census is less than the breakeven point, staff members are asked to leave work, either voluntarily or involuntarily. The opposite holds true for days when the adjusted census is higher than the breakeven point, at which time additional staff are required to synchronize worked hours with predicted workload. Successful staffing-to-demand models were implemented in two hospital pharmacies. Financial savings, as indicated by decreased labor costs secondary to reduction of staffed shifts, were approximately $42,000 and $45,500 over a three-month period for Abbott Northwestern Hospital and the University of Minnesota Medical Center-Fairview, respectively. Maintenance of 100% productivity allowed the departments to continue to replace vacant positions and avoid permanent staff reductions.
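    The decision rule described above (compare the adjusted census against the breakeven point, then cut, add, or hold staff) can be sketched as follows; the function name and return strings are illustrative, not part of either hospital's actual system:

```python
# Minimal sketch of the staffing-to-demand decision rule described in the
# abstract: adjusted census below breakeven -> reduce hours; above -> add
# staff; at breakeven -> no action. Names and return values are illustrative.
def staffing_action(adjusted_census: float, breakeven: float) -> str:
    if adjusted_census < breakeven:
        return "reduce staff hours"   # voluntary or involuntary reductions
    if adjusted_census > breakeven:
        return "add staff"            # synchronize worked hours with workload
    return "no action"
```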

  3. Efficacy of a tool to predict short-term mortality in older people presenting at emergency departments: Protocol for a multi-centre cohort study.

    Science.gov (United States)

    Cardona, Magnolia; Lewis, Ebony T; Turner, Robin M; Alkhouri, Hatem; Asha, Stephen; Mackenzie, John; Perkins, Margaret; Suri, Sam; Holdgate, Anna; Winoto, Luis; Chang, Chan-Wei; Gallego-Luxan, Blanca; McCarthy, Sally; Kristensen, Mette R; O'Sullivan, Michael; Skjøt-Arkil, Helene; Ekmann, Anette A; Nygaard, Hanne H; Jensen, Jonas J; Jensen, Rune O; Pedersen, Jonas L; Breen, Dorothy; Petersen, John A; Jensen, Birgitte N; Mogensen, Christian Backer; Hillman, Ken; Brabrand, Mikkel

    Prognostic uncertainty inhibits clinicians from initiating timely end-of-life discussions and advance care planning. This study evaluates the efficacy of the CriSTAL (Criteria for Screening and Triaging to Appropriate aLternative care) checklist in emergency departments. Prospective cohort study of patients aged ≥65 years with any diagnosis admitted via emergency departments in ten hospitals in Australia, Denmark and Ireland. Electronic and paper clinical records will be used to extract risk factors such as nursing home residency, physiological deterioration warranting a rapid response call, personal history of active chronic disease, history of hospitalisations or intensive care unit admission in the past year, evidence of proteinuria or ECG abnormalities, and evidence of frailty, measured concurrently with the Fried Score and the Clinical Frailty Scale. Patients or their informal caregivers will be contacted by telephone around three months after initial assessment to ascertain survival, self-reported health, post-discharge frailty and health service utilisation since discharge. Logistic regression, bootstrapping techniques and area under the ROC curve (AUROC) analyses will be used to test the predictive accuracy of CriSTAL for death within 90 days of admission and for in-hospital death. The CriSTAL checklist is an objective and practical tool for use in emergency departments among older patients to determine an individual's probability of death in the short term. Its validation in this cohort is expected to reduce clinicians' prognostic uncertainty about patients' time to death and encourage timely end-of-life conversations to support clinical decisions with older frail patients and their families about their imminent or future care choices. Copyright © 2018 Elsevier B.V. All rights reserved.

  4. piRNA analysis framework from small RNA-Seq data by a novel cluster prediction tool - PILFER.

    Science.gov (United States)

    Ray, Rishav; Pandey, Priyanka

    2017-12-19

    With the increasing number of studies focusing on PIWI-interacting RNAs (piRNAs), it is now pertinent to develop efficient tools dedicated to piRNA analysis. We have developed a novel cluster prediction tool called PILFER (PIrna cLuster FindER), which can accurately predict piRNA clusters from small RNA sequencing data. PILFER is an open-source, easy-to-use tool that can be executed even on a personal computer with minimal resources. It uses a sliding-window mechanism, integrating the expression of the reads with their spatial information to predict piRNA clusters. We have additionally defined a piRNA analysis pipeline incorporating PILFER to detect and annotate piRNAs and their clusters from raw small RNA sequencing data, and implemented it on publicly available data from healthy germline and somatic tissues. We compared PILFER with other existing piRNA cluster prediction tools and found it to be statistically more accurate and superior in several respects, including higher cluster robustness and greater memory efficiency. Overall, PILFER provides a fast and accurate solution to piRNA cluster prediction. Copyright © 2017 Elsevier Inc. All rights reserved.
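    A sliding-window cluster caller that combines read expression with genomic position, in the spirit of (but not identical to) PILFER's algorithm, might look like the following minimal sketch; the window size and expression threshold are illustrative assumptions:

```python
# Illustrative sliding-window cluster caller: a window is anchored at each
# read start; if the summed expression of reads inside the window exceeds a
# threshold, the window is called a cluster, and overlapping calls are merged.
# This is a sketch of the general technique, not PILFER's exact algorithm.
def find_clusters(reads, window=1000, min_expression=50):
    """reads: list of (position, expression) tuples on one chromosome."""
    reads = sorted(reads)
    clusters = []
    for i, (start, _) in enumerate(reads):
        # total expression of reads falling inside [start, start + window)
        total = sum(expr for pos, expr in reads[i:] if pos < start + window)
        if total >= min_expression:
            end = start + window
            if clusters and start <= clusters[-1][1]:
                # overlapping call: extend the previous cluster
                clusters[-1] = (clusters[-1][0], end)
            else:
                clusters.append((start, end))
    return clusters
```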

  5. Prediction of the maximum absorption wavelength of azobenzene dyes by QSPR tools

    Science.gov (United States)

    Xu, Xuan; Luan, Feng; Liu, Huitao; Cheng, Jianbo; Zhang, Xiaoyun

    2011-12-01

    The maximum absorption wavelength (λmax) of a large data set of 191 azobenzene dyes was predicted by quantitative structure-property relationship (QSPR) tools. The λmax was correlated with four molecular descriptors calculated from the structure of the dyes alone. The multiple linear regression (MLR) method and the non-linear radial basis function neural network (RBFNN) method were applied to develop the models. The statistical parameters provided by the MLR model were R² = 0.893, R²adj = 0.893, q²LOO = 0.884, F = 1214.871, RMS = 11.6430 for the training set; and R² = 0.849, R²adj = 0.845, q²ext = 0.846, F = 207.812, RMS = 14.0919 for the external test set. The RBFNN model gave further improved statistical results: R² = 0.920, R²adj = 0.919, q²LOO = 0.898, F = 1664.074, RMS = 9.9215 for the training set, and R² = 0.895, R²adj = 0.892, q²ext = 0.895, F = 314.256, RMS = 11.6427 for the external test set. This theoretical method provides a simple and precise alternative way to obtain the λmax of azobenzene dyes.
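    As a reminder of how the quoted fit statistics are defined, the sketch below fits an MLR model of λmax on four descriptors by least squares and computes R² and RMS. The data are random stand-ins, not the 191-dye set; coefficients and noise level are illustrative assumptions:

```python
import numpy as np

# Illustrative MLR fit: lambda_max modelled as a linear function of four
# molecular descriptors. Descriptor values, true coefficients and noise are
# random stand-ins for the paper's 191-dye data set.
rng = np.random.default_rng(0)
X = rng.normal(size=(191, 4))                       # four descriptors per dye
y = X @ np.array([3.0, -2.0, 1.5, 0.5]) + rng.normal(scale=0.1, size=191)

# least-squares fit with an intercept column
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef

# R^2 and root-mean-square error, as reported in the abstract
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
rms = np.sqrt(np.mean((y - pred) ** 2))
```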

  6. Moduli dynamics as a predictive tool for thermal maximally supersymmetric Yang-Mills at large N

    Energy Technology Data Exchange (ETDEWEB)

    Morita, Takeshi [Department of Physics, Shizuoka University,836 Ohya, Suruga-ku, Shizuoka 422-8529 (Japan); Department of Physics and Astronomy, University of Kentucky,Lexington, KY 40506 (United States); Shiba, Shotaro [Maskawa Institute for Science and Culture, Kyoto Sangyo University,Kamigamo-Motoyama, Kita-ku, Kyoto 603-8555 (Japan); Wiseman, Toby [Theoretical Physics Group, Blackett Laboratory, Imperial College,Exhibition Road, London SW7 2AZ (United Kingdom); Withers, Benjamin [Mathematical Sciences and STAG Research Centre, University of Southampton,Highfield, Southampton SO17 1BJ (United Kingdom)

    2015-07-09

    Maximally supersymmetric (p+1)-dimensional Yang-Mills theory at large N and finite temperature, with possibly compact spatial directions, has a rich phase structure. Strongly coupled phases may have holographic descriptions as black branes in various string duality frames, or there may be no gravity dual. In this paper we provide tools in the gauge theory which give a simple and unified picture of the various strongly coupled phases, and transitions between them. Building on our previous work we consider the effective theory describing the moduli of the gauge theory, which can be computed precisely when it is weakly coupled far out on the Coulomb branch. Whilst for perturbation theory naive extrapolation from weak coupling to strong gives little information, for this moduli theory naive extrapolation from its weakly to its strongly coupled regime appears to encode a surprising amount of information about the various strongly coupled phases. We argue it encodes not only the parametric form of thermodynamic quantities for these strongly coupled phases, but also certain transcendental factors with a geometric origin, and allows one to deduce transitions between the phases. We emphasise it also gives predictions for the behaviour of other observables in these phases.

  7. New tools and new ideas for HR practitioners. Structural and predictive validity of weighted satisfaction questionnaire

    Directory of Open Access Journals (Sweden)

    Lorenzo Revuelto Taboada

    2012-12-01

    Full Text Available One of the fundamental tasks for a Human Resource Management (HRM) practitioner consists in designing a reward system that can be broadly understood and can influence the attitudes and, subsequently, the behavior of individuals so as to permit achievement of organizational objectives. To do so, appropriate tools are necessary to identify the key actions for motivating employees, thereby avoiding the opportunity costs derived from allocating resources to closing satisfaction gaps on factors that are not priorities for workers in satisfying their own personal needs. This article, thus, presents a dual assessment scale consisting of 44 items, categorized into six dimensions, which firstly evaluates the motivational importance of each of the 44 factors considered and, secondly, the level of satisfaction with the current situation for each one. Using a sample of 801 individuals, we analyzed the internal consistency, face validity, and predictive validity of the measuring scales, obtaining a series of results that were, to say the least, promising.

  8. JASSA: a comprehensive tool for prediction of SUMOylation sites and SIMs.

    Science.gov (United States)

    Beauclair, Guillaume; Bridier-Nahmias, Antoine; Zagury, Jean-François; Saïb, Ali; Zamborlini, Alessia

    2015-11-01

    Post-translational modification by the Small Ubiquitin-like Modifier (SUMO) proteins, a process termed SUMOylation, is involved in many fundamental cellular processes. SUMO proteins are conjugated to a protein substrate, creating an interface for the recruitment of cofactors harboring SUMO-interacting motifs (SIMs). Mapping both SUMO-conjugation sites and SIMs is required to study the functional consequences of SUMOylation. To define the best candidate sites for experimental validation we designed JASSA, a Joint Analyzer of SUMOylation sites and SIMs. JASSA is a predictor that uses a scoring system based on a Position Frequency Matrix derived from the alignment of experimental SUMOylation sites or SIMs. Compared with existing web tools, JASSA performs on par or better. Novel features were implemented towards a better evaluation of the prediction, including identification of database hits matching the query sequence and representation of candidate sites within the secondary structural elements and/or the 3D fold of the protein of interest, retrievable from deposited PDB files. JASSA is freely accessible at http://www.jassa.fr/. The website is implemented in PHP and MySQL, with all major browsers supported. guillaume.beauclair@inserm.fr Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
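    Position Frequency Matrix scoring of the kind JASSA builds on can be sketched as follows. This is an illustrative log-odds-style scorer over aligned training sites, not JASSA's exact scoring system; the pseudocount and example sequences are assumptions:

```python
import math

# Sketch of Position Frequency Matrix (PFM) scoring: count residues per
# position across aligned known sites, then score a candidate by summing
# log-frequencies. Illustrative only -- not JASSA's exact scoring system.
def build_pfm(aligned_sites):
    """aligned_sites: equal-length sequences of experimentally known sites."""
    length = len(aligned_sites[0])
    pfm = [{} for _ in range(length)]
    for seq in aligned_sites:
        for i, aa in enumerate(seq):
            pfm[i][aa] = pfm[i].get(aa, 0) + 1
    return pfm

def score_site(pfm, candidate, pseudocount=0.5):
    n = sum(pfm[0].values())  # number of training sequences
    # sum of log per-position frequencies, smoothed by a pseudocount
    return sum(math.log((pfm[i].get(aa, 0) + pseudocount) / (n + pseudocount))
               for i, aa in enumerate(candidate))
```

A motif-like candidate scores higher than an unrelated sequence, which is the basis for ranking sites for experimental validation.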

  9. An Approach to Flooding Inundation Combining the Streamflow Prediction Tool (SPT) and Downscaled Soil Moisture

    Science.gov (United States)

    Cotterman, K. A.; Follum, M. L.; Pradhan, N. R.; Niemann, J. D.

    2017-12-01

    Flooding impacts numerous aspects of society, from localized flash floods to continental-scale flood events. Many numerical flood models focus solely on riverine flooding, with some capable of capturing both localized and continental-scale flood events. However, these models neglect flooding away from channels that are related to excessive ponding, typically found in areas with flat terrain and poorly draining soils. In order to obtain a holistic view of flooding, we combine flood results from the Streamflow Prediction Tool (SPT), a riverine flood model, with soil moisture downscaling techniques to determine if a better representation of flooding is obtained. This allows for a more holistic understanding of potential flood prone areas, increasing the opportunity for more accurate warnings and evacuations during flooding conditions. Thirty-five years of near-global historical streamflow is reconstructed with continental-scale flow routing of runoff from global land surface models. Elevation data was also obtained worldwide, to establish a relationship between topographic attributes and soil moisture patterns. Derived soil moisture data is validated against observed soil moisture, increasing confidence in the ability to accurately capture soil moisture patterns. Potential flooding situations can be examined worldwide, with this study focusing on the United States, Central America, and the Philippines.

  10. Predictive validity of the post-enrolment English language assessment tool for commencing undergraduate nursing students.

    Science.gov (United States)

    Glew, Paul J; Hillege, Sharon P; Salamonson, Yenna; Dixon, Kathleen; Good, Anthony; Lombardo, Lien

    2015-12-01

    Nursing students with English as an additional language (EAL) may underperform academically. The post-enrolment English language assessment (PELA) is used in literacy support, but its predictive validity in identifying those at risk of underperformance remains unknown. To validate a PELA as a predictor of academic performance. Prospective survey design. The study was conducted at a university located in culturally and linguistically diverse areas of western Sydney, Australia. Commencing undergraduate nursing students who were Australian-born (n=1323, 49.6%) and born outside of Australia (n=1346, 50.4%) were recruited for this study. The 2669 (67% of 3957) participants provided consent and completed a first year nursing unit that focussed on developing literacy skills. Between 2010 and 2013, commencing students completed the PELA and the English language acculturation scale (ELAS), a previously validated instrument. The grading levels of the PELA tool were: Level 1 (proficient), Level 2 (borderline), and Level 3 (poor, and requiring additional support). Participants with a PELA Level 2 or 3 were more likely to be: a) non-Australian-born (χ²: 520.6, df: 2, p < 0.001), or b) an EAL student (χ²: 225.6, df: 2, p < 0.001). The PELA can help identify commencing nursing students who are at risk of academic underachievement. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.

  11. A low-dimensional tool for predicting force decomposition coefficients for varying inflow conditions

    KAUST Repository

    Ghommem, Mehdi

    2013-01-01

    We develop a low-dimensional tool to predict the effects of unsteadiness in the inflow on force coefficients acting on a circular cylinder using proper orthogonal decomposition (POD) modes from steady flow simulations. The approach is based on combining POD and linear stochastic estimator (LSE) techniques. We use POD to derive a reduced-order model (ROM) to reconstruct the velocity field. To overcome the difficulty of developing a ROM using Poisson's equation, we relate the pressure field to the velocity field through a mapping function based on LSE. The use of this approach to derive force decomposition coefficients (FDCs) under unsteady mean flow from basis functions of the steady flow is illustrated. For both steady and unsteady cases, the final outcome is a representation of the lift and drag coefficients in terms of velocity and pressure temporal coefficients. Such a representation could serve as the basis for implementing control strategies or conducting uncertainty quantification. Copyright © 2013 Inderscience Enterprises Ltd.

  12. Moduli dynamics as a predictive tool for thermal maximally supersymmetric Yang-Mills at large N

    International Nuclear Information System (INIS)

    Morita, Takeshi; Shiba, Shotaro; Wiseman, Toby; Withers, Benjamin

    2015-01-01

    Maximally supersymmetric (p+1)-dimensional Yang-Mills theory at large N and finite temperature, with possibly compact spatial directions, has a rich phase structure. Strongly coupled phases may have holographic descriptions as black branes in various string duality frames, or there may be no gravity dual. In this paper we provide tools in the gauge theory which give a simple and unified picture of the various strongly coupled phases, and transitions between them. Building on our previous work we consider the effective theory describing the moduli of the gauge theory, which can be computed precisely when it is weakly coupled far out on the Coulomb branch. Whilst for perturbation theory naive extrapolation from weak coupling to strong gives little information, for this moduli theory naive extrapolation from its weakly to its strongly coupled regime appears to encode a surprising amount of information about the various strongly coupled phases. We argue it encodes not only the parametric form of thermodynamic quantities for these strongly coupled phases, but also certain transcendental factors with a geometric origin, and allows one to deduce transitions between the phases. We emphasise it also gives predictions for the behaviour of other observables in these phases.

  13. bTSSfinder: a novel tool for the prediction of promoters in Cyanobacteria and Escherichia coli

    KAUST Repository

    Shahmuradov, Ilham; Mohamad Razali, Rozaimi; Bougouffa, Salim; Radovanovic, Aleksandar; Bajic, Vladimir B.

    2016-01-01

    Results: Here, we introduce bTSSfinder, a novel tool that predicts putative promoters for five classes of σ factors in Cyanobacteria (σA, σC, σH, σG and σF) and for five classes of σ factors in E. coli (σ70, σ38, σ32, σ28 and σ24). Compared to currently available tools, bTSSfinder achieves higher accuracy (MCC = 0.86, F1-score = 0.93) than the next best tool (MCC = 0.59, F1-score = 0.79) and covers multiple classes of promoters.

  14. Dementia Population Risk Tool (DemPoRT): study protocol for a predictive algorithm assessing dementia risk in the community

    OpenAIRE

    Fisher, Stacey; Hsu, Amy; Mojaverian, Nassim; Taljaard, Monica; Huyer, Gregory; Manuel, Douglas G; Tanuseputro, Peter

    2017-01-01

    Introduction The burden of disease from dementia is a growing global concern as incidence increases dramatically with age, and average life expectancy has been increasing around the world. Planning for an ageing population requires reliable projections of dementia prevalence; however, existing population projections are simple and have poor predictive accuracy. The Dementia Population Risk Tool (DemPoRT) will predict incidence of dementia in the population setting using multivariable modellin...

  15. The hemorrhagic transformation index score: a prediction tool in middle cerebral artery ischemic stroke.

    Science.gov (United States)

    Kalinin, Mikhail N; Khasanova, Dina R; Ibatullin, Murat M

    2017-09-07

    We aimed to develop a tool, the hemorrhagic transformation index (HTI), to predict any hemorrhagic transformation (HT) within 14 days after middle cerebral artery (MCA) stroke onset, regardless of intravenous recombinant tissue plasminogen activator (IV rtPA) use. This is especially important in light of the lack of evidence-based data concerning the timing of anticoagulant resumption after stroke in patients with atrial fibrillation (AF). We retrospectively analyzed 783 consecutive MCA stroke patients. Clinical and brain imaging data at admission were recorded. The follow-up period was 2 weeks after admission. The patients were divided into derivation (DC) and validation (VC) cohorts by generating Bernoulli variates with probability parameter 0.7. Univariate/multivariate logistic regression and factor analysis were used to extract independent predictors. Validation was performed with internal consistency reliability and receiver operating characteristic (ROC) analysis. Bootstrapping was used to reduce bias. The HTI was composed of 4 items: Alberta Stroke Program Early CT score (ASPECTS), National Institutes of Health Stroke Scale (NIHSS), hyperdense MCA (HMCA) sign, and AF on electrocardiogram (ECG) at admission. According to the predicted probability (PP) range, scores were allocated to ASPECTS as follows: 10-7 = 0; 6-5 = 1; 4-3 = 2; 2-0 = 3; to NIHSS: 0-11 = 0; 12-17 = 1; 18-23 = 2; >23 = 3; to HMCA sign: yes = 1; to AF on ECG: yes = 1. The HTI score varied from 0 to 8. For each score, the adjusted PP of any HT with 95% confidence intervals (CI) was as follows: 0 = 0.027 (0.011-0.042); 1 = 0.07 (0.043-0.098); 2 = 0.169 (0.125-0.213); 3 = 0.346 (0.275-0.417); 4 = 0.571 (0.474-0.668); 5 = 0.768 (0.676-0.861); 6 = 0.893 (0.829-0.957); 7 = 0.956 (0.92-0.992); 8 = 0.983 (0.965-1.0). The optimal cutpoint score to differentiate between HT-positive and HT-negative groups was 2 (95% normal-based CI, 1-3) for the DC and VC alike. ROC area
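    The scoring rules and adjusted probabilities quoted above translate directly into a small lookup; the sketch below uses the point estimates only (95% CIs omitted) and the function names are illustrative:

```python
# HTI scoring as described in the abstract: ASPECTS and NIHSS are binned,
# HMCA sign and AF on ECG each add one point; the total maps to the reported
# adjusted predicted probability of any hemorrhagic transformation.
HT_PROBABILITY = {0: 0.027, 1: 0.07, 2: 0.169, 3: 0.346, 4: 0.571,
                  5: 0.768, 6: 0.893, 7: 0.956, 8: 0.983}

def hti_score(aspects, nihss, hmca_sign, af_on_ecg):
    # ASPECTS bins: 10-7 -> 0; 6-5 -> 1; 4-3 -> 2; 2-0 -> 3
    if aspects >= 7:   a = 0
    elif aspects >= 5: a = 1
    elif aspects >= 3: a = 2
    else:              a = 3
    # NIHSS bins: 0-11 -> 0; 12-17 -> 1; 18-23 -> 2; >23 -> 3
    if nihss <= 11:    n = 0
    elif nihss <= 17:  n = 1
    elif nihss <= 23:  n = 2
    else:              n = 3
    return a + n + int(hmca_sign) + int(af_on_ecg)

def ht_probability(score):
    """Reported adjusted predicted probability of any HT for an HTI score."""
    return HT_PROBABILITY[score]
```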

  16. Comparison of four modeling tools for the prediction of potential distribution for non-indigenous weeds in the United States

    Science.gov (United States)

    Magarey, Roger; Newton, Leslie; Hong, Seung C.; Takeuchi, Yu; Christie, Dave; Jarnevich, Catherine S.; Kohl, Lisa; Damus, Martin; Higgins, Steven I.; Miller, Leah; Castro, Karen; West, Amanda; Hastings, John; Cook, Gericke; Kartesz, John; Koop, Anthony

    2018-01-01

    This study compares four models for predicting the potential distribution of non-indigenous weed species in the conterminous U.S. The comparison focused on evaluating modeling tools and protocols as currently used for weed risk assessment or for predicting the potential distribution of invasive weeds. We used six weed species (three highly invasive and three less invasive non-indigenous species) that have been established in the U.S. for more than 75 years. The experiment involved providing non-U.S. location data to users familiar with one of the four evaluated techniques, who then developed predictive models that were applied to the United States without knowing the identity of the species or its U.S. distribution. We compared a simple GIS climate matching technique known as Proto3, a simple climate matching tool CLIMEX Match Climates, the correlative model MaxEnt, and a process model known as the Thornley Transport Resistance (TTR) model. Two experienced users ran each modeling tool except TTR, which had one user. Models were trained with global species distribution data excluding any U.S. data, and then were evaluated using the current known U.S. distribution. The influence of weed species identity and modeling tool on prevalence and sensitivity effects was compared using a generalized linear mixed model. Each modeling tool itself had a low statistical significance, while weed species alone accounted for 69.1 and 48.5% of the variance for prevalence and sensitivity, respectively. These results suggest that simple modeling tools might perform as well as complex ones in the case of predicting potential distribution for a weed not yet present in the United States. Considerations of model accuracy should also be balanced with those of reproducibility and ease of use.
More important than the choice of modeling tool is the construction of robust protocols and testing both new and experienced users under blind test conditions that approximate operational conditions.

  17. Utilisation of chemically treated coal

    International Nuclear Information System (INIS)

    Bezovska, M.

    2002-01-01

    Numerous applications of coal with a high content of humic substances are known; they are used in many branches of industry. The complex study of the composition of coal from the upper Nitra mines has directed research towards its application in the fields of ecology and agriculture. The effective sorption layers of this coal and its humic acids can trap a broad spectrum of toxic harmful substances present in industrial wastes, particularly heavy metals. A major source of humic acids is coal - the most abundant and predominant product of plant residue coalification. All ranks of coal contain humic acids, but lignite from the Novaky deposit represents the most easily available and concentrated form of humic acids. The possibilities of utilising humic acids to remove heavy metals from waste waters were studied. The residual concentrations of the investigated metals in the aqueous phase were determined by AAS. The results show that samples of coal humic acids can be used for heavy metal removal from metal solutions and from real acid mine water. Oxidised coal with a high content of humic acids and nitrogen is used in agriculture as a fertilizer. Humic acids are the active component in coal and can help to utilize nitrogen in soil almost quantitatively. The humic substances block and stabilize toxic metal residues already present in the soil. (author)

  18. Utilisation of chemically treated coal

    Directory of Open Access Journals (Sweden)

    Bežovská Mária

    2002-03-01

    Full Text Available Numerous applications of coal with a high content of humic substances are known; they are used in many branches of industry. The complex study of the composition of coal from the upper Nitra mines has directed research towards its application in the fields of ecology and agriculture. The effective sorption layers of this coal and its humic acids can trap a broad spectrum of toxic harmful substances present in industrial wastes, particularly heavy metals. A major source of humic acids is coal - the most abundant and predominant product of plant residue coalification. All ranks of coal contain humic acids, but lignite from the Nováky deposit represents the most easily available and concentrated form of humic acids. Deep oxidation of coal by HNO3 oxidation-degradation has been performed to produce water-soluble organic acids. The possibilities of utilising oxidised coal and humic acids to remove heavy metals from waste waters were studied. The residual concentrations of the investigated metals in the aqueous phase were determined by AAS. The results show that samples of oxidised coal and their humic acids can be used for heavy metal removal from metal solutions and from real acid mine water. Oxidised coal with a high content of humic acids and nitrogen is used in agriculture as a fertilizer. Humic acids are the active component in coal and help to utilize nitrogen in soil almost quantitatively. The humic substances block and stabilize toxic metal residues already present in the soil.

  19. Performance of in silico prediction tools for the classification of rare BRCA1/2 missense variants in clinical diagnostics.

    Science.gov (United States)

    Ernst, Corinna; Hahnen, Eric; Engel, Christoph; Nothnagel, Michael; Weber, Jonas; Schmutzler, Rita K; Hauke, Jan

    2018-03-27

    The use of next-generation sequencing approaches in clinical diagnostics has led to a tremendous increase in data and a vast number of variants of uncertain significance that require interpretation. Therefore, prediction of the effects of missense mutations using in silico tools has become a frequently used approach. The aim of this study was to assess the reliability of in silico prediction as a basis for clinical decision making in the context of hereditary breast and/or ovarian cancer. We tested the performance of four prediction tools (Align-GVGD, SIFT, PolyPhen-2, MutationTaster2) using a set of 236 BRCA1/2 missense variants that had previously been classified by expert committees. However, a major pitfall in the creation of a reliable evaluation set for our purpose is the generally accepted classification of BRCA1/2 missense variants using the multifactorial likelihood model, which is partially based on Align-GVGD results. To overcome this drawback we identified 161 variants whose classification is independent of any previous in silico prediction. In addition to the performance as stand-alone tools we examined the sensitivity, specificity, accuracy and Matthews correlation coefficient (MCC) of combined approaches. PolyPhen-2 achieved the lowest sensitivity (0.67), specificity (0.67), accuracy (0.67) and MCC (0.39). Align-GVGD achieved the highest values of specificity (0.92), accuracy (0.92) and MCC (0.73), but was outperformed regarding its sensitivity (0.90) by SIFT (1.00) and MutationTaster2 (1.00). All tools suffered from poor specificities, resulting in an unacceptable proportion of false positive results in a clinical setting. This shortcoming could not be bypassed by combining these tools. In the best case scenario, 138 families would be affected by the misclassification of neutral variants within the cohort of patients of the German Consortium for Hereditary Breast and Ovarian Cancer.
We show that due to low specificities state-of-the-art in silico
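    The four metrics reported in this record are standard functions of a 2×2 confusion matrix; for reference, a minimal implementation:

```python
import math

# Standard definitions of sensitivity, specificity, accuracy and the
# Matthews correlation coefficient (MCC) from true/false positives and
# negatives, as reported for the evaluated prediction tools.
def classification_metrics(tp, fp, tn, fn):
    sensitivity = tp / (tp + fn)                       # true positive rate
    specificity = tn / (tn + fp)                       # true negative rate
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom if denom else 0.0
    return sensitivity, specificity, accuracy, mcc
```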

  20. Prediction of the wear and evolution of cutting tools in a carbide / titanium-aluminum-vanadium machining tribosystem by volumetric tool wear characterization and modeling

    Science.gov (United States)

    Kuttolamadom, Mathew Abraham

    The objective of this research work is to create a comprehensive microstructural wear mechanism-based predictive model of tool wear in the tungsten carbide / Ti-6Al-4V machining tribosystem, and to develop a new topology characterization method for worn cutting tools in order to validate the model predictions. This is accomplished by blending first principle wear mechanism models using a weighting scheme derived from scanning electron microscopy (SEM) imaging and energy dispersive x-ray spectroscopy (EDS) analysis of tools worn under different operational conditions. In addition, the topology of worn tools is characterized through scanning by white light interferometry (WLI), and then application of an algorithm to stitch and solidify data sets to calculate the volume of the tool worn away. The methodology was to first combine and weight dominant microstructural wear mechanism models, to be able to effectively predict the tool volume worn away. Then, by developing a new metrology method for accurately quantifying the bulk-3D wear, the model-predicted wear was validated against worn tool volumes obtained from corresponding machining experiments. On analyzing worn crater faces using SEM/EDS, adhesion was found dominant at lower surface speeds, while dissolution wear dominated with increasing speeds -- this is in conformance with the lower relative surface speed requirement for micro welds to form and rupture, essentially defining the mechanical load limit of the tool material. It also conforms to the known dominance of high temperature-controlled wear mechanisms with increasing surface speed, which is known to exponentially increase temperatures especially when machining Ti-6Al-4V due to its low thermal conductivity. Thus, straight tungsten carbide wear when machining Ti-6Al-4V is mechanically-driven at low surface speeds and thermally-driven at high surface speeds. 
Further, at high surface speeds, craters were formed due to carbon diffusing to the tool surface and
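    The bulk-3D wear quantification described above (stitched white light interferometry height maps used to compute the volume worn away) can be sketched as a height-loss integral over the pixel grid; the clipping of negative losses and all parameter names are illustrative assumptions, not the dissertation's exact algorithm:

```python
import numpy as np

# Illustrative worn-volume computation from interferometry height maps:
# integrate the height loss between a reference (unworn) surface and the
# worn surface over the pixel grid. Negative differences (material above
# the reference, e.g. adhered workpiece) are clipped to zero here.
def worn_volume(ref_heights, worn_heights, pixel_area):
    loss = np.clip(ref_heights - worn_heights, 0, None)
    return loss.sum() * pixel_area
```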

  1. Ability of different screening tools to predict positive effect on nutritional intervention among the elderly in primary health care

    DEFF Research Database (Denmark)

    Beck, Anne Marie; Beermann, Tina; Kjær, Stine

    2013-01-01

    Routine identification of nutritional risk is paramount as the first stage in nutritional treatment of the elderly. The major focus of former validation studies of screening tools has been on their ability to predict undernutrition. The aim of this study was to validate the Mini Nutritional A...

  2. Using exposure prediction tools to link exposure and dosimetry for risk-based decisions: A case study with phthalates

    Science.gov (United States)

    A few different exposure prediction tools were evaluated for use in the new in vitro-based safety assessment paradigm using di-2-ethylhexyl phthalate (DEHP) and dibutyl phthalate (DnBP) as case compounds. Daily intake of each phthalate was estimated using both high-throughput (HT...

  3. USING FACEBOOK AS A TEACHING TOOL IN TEACHING FRENCH: EXAMPLE OF UNIVERSITY OF MERSIN / UTILISATION DE FACEBOOK COMME OUTIL DIDACTIQUE EN ENSEIGNEMENT DU FRANÇAIS: EXEMPLE D’UNIVERSITE DE MERSIN

    Directory of Open Access Journals (Sweden)

    Erdinç ASLAN

    2016-04-01

    Full Text Available This study examines the effects of using Facebook as a tool in French language instruction, analysing Facebook-based activities in a French-learning setting. To conduct the study, we set up a Facebook group with 23 students of the French Preparatory Program and the three teachers who lead the program in the Department of Translation and Interpreting, Faculty of Letters and Sciences, University of Mersin, during the 2014-2015 fall semester. In this closed group, the teachers shared course topics, images, texts and videos, and asked their students to answer questions posted in the group. The students also participated in the activities by commenting under the shared posts, answering the questions asked, and making their own posts on the platform. Course announcements were likewise shared in the group. In the application examples presented here, extracts of the students' work were anonymised to protect their right to privacy, and names and profile photos were hidden. During the application period, the observation technique was used; at the end of the application, a student survey was administered as part of the study. The content analysis technique was used to analyse the survey data.

  4. Assessment of Lightning Transients on a De-Iced Rotor Blade with Predictive Tools and Coaxial Return Measurements

    Science.gov (United States)

    Guillet, S.; Gosmain, A.; Ducoux, W.; Ponçon, M.; Fontaine, G.; Desseix, P.; Perraud, P.

    2012-05-01

    The increasing use of composite materials in aircraft primary structures has raised new problems in the field of safety of flight in lightning conditions. The consequences of this technological shift, which is occurring in a parallel context of the extension of electrified critical functions, are addressed by aircraft manufacturers through the enhancement of their available means of assessing lightning transients. On the one hand, simulation tools, provided an accurate description of the aircraft design is available, are today valuable assessment tools in both predictive and operative terms. On the other hand, in-house test means allow confirmation and consolidation of design-office hardening solutions. The combined use of predictive simulation tools and in-house test means offers efficient and reliable support for all aircraft developments at their various life-time stages. The present paper provides PREFACE research project results that illustrate the strategy introduced above on the de-icing system of the NH90 composite main rotor blade.

  5. Prediction Of Tensile And Shear Strength Of Friction Surfaced Tool Steel Deposit By Using Artificial Neural Networks

    Science.gov (United States)

    Manzoor Hussain, M.; Pitchi Raju, V.; Kandasamy, J.; Govardhan, D.

    2018-04-01

    Friction surfacing is a well-established solid-state technology used for depositing abrasion- and corrosion-protection coatings on rigid materials. This novel process has a wide range of industrial applications, particularly in the field of reclamation and repair of damaged and worn engineering components. In this paper, we present the prediction of tensile and shear strength of friction-surfaced tool steel using an ANN trained on simulated results of friction surfacing. The experiment was carried out to obtain tool steel coatings on low-carbon steel parts by varying the contributing process parameters, essentially friction pressure, rotational speed and welding speed. The simulation is performed with a 3³ factorial design that takes into account the maximum and minimum limits of the experimental work performed with the 2³ factorial design. Neural network structures, such as the Feed-Forward Neural Network (FFNN), were used to predict the tensile and shear strength of the friction-surfaced tool steel deposits.

  6. A novel tool to predict food intake: the Visual Meal Creator.

    Science.gov (United States)

    Holliday, Adrian; Batey, Chris; Eves, Frank F; Blannin, Andrew K

    2014-08-01

    Subjective appetite is commonly measured using an abstract visual analogue scale (VAS) technique, which provides no direct information about desired portion size or food choice. The purpose of this investigation was to develop and validate a user-friendly tool - the Visual Meal Creator (VIMEC) - that would allow for independent, repeated measures of subjective appetite and provide a prediction of food intake. Twelve participants experienced dietary control over a 5-hour period to manipulate hunger state on three occasions (small breakfast (SB) vs. large breakfast (LB) vs. large breakfast + snacks (LB+S)). Appetite measures were obtained every 60 minutes using the VIMEC and VAS. At 4.5 hours, participants were presented with an ad libitum test meal, from which energy intake (EI) was measured. The efficacy of the VIMEC was assessed by its ability to detect expected patterns of appetite and its strength as a predictor of energy intake. Day-to-day reproducibility and test-retest repeatability were assessed. Between- and within-condition differences in VAS and VIMEC scores (represented as mm and kcal of the "created" meal, respectively) were significantly correlated with one another throughout. Between- and within-condition changes in appetite scores obtained with the VIMEC exhibited a stronger correlation with EI at the test meal than those obtained with VAS. Pearson correlation coefficients for within-condition comparisons were 0.951, 0.914 and 0.875 (all p < 0.001) for SB, LB and LB+S respectively. Correlation coefficients for between-condition differences in VIMEC and EI were 0.273, 0.940 (p < 0.001) and 0.525 (p < 0.05) for SB - LB+S, SB - LB and LB - LB+S respectively. The VIMEC exhibited a similar degree of reproducibility to VAS. These findings suggest that the VIMEC is a stronger predictor of energy intake than VAS. Copyright © 2014 Elsevier Ltd. All rights reserved.
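    The condition-wise associations reported above are Pearson product-moment correlations; a minimal, self-contained sketch of the computation (the VIMEC and intake values below are illustrative, not study data):

```python
def pearson_r(x, y):
    """Pearson product-moment correlation coefficient of two equal-length sequences."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = sum((a - mean_x) ** 2 for a in x) ** 0.5
    sd_y = sum((b - mean_y) ** 2 for b in y) ** 0.5
    return cov / (sd_x * sd_y)

# Hypothetical VIMEC "created meal" scores (kcal) vs. measured energy intake (kcal)
vimec = [450, 520, 610, 700, 820]
intake = [430, 540, 600, 690, 810]
r = pearson_r(vimec, intake)
```

    In practice a statistics library (e.g. `scipy.stats.pearsonr`) would also supply the p-values quoted in the abstract.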

  7. Mechanisms, Prediction, and Prevention of ACL Injuries: Cut Risk With Three Sharpened and Validated Tools

    Science.gov (United States)

    Hewett, Timothy E.; Myer, Gregory D.; Ford, Kevin R.; Paterno, Mark V.; Quatman, Carmen E.

    2017-01-01

    Economic and societal pressures influence modern medical practice to develop and implement prevention strategies. Anterior cruciate ligament (ACL) injury devastates the knee joint, leading to short-term disability and long-term sequelae. Due to the high risk of long-term osteoarthritis in all treatment populations following ACL injury, prevention is the only effective intervention for this life-altering disruption in knee health. The “Sequence of Prevention” Model provides a framework to monitor progress towards the ultimate goal of preventing ACL injuries. Utilizing this model, our multidisciplinary collaborative research team has spent the last decade working to delineate injury mechanisms, identify injury risk factors, predict which athletes are at risk of injury, and develop ACL injury prevention programs. Within this model of injury prevention, modifiable factors (biomechanical and neuromuscular) related to injury mechanisms likely provide the best opportunity for intervention strategies aimed at decreasing the risk of ACL injury, particularly in female athletes. Knowledge advancements have led to the development of potential solutions that allow athletes to compete with lowered risk of ACL injury. Design and integration of personalized clinical assessment tools and targeted prevention strategies for athletes at high risk of ACL injury may transform current prevention practices and ultimately significantly reduce ACL injury incidence. This 2016 OREF Clinical Research Award focuses on the authors' work and contributions to the field. The authors acknowledge the many research groups who have contributed to the current state of knowledge in the fields of ACL injury mechanisms, injury risk screening and injury prevention strategies. PMID:27612195

  8. Assessment of the predictive accuracy of five in silico prediction tools, alone or in combination, and two metaservers to classify long QT syndrome gene mutations.

    Science.gov (United States)

    Leong, Ivone U S; Stuckey, Alexander; Lai, Daniel; Skinner, Jonathan R; Love, Donald R

    2015-05-13

    Long QT syndrome (LQTS) is an autosomal dominant condition predisposing to sudden death from malignant arrhythmia. Genetic testing identifies many missense single nucleotide variants of uncertain pathogenicity. Establishing genetic pathogenicity is an essential prerequisite to family cascade screening. Many laboratories use in silico prediction tools, either alone or in combination, or metaservers, in order to predict pathogenicity; however, their accuracy in the context of LQTS is unknown. We evaluated the accuracy of five in silico programs and two metaservers in the analysis of LQTS 1-3 gene variants. The in silico tools SIFT, PolyPhen-2, PROVEAN, SNPs&GO and SNAP, either alone or in all possible combinations, and the metaservers Meta-SNP and PredictSNP, were tested on 312 KCNQ1, KCNH2 and SCN5A gene variants that have previously been characterised by either in vitro or co-segregation studies as either "pathogenic" (283) or "benign" (29). The accuracy, sensitivity, specificity and Matthews Correlation Coefficient (MCC) were calculated to determine the best combination of in silico tools for each LQTS gene, and when all genes are combined. The best combination of in silico tools for KCNQ1 is PROVEAN, SNPs&GO and SIFT (accuracy 92.7%, sensitivity 93.1%, specificity 100% and MCC 0.70). The best combination of in silico tools for KCNH2 is SIFT and PROVEAN or PROVEAN, SNPs&GO and SIFT. Both combinations have the same scores for accuracy (91.1%), sensitivity (91.5%), specificity (87.5%) and MCC (0.62). In the case of SCN5A, SNAP and PROVEAN provided the best combination (accuracy 81.4%, sensitivity 86.9%, specificity 50.0%, and MCC 0.32). When all three LQT genes are combined, SIFT, PROVEAN and SNAP is the combination with the best performance (accuracy 82.7%, sensitivity 83.0%, specificity 80.0%, and MCC 0.44). Both metaservers performed better than the single in silico tools; however, they did not perform better than the best performing combination of in silico
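    The accuracy, sensitivity, specificity and MCC figures quoted for each tool combination all derive from a 2x2 confusion matrix; a minimal sketch (the counts in the usage example are illustrative, not the study's data):

```python
import math

def classification_metrics(tp, tn, fp, fn):
    """Accuracy, sensitivity, specificity and Matthews Correlation Coefficient
    from true/false positive and negative counts."""
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom if denom else 0.0
    return accuracy, sensitivity, specificity, mcc

# Illustrative counts for a heavily imbalanced variant set
acc, sens, spec, mcc = classification_metrics(tp=260, tn=23, fp=6, fn=23)
```

    With the study's class imbalance (283 pathogenic vs. 29 benign variants), MCC is the more informative summary than raw accuracy, which is why both are reported.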

  9. Modelling energy utilisation in broiler breeder hens.

    Science.gov (United States)

    Rabello, C B V; Sakomura, N K; Longo, F A; Couto, H P; Pacheco, C R; Fernandes, J B K

    2006-10-01

    1. The objective of this study was to determine a metabolisable energy (ME) requirement model for broiler breeder hens. The influence of temperature on ME requirements for maintenance was determined in experiments conducted in three environmental rooms with temperatures kept constant at 13, 21 and 30 degrees C, using a comparative slaughter technique. The energy requirements for weight gain were determined from body energy content and the efficiency of energy utilisation for weight gain. The energy requirements for egg production were determined from egg energy content and the efficiency of energy deposition in the eggs. 2. The following model was developed using these results: ME = W^0.75 x (806.53 - 26.45T + 0.50T^2) + 31.90G + 10.04EM, where W^0.75 is body weight (kg) raised to the power 0.75, T is temperature (degrees C), G is weight gain (g) and EM is egg mass (g). 3. A feeding trial was conducted using 400 Hubbard Hi-Yield broiler breeder hens and 40 Peterson males from 31 to 46 weeks of age in order to compare use of the model with the feeding programme recommended for this strain of bird. Applying the model to breeder hens gave good productive and reproductive performance, and better feed and energy conversion, than feeding hens according to the strain recommendation. In conclusion, the model predicted an ME intake which matched breeder hens' requirements.
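    The fitted model can be evaluated directly; a minimal sketch, with units as given in the abstract (body weight in kg, T in degrees C, G and EM in g) and illustrative inputs:

```python
def me_requirement(body_weight_kg, temp_c, gain_g, egg_mass_g):
    """Daily ME requirement of a broiler breeder hen from the fitted model:
    ME = W^0.75 * (806.53 - 26.45*T + 0.50*T^2) + 31.90*G + 10.04*EM."""
    metabolic_weight = body_weight_kg ** 0.75
    maintenance = metabolic_weight * (806.53 - 26.45 * temp_c + 0.50 * temp_c ** 2)
    return maintenance + 31.90 * gain_g + 10.04 * egg_mass_g

# Illustrative case: a 3.5 kg hen at 21 degrees C, gaining 5 g/day, laying 50 g/day egg mass
me = me_requirement(3.5, 21, 5, 50)
```

    Note that the quadratic maintenance term is minimised near T = 26.45 degrees C (the vertex of the parabola), so within the tested range the maintenance requirement falls as temperature rises from 13 to 21 degrees C.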

  10. Numerical Weather Prediction Models on Linux Boxes as tools in meteorological education in Hungary

    Science.gov (United States)

    Gyongyosi, A. Z.; Andre, K.; Salavec, P.; Horanyi, A.; Szepszo, G.; Mille, M.; Tasnadi, P.; Weidiger, T.

    2012-04-01

    Numerical modelling has become a common tool in the daily practice of weather forecasters due to i) increasing customer demand for weather data, ii) the growth in computer resources, iii) numerical weather prediction systems available for integration on affordable, off-the-shelf computers and iv) available input data (from ECMWF or NCEP) for model integrations. Beside learning the theoretical basis, since last year students in their MSc or BSc thesis research, or in student research projects, have had the opportunity to run numerical models and to analyse the outputs for different purposes, including wind energy estimation, simulation of the dynamics of polar lows and subtropical cyclones, analysis of the isentropic potential vorticity field, examination of coupled atmospheric dispersion models, etc. A special course on the application of numerical modelling is being announced for the upcoming semester for our students in order to improve their skills in this field. Several numerical model systems (NRIPR, ETA and WRF) have been adapted at the University, and WRF has been tested and used for the geographical region of the Carpathian Basin. Recently ALADIN/CHAPEAU, the academic version of the ARPEGE-ALADIN cy33t1 meso-scale numerical weather prediction model system, has been installed at our Institute. ALADIN is the operational forecasting model of the Hungarian Meteorological Service and is developed in the framework of the international ALADIN co-operation. Our main objectives are i) the analysis of different typical weather situations, ii) fine tuning of parameterization schemes and iii) comparison of the ALADIN/CHAPEAU and WRF model outputs based on case studies. The necessary hardware and software innovations have been done. 
In the presentation the

  11. Parametric Optimization and Prediction Tool for Excavation and Prospecting Tasks, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Honeybee Robotics therefore proposed to develop a software tool for facilitating prospecting and excavation system trades in support of selecting an optimal...

  12. Utilisation of the buffy coat technique and an antibody-detection ELISA as tools for assessing the impact of trypanosomosis on health and productivity of N'Dama cattle

    International Nuclear Information System (INIS)

    Faye, J.A.; Mattioli, R.C.

    2000-01-01

    The buffy coat technique (BCT), a parasitological test, and an indirect antibody ELISA (Ab-ELISA) were used to detect trypanosome infections in blood and serum samples, respectively, collected from N'Dama cattle exposed to natural high tsetse challenge. These two diagnostic tools were also utilised to assess trypanosomal status in sequentially collected blood and serum samples from two groups of 5 N'Dama cattle each, experimentally challenged with Trypanosoma congolense and T. vivax. In both studies, packed red cell volume (PCV) and live weight were measured. The specificity of the Ab-ELISA was computed by testing approximately 70 serum samples obtained from a cattle population kept under zero tsetse challenge. The specificity was found to be 95.8% for T. vivax and 97.1% for T. congolense. In the field study, 3.9% (12/310) of blood samples was parasitologically positive. In corresponding serum samples the prevalence of positive trypanosome sero-reactors was 54.8% (170/310). However, antibodies against trypanosomes persisted in serum when blood samples were no longer parasitologically positive. In both blood and serum samples, T. vivax was found to be the main infecting species. The sensitivity of the Ab-ELISA for T. vivax was 81.8%. Due to the extremely low number of T. congolense infections (only one), as detected by BCT, the sensitivity for that trypanosome species was not computed. In the experimentally challenged cattle, 80% (24/30) and 33.3% (10/30) of blood samples were BCT positive for T. congolense and T. vivax, respectively. Antibodies in corresponding sera were present in 69% (20/29) and 96.3% (26/27) of animals challenged with T. congolense and T. vivax, respectively. The serological assay for T. congolense antibody detection exhibited high cross-reactivity with T. vivax antigens, as assessed in sera collected from T. vivax infected animals. In the field study, cattle showing the presence of antibodies against T. congolense and/or T. vivax had

  13. Effective utilisation of generation Y Quantity Surveyors

    African Journals Online (AJOL)

    together and tested by means of open interview discussions with senior QS professionals. ... employers could better utilise generation Y employees. 2. Literature review .... Literature was reviewed by using search engines (Emerald, Business.

  14. Development of Web tools to predict axillary lymph node metastasis and pathological response to neoadjuvant chemotherapy in breast cancer patients.

    Science.gov (United States)

    Sugimoto, Masahiro; Takada, Masahiro; Toi, Masakazu

    2014-12-09

    Nomograms are a standard computational tool to predict the likelihood of an outcome using multiple available patient features. We have developed a more powerful data mining methodology to predict axillary lymph node (AxLN) metastasis and response to neoadjuvant chemotherapy (NAC) in primary breast cancer patients, and developed websites through which these tools can be used. The tools calculate the probability of AxLN metastasis (AxLN model) and pathological complete response to NAC (NAC model). As the calculation algorithm, we employed a decision tree-based prediction model known as the alternating decision tree (ADTree), an extension of if-then type decision trees. An ensemble technique was used to combine multiple ADTree predictions, resulting in higher generalization ability and robustness against missing values. The AxLN model was developed with training datasets (n=148) and test datasets (n=143), and validated using an independent cohort (n=174), yielding an area under the receiver operating characteristic curve (AUC) of 0.768. The NAC model was developed and validated with n=150 and n=173 datasets from a randomized controlled trial, yielding an AUC of 0.787. The AxLN and NAC models require users to input up to 17 and 16 variables, respectively. These include pathological features, such as human epidermal growth factor receptor 2 (HER2) status, and imaging findings. Each input variable has an option of "unknown," to facilitate prediction for cases with missing values. The websites developed facilitate the use of these tools, and serve as a database for accumulating new datasets.

  15. Developing Lightning Prediction Tools for the CCAFS Dual-Polarimetric Radar

    Science.gov (United States)

    Petersen, W. A.; Carey, L. D.; Deierling, W.; Johnson, E.; Bateman, M.

    2009-01-01

    NASA Marshall Space Flight Center and the University of Alabama Huntsville are collaborating with the 45th Weather Squadron (45WS) to develop improved lightning prediction capabilities for the new C-band dual-polarimetric weather radar being acquired for use by 45WS and launch weather forecasters at Cape Canaveral Air Force Station (CCAFS). In particular, these algorithms will focus on lightning onset, cessation and combined lightning-radar applications for convective winds assessment. Research using radar reflectivity (Z) data for prediction of lightning onset has been extensively discussed in the literature and subsequently applied by launch weather forecasters as it pertains to lightning nowcasting. Currently the forecasters apply a relatively straightforward but effective temperature-Z threshold algorithm for assessing the likelihood of lightning onset in a given storm. In addition, a layered VIL above the freezing level product is used as automated guidance for the onset of lightning. Only limited research and field work has been conducted on lightning cessation using Z and vertically-integrated Z for determining cessation. Though not used operationally, vertically-integrated Z (the basis for VIL) has recently shown promise as a tool for use in nowcasting lightning cessation. The work discussed herein leverages and expands upon these and similar reflectivity-threshold approaches via the application/addition of over two decades of polarimetric radar research focused on distinct multi-parameter radar signatures of ice/mixed-phase initiation and ice-crystal orientation in highly electrified convective clouds. Specifically, our approach is based on numerous previous studies that have observed repeatable patterns in the behavior of the vertical hydrometeor column as it relates to the temporal evolution of differential reflectivity and depolarization (manifested in either LDR or ρ_hv), development of in-situ mixed and ice phase microphysics, electric fields, and

  16. Become the PPUPET Master: Mastering Pressure Ulcer Risk Assessment With the Pediatric Pressure Ulcer Prediction and Evaluation Tool (PPUPET).

    Science.gov (United States)

    Sterken, David J; Mooney, JoAnn; Ropele, Diana; Kett, Alysha; Vander Laan, Karen J

    2015-01-01

    Hospital acquired pressure ulcers (HAPU) are serious, debilitating, and preventable complications in all inpatient populations. Despite evidence of the development of pressure ulcers in the pediatric population, minimal research has been done. Based on observations gathered during quarterly HAPU audits, bedside nursing staff recognized trends in pressure ulcer locations that were not captured using current pressure ulcer risk assessment tools. Together, bedside nurses and nursing leadership created and conducted multiple research studies to investigate the validity and reliability of the Pediatric Pressure Ulcer Prediction and Evaluation Tool (PPUPET). Copyright © 2015 Elsevier Inc. All rights reserved.

  17. Environmental assessment of incinerator residue utilisation

    OpenAIRE

    Toller, Susanna

    2008-01-01

     In Sweden, utilisation of incinerator residues outside disposal areas is restricted by environmental concerns, as such residues commonly contain greater amounts of potentially toxic trace elements than the natural materials they replace. On the other hand, utilisation can also provide environmental benefits by decreasing the need for landfill and reducing raw material extraction. This thesis provides increased knowledge and proposes better approaches for environmental assessment of incinerat...

  18. Enhancing Cloud Resource Utilisation using Statistical Analysis

    OpenAIRE

    Sijin He; Li Guo; Yike Guo

    2014-01-01

    Resource provisioning based on virtual machine (VM) has been widely accepted and adopted in cloud computing environments. A key problem resulting from using static scheduling approaches for allocating VMs on different physical machines (PMs) is that resources tend to be not fully utilised. Although some existing cloud reconfiguration algorithms have been developed to address the problem, they normally result in high migration costs and low resource utilisation due to ignoring the multi-dimens...

  19. Tool Wear Prediction in Ti-6Al-4V Machining through Multiple Sensor Monitoring and PCA Features Pattern Recognition

    Directory of Open Access Journals (Sweden)

    Alessandra Caggiano

    2018-03-01

    Full Text Available Machining of titanium alloys is characterised by extremely rapid tool wear due to the high cutting temperature and the strong adhesion at the tool-chip and tool-workpiece interface, caused by the low thermal conductivity and high chemical reactivity of Ti alloys. With the aim of monitoring the tool conditions during dry turning of Ti-6Al-4V alloy, a machine learning procedure based on the acquisition and processing of cutting force, acoustic emission and vibration sensor signals during turning is implemented. A number of sensorial features are extracted from the acquired sensor signals in order to feed machine learning paradigms based on artificial neural networks. To reduce the large dimensionality of the sensorial features, an advanced feature extraction methodology based on Principal Component Analysis (PCA) is proposed. PCA allowed the identification of a smaller number of features (k = 2 features), the principal component scores, obtained through linear projection of the original d features into a new space with reduced dimensionality k = 2, sufficient to describe the variance of the data. By feeding artificial neural networks with the PCA features, an accurate diagnosis of tool flank wear (VBmax) was achieved, with predicted values very close to the measured tool wear values.
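    The projection onto k = 2 principal component scores described above can be sketched with an SVD-based PCA; the feature matrix below is synthetic, standing in for the real cutting force, acoustic emission and vibration features:

```python
import numpy as np

def pca_scores(X, k=2):
    """Project d-dimensional feature vectors onto the first k principal components."""
    X_centered = X - X.mean(axis=0)
    # The right singular vectors of the centred data matrix are the principal axes,
    # ordered by decreasing explained variance
    _, _, Vt = np.linalg.svd(X_centered, full_matrices=False)
    return X_centered @ Vt[:k].T  # principal component scores

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 12))   # 100 signal windows x 12 sensorial features (synthetic)
scores = pca_scores(X, k=2)
```

    The two score columns would then feed the neural network in place of the original d features, which is the dimensionality reduction the abstract describes.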

  20. Tool Wear Prediction in Ti-6Al-4V Machining through Multiple Sensor Monitoring and PCA Features Pattern Recognition.

    Science.gov (United States)

    Caggiano, Alessandra

    2018-03-09

    Machining of titanium alloys is characterised by extremely rapid tool wear due to the high cutting temperature and the strong adhesion at the tool-chip and tool-workpiece interface, caused by the low thermal conductivity and high chemical reactivity of Ti alloys. With the aim of monitoring the tool conditions during dry turning of Ti-6Al-4V alloy, a machine learning procedure based on the acquisition and processing of cutting force, acoustic emission and vibration sensor signals during turning is implemented. A number of sensorial features are extracted from the acquired sensor signals in order to feed machine learning paradigms based on artificial neural networks. To reduce the large dimensionality of the sensorial features, an advanced feature extraction methodology based on Principal Component Analysis (PCA) is proposed. PCA allowed the identification of a smaller number of features (k = 2 features), the principal component scores, obtained through linear projection of the original d features into a new space with reduced dimensionality k = 2, sufficient to describe the variance of the data. By feeding artificial neural networks with the PCA features, an accurate diagnosis of tool flank wear (VBmax) was achieved, with predicted values very close to the measured tool wear values.

  1. Tool Wear Prediction in Ti-6Al-4V Machining through Multiple Sensor Monitoring and PCA Features Pattern Recognition

    Science.gov (United States)

    2018-01-01

    Machining of titanium alloys is characterised by extremely rapid tool wear due to the high cutting temperature and the strong adhesion at the tool-chip and tool-workpiece interface, caused by the low thermal conductivity and high chemical reactivity of Ti alloys. With the aim of monitoring the tool conditions during dry turning of Ti-6Al-4V alloy, a machine learning procedure based on the acquisition and processing of cutting force, acoustic emission and vibration sensor signals during turning is implemented. A number of sensorial features are extracted from the acquired sensor signals in order to feed machine learning paradigms based on artificial neural networks. To reduce the large dimensionality of the sensorial features, an advanced feature extraction methodology based on Principal Component Analysis (PCA) is proposed. PCA allowed the identification of a smaller number of features (k = 2 features), the principal component scores, obtained through linear projection of the original d features into a new space with reduced dimensionality k = 2, sufficient to describe the variance of the data. By feeding artificial neural networks with the PCA features, an accurate diagnosis of tool flank wear (VBmax) was achieved, with predicted values very close to the measured tool wear values. PMID:29522443

  2. Airport Gate Activity Monitoring Tool Suite for Improved Turnaround Prediction, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this research is to create a suite of tools for monitoring airport gate activities with the objective of improving aircraft turnaround. Airport ramp...

  3. Integrated Decision Tools for Sustainable Watershed/Ground Water and Crop Health using Predictive Weather, Remote Sensing, and Irrigation Decision Tools

    Science.gov (United States)

    Jones, A. S.; Andales, A.; McGovern, C.; Smith, G. E. B.; David, O.; Fletcher, S. J.

    2017-12-01

    US agricultural and Govt. lands have a unique co-dependent relationship, particularly in the Western US. More than 30% of all irrigated US agricultural output comes from lands sustained by the Ogallala Aquifer in the western Great Plains. Six US Forest Service National Grasslands reside within the aquifer region, consisting of over 375,000 ha (3,759 km2) of USFS managed lands. Likewise, National Forest lands are the headwaters of many intensive agricultural regions. Our Ogallala Aquifer team is enhancing crop irrigation decision tools with predictive weather and remote sensing data to better manage water for irrigated crops within these regions. An integrated multi-model software framework is used to link irrigation decision tools, resulting in positive management benefits for natural water resources. Teams and teams-of-teams can build upon these multi-disciplinary, multi-faceted modeling capabilities. For example, the CSU Catalyst for Innovative Partnerships program has formed a new multidisciplinary team that will address "Rural Wealth Creation", focusing on the many integrated links between economics, agricultural production and management, natural resource availability, and key social aspects of govt. policy recommendations. By enhancing tools like these with predictive weather and other related data (such as in situ measurements, hydrologic models, remotely sensed data sets, and, in the near future, links to agro-economic and life cycle assessment models), this work demonstrates an integrated, data-driven future vision of inter-meshed dynamic systems that can address challenging multi-system problems. We will present the current state of the work and opportunities for future involvement.

  4. A Metric Tool for Predicting Source Code Quality from a PDL Design

    OpenAIRE

    Henry, Sallie M.; Selig, Calvin

    1987-01-01

    The software crisis has increased the demand for automated tools to assist software developers in the production of quality software. Quality metrics have given software developers a tool to measure software quality. These measurements, however, are available only after the software has been produced. Due to high cost, software managers are reluctant to redesign and reimplement low-quality software. Ideally, a life cycle which allows early measurement of software quality is a necessary ingre...

  5. PCTFPeval: a web tool for benchmarking newly developed algorithms for predicting cooperative transcription factor pairs in yeast.

    Science.gov (United States)

    Lai, Fu-Jou; Chang, Hong-Tsun; Wu, Wei-Sheng

    2015-01-01

    Computational identification of cooperative transcription factor (TF) pairs helps understand the combinatorial regulation of gene expression in eukaryotic cells. Many advanced algorithms have been proposed to predict cooperative TF pairs in yeast. However, it is still difficult to conduct a comprehensive and objective performance comparison of different algorithms because sufficient performance indices and adequate overall performance scores are lacking. To solve this problem, in our previous study (published in BMC Systems Biology 2014), we adopted/proposed eight performance indices and designed two overall performance scores to compare the performance of 14 existing algorithms for predicting cooperative TF pairs in yeast. Most importantly, our performance comparison framework can be applied to comprehensively and objectively evaluate the performance of a newly developed algorithm. However, to use our framework, researchers first have to put considerable effort into constructing it. To save researchers time and effort, here we develop a web tool that implements our performance comparison framework, featuring fast data processing, a comprehensive performance comparison and an easy-to-use web interface. The developed tool is called PCTFPeval (Predicted Cooperative TF Pair evaluator), written in the PHP and Python programming languages. The friendly web interface allows users to input a list of predicted cooperative TF pairs from their algorithm and select (i) the compared algorithms among the 15 existing algorithms, (ii) the performance indices among the eight existing indices, and (iii) the overall performance scores from two possible choices. The comprehensive performance comparison results are then generated in tens of seconds and shown as both bar charts and tables. The original comparison results of each compared algorithm and each selected performance index can be downloaded as text files for further analyses.
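    The core of such a benchmarking comparison can be sketched in a few lines of Python. Precision, recall and F1 stand in for the eight indices, which the abstract does not enumerate, and the TF pair lists below are hypothetical:

```python
def evaluate_pairs(predicted, gold):
    """Score predicted cooperative TF pairs against a benchmark set.
    Pairs are unordered, so each is normalized to a sorted tuple."""
    norm = lambda pairs: {tuple(sorted(p)) for p in pairs}
    pred, ref = norm(predicted), norm(gold)
    tp = len(pred & ref)  # pairs found in both sets
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(ref) if ref else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return {"tp": tp, "precision": precision, "recall": recall, "f1": f1}
```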

  6. LoopIng: a template-based tool for predicting the structure of protein loops.

    KAUST Repository

    Messih, Mario Abdel; Lepore, Rosalba; Tramontano, Anna

    2015-01-01

    ) and significant enhancements for long loops (11-20 residues). The quality of the predictions is robust to errors that unavoidably affect the stem regions when these are modeled. The method returns a confidence score for the predicted template loops and has

  7. Omics AnalySIs System for PRecision Oncology (OASISPRO): A Web-based Omics Analysis Tool for Clinical Phenotype Prediction.

    Science.gov (United States)

    Yu, Kun-Hsing; Fitzpatrick, Michael R; Pappas, Luke; Chan, Warren; Kung, Jessica; Snyder, Michael

    2017-09-12

    Precision oncology is an approach that accounts for individual differences to guide cancer management. Omics signatures have been shown to predict clinical traits for cancer patients. However, the vast amount of omics information poses an informatics challenge in systematically identifying patterns associated with health outcomes, and no general-purpose data-mining tool exists for physicians, medical researchers, and citizen scientists without significant training in programming and bioinformatics. To bridge this gap, we built the Omics AnalySIs System for PRecision Oncology (OASISPRO), a web-based system to mine the quantitative omics information from The Cancer Genome Atlas (TCGA). This system effectively visualizes patients' clinical profiles, executes machine-learning algorithms of choice on the omics data, and evaluates the prediction performance using held-out test sets. With this tool, we successfully identified genes strongly associated with tumor stage, and accurately predicted patients' survival outcomes in many cancer types, including mesothelioma and adrenocortical carcinoma. By identifying the links between omics and clinical phenotypes, this system will facilitate omics studies on precision cancer medicine and contribute to establishing personalized cancer treatment plans. This web-based tool is available at http://tinyurl.com/oasispro ;source codes are available at http://tinyurl.com/oasisproSourceCode . © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  8. bTSSfinder: a novel tool for the prediction of promoters in Cyanobacteria and Escherichia coli

    KAUST Repository

    Shahmuradov, Ilham Ayub

    2016-09-29

    Motivation: The computational search for promoters in prokaryotes remains an attractive problem in bioinformatics. Despite the attention it has received for many years, the problem has not been addressed satisfactorily. In any bacterial genome, the transcription start site is chosen mostly by the sigma (σ) factor proteins, which control gene activation. The majority of published bacterial promoter prediction tools target σ70 promoters in Escherichia coli. Moreover, no σ-specific classification of promoters is available for prokaryotes other than E. coli. Results: Here, we introduce bTSSfinder, a novel tool that predicts putative promoters for five classes of σ factors in Cyanobacteria (σA, σC, σH, σG and σF) and for five classes of σ factors in E. coli (σ70, σ38, σ32, σ28 and σ24). bTSSfinder achieves higher accuracy (MCC=0.86, F1-score=0.93) than the next best currently available tool (MCC=0.59, F1-score=0.79) and covers multiple classes of promoters.
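    Both figures quoted here derive directly from the four confusion-matrix counts. A minimal sketch (the counts below are invented for illustration, not taken from the bTSSfinder evaluation):

```python
import math

def classification_scores(tp, fp, tn, fn):
    """Matthews correlation coefficient and F1-score from confusion counts."""
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom if denom else 0.0
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return mcc, f1
```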

  9. Congruency in the prediction of pathogenic missense mutations: state-of-the-art web-based tools.

    Science.gov (United States)

    Castellana, Stefano; Mazza, Tommaso

    2013-07-01

    A remarkable degree of genetic variation has been found in the protein-encoding regions of DNA through deep sequencing of samples obtained from thousands of subjects from several populations. Approximately half of the 20 000 single nucleotide polymorphisms present, even in normal healthy subjects, are nonsynonymous amino acid substitutions that could potentially affect protein function. The greatest challenges currently facing investigators are data interpretation and the development of strategies to identify the few gene-coding variants that actually cause or confer susceptibility to disease. A confusing array of options is available to address this problem. Unfortunately, the overall accuracy of these tools at ultraconserved positions is low, and predictions generated by current computational tools may mislead researchers involved in downstream experimental and clinical studies. First, we have presented an updated review of these tools and their primary functionalities, focusing on those that are naturally prone to analyze massive variant sets, to infer some interesting similarities among their results. Additionally, we have evaluated the prediction congruency for real whole-exome sequencing data in a proof-of-concept study on some of these web-based tools.

  10. Modeling and evaluating of surface roughness prediction in micro-grinding on soda-lime glass considering tool characterization

    Science.gov (United States)

    Cheng, Jun; Gong, Yadong; Wang, Jinsheng

    2013-11-01

    Current research on micro-grinding mainly focuses on the optimal processing technology for different materials. However, the material removal mechanism in micro-grinding is the basis for achieving a high quality processed surface. Therefore, a novel method for predicting surface roughness in micro-grinding of hard brittle materials, considering the grain protrusion topography of the micro-grinding tool, is proposed in this paper. The differences in material removal mechanism between the conventional grinding process and the micro-grinding process are analyzed. Topography characterization has been performed on micro-grinding tools fabricated by electroplating. Models of grain density generation and grain interval are built, and a new prediction model of micro-grinding surface roughness is developed. In order to verify the precision and applicability of the proposed surface roughness prediction model, an orthogonal micro-grinding experiment on soda-lime glass is designed and conducted. A series of micro-machined surfaces with roughness from 78 nm to 0.98 μm is achieved on the brittle material. The experimental roughness results agree closely with the predicted roughness data, and the model variable describing size effects is calculated to be 1.5×10^7 by the reverse method based on the experimental results. The proposed model builds a distribution set to account for grain densities at different protrusion heights, and the micro-grinding tools used in the experiment are characterized based on this distribution set. The significant agreement between the predictions of the proposed model and the experimental measurements demonstrates the effectiveness of the model.

  11. Predictive value of the DASH tool for predicting return to work of injured workers with musculoskeletal disorders of the upper extremity.

    Science.gov (United States)

    Armijo-Olivo, Susan; Woodhouse, Linda J; Steenstra, Ivan A; Gross, Douglas P

    2016-12-01

    To determine whether the Disabilities of the Arm, Shoulder, and Hand (DASH) tool added to the predictive ability of established prognostic factors, including patient demographics and clinical outcomes, to predict return to work (RTW) in injured workers with musculoskeletal (MSK) disorders of the upper extremity. A retrospective cohort study was conducted using a population-based database from the Workers' Compensation Board of Alberta (WCB-Alberta) focused on claimants with upper extremity injuries. Besides the DASH, potential predictors included demographic, occupational, clinical and health usage variables. The outcome was receipt of compensation benefits after 3 months. To identify RTW predictors, a purposeful logistic modelling strategy was used. A series of receiver operating characteristic curve analyses was performed to determine which model provided the best discriminative ability. The sample included 3036 claimants with upper extremity injuries. The final model for predicting RTW included the total DASH score in addition to other established predictors. The area under the curve for this model was 0.77, which is interpreted as fair discrimination. This model was statistically significantly different from the model of established predictors alone, but not from the other candidate models (p=0.34). The DASH tool together with other established predictors significantly helped predict RTW after 3 months in participants with upper extremity MSK disorders. An appealing result for clinicians and busy researchers is that DASH item 23 has predictive ability equal to the total DASH score. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
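    The area under the ROC curve reported here can be computed without any curve construction, via the Mann-Whitney interpretation: the AUC is the probability that a randomly chosen positive case is scored higher than a randomly chosen negative one. A sketch with hypothetical scores and labels:

```python
def roc_auc(scores, labels):
    """AUC as the Mann-Whitney statistic: the probability that a random
    positive outranks a random negative (ties count half)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```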

  12. Neuro-Simulation Tool for Enhanced Oil Recovery Screening and Reservoir Performance Prediction

    Directory of Open Access Journals (Sweden)

    Soheil Bahrekazemi

    2017-09-01

    Full Text Available Assessment of the suitable enhanced oil recovery method in an oilfield is one of the decisions which are made prior to the natural drive production mechanism. In some cases, having in-depth knowledge about reservoir’s rock, fluid properties, and equipment is needed as well as economic evaluation. Both putting such data into simulation and its related consequent processes are generally very time consuming and costly.  In order to reduce study cases, an appropriate tool is required for primary screening prior to any operations being performed, to which leads reduction of time in design of ether pilot section or production under field condition. In this research, two different and useful screening tools are presented through a graphical user interface. The output of just over 900 simulations and verified screening criteria tables were employed to design the mentioned tools. Moreover, by means of gathered data and development of artificial neural networks, two dissimilar screening tools for proper assessment of suitable enhanced oil recovery method were finally introduced. The first tool is about the screening of enhanced oil recovery process based on published tables/charts and the second one which is Neuro-Simulation tool, concerns economical evaluation of miscible and immiscible injection of carbon dioxide, nitrogen and natural gas into the reservoir. Both of designed tools are provided in the form of a graphical user interface by which the user, can perceive suitable method through plot of oil recovery graph during 20 years of production, costs of gas injection per produced barrel, cumulative oil production, and finally, design the most efficient scenario.

  13. A benchmarking tool to evaluate computer tomography perfusion infarct core predictions against a DWI standard.

    Science.gov (United States)

    Cereda, Carlo W; Christensen, Søren; Campbell, Bruce Cv; Mishra, Nishant K; Mlynash, Michael; Levi, Christopher; Straka, Matus; Wintermark, Max; Bammer, Roland; Albers, Gregory W; Parsons, Mark W; Lansberg, Maarten G

    2016-10-01

    Differences in research methodology have hampered the optimization of Computer Tomography Perfusion (CTP) for identification of the ischemic core. We aim to optimize CTP core identification using a novel benchmarking tool. The benchmarking tool consists of an imaging library and a statistical analysis algorithm to evaluate the performance of CTP. The tool was used to optimize and evaluate an in-house developed CTP-software algorithm. Imaging data of 103 acute stroke patients were included in the benchmarking tool. Median time from stroke onset to CT was 185 min (IQR 180-238), and the median time between completion of CT and start of MRI was 36 min (IQR 25-79). Volumetric accuracy of the CTP-ROIs was optimal at a specific rCBF threshold. The benchmarking tool can play an important role in optimizing CTP software as it provides investigators with a novel method to directly compare the performance of alternative CTP software packages. © The Author(s) 2015.
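    A toy version of the threshold-optimization step such a benchmarking tool performs: sweep candidate rCBF cutoffs and keep the one whose predicted core volume best matches the DWI reference. The voxel values and thresholds below are invented, and real CTP maps are 3D arrays rather than dicts:

```python
def core_at_threshold(rcbf, threshold):
    """Voxels whose relative CBF falls below the candidate cutoff."""
    return {vox for vox, value in rcbf.items() if value < threshold}

def best_threshold(rcbf, dwi_core, candidates):
    """Pick the rCBF cutoff whose predicted core volume (voxel count)
    is closest to the DWI-defined core volume."""
    return min(candidates,
               key=lambda t: abs(len(core_at_threshold(rcbf, t)) - len(dwi_core)))
```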

  14. FSPP: A Tool for Genome-Wide Prediction of smORF-Encoded Peptides and Their Functions

    Directory of Open Access Journals (Sweden)

    Hui Li

    2018-04-01

    Full Text Available smORFs are small open reading frames of less than 100 codons. Recent low-throughput experiments showed that many smORF-encoded peptides (SEPs) play crucial roles in processes such as regulation of transcription or translation, transport through membranes and antimicrobial activity. In order to gather more functional SEPs, genome-wide prediction tools are needed to guide low-throughput experiments. In this study, we put forward a functional smORF-encoded peptide predictor (FSPP) which aims to predict authentic SEPs and their functions in a high-throughput manner. FSPP uses the overlap of SEPs detected by Ribo-seq and mass spectrometry as target objects. With expression data on the transcription and translation levels, FSPP builds two co-expression networks. Combining co-location relations, FSPP constructs a compound network and then annotates SEPs with the functions of adjacent nodes. Tested on 38 sequenced samples of 5 human cell lines, FSPP successfully predicted 856 out of 960 annotated proteins. Interestingly, FSPP also highlighted 568 functional SEPs from these samples. After comparison, the roles predicted by FSPP were consistent with known functions. These results suggest that FSPP is a reliable tool for the identification of functional small peptides. FSPP source code can be acquired at https://www.bioinfo.org/FSPP.

  15. Predicting the Abrasion Resistance of Tool Steels by Means of Neurofuzzy Model

    Directory of Open Access Journals (Sweden)

    Dragutin Lisjak

    2013-07-01

    Full Text Available This work considers the use of neurofuzzy set theory to estimate the abrasion wear resistance of steels based on chemical composition, heat treatment (austenitising temperature, quenchant and tempering temperature), hardness after hardening and after tempering at different temperatures, and volume loss of materials according to ASTM G 65-94. Volume loss was tested for the following groups of materials, taken as the fuzzy data set: carbon tool steels, cold work tool steels, hot work tool steels and high-speed steels. The modelled adaptive neuro-fuzzy inference system (ANFIS) is compared to a statistical model of multivariable non-linear regression (MNLR). From the results it can be concluded that abrasion wear resistance can be estimated well for a steel whose volume loss is unknown, thus eliminating unnecessary testing.

  16. Predictive Models and Tools for Assessing Chemicals under the Toxic Substances Control Act (TSCA)

    Science.gov (United States)

    EPA has developed databases and predictive models to help evaluate the hazard, exposure, and risk of chemicals released to the environment and how workers, the general public, and the environment may be exposed to and affected by them.

  17. A New Tool for CME Arrival Time Prediction using Machine Learning Algorithms: CAT-PUMA

    Science.gov (United States)

    Liu, Jiajia; Ye, Yudong; Shen, Chenglong; Wang, Yuming; Erdélyi, Robert

    2018-03-01

    Coronal mass ejections (CMEs) are arguably the most violent eruptions in the solar system. CMEs can cause severe disturbances in interplanetary space and can even affect human activities in many aspects, causing damage to infrastructure and loss of revenue. Fast and accurate prediction of CME arrival time is vital to minimize the disruption that CMEs may cause when interacting with geospace. In this paper, we propose a new approach for partial-/full halo CME Arrival Time Prediction Using Machine learning Algorithms (CAT-PUMA). Via detailed analysis of the CME features and solar-wind parameters, we build a prediction engine taking advantage of 182 previously observed geo-effective partial-/full halo CMEs and using algorithms of the Support Vector Machine. We demonstrate that CAT-PUMA is accurate and fast. In particular, predictions made after applying CAT-PUMA to a test set unknown to the engine show a mean absolute prediction error of ∼5.9 hr within the CME arrival time, with 54% of the predictions having absolute errors less than 5.9 hr. Comparisons with other models reveal that CAT-PUMA has a more accurate prediction for 77% of the events investigated that can be carried out very quickly, i.e., within minutes of providing the necessary input parameters of a CME. A practical guide containing the CAT-PUMA engine and the source code of two examples are available in the Appendix, allowing the community to perform their own applications for prediction using CAT-PUMA.
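    The evaluation metric reported for CAT-PUMA (mean absolute error of the arrival-time prediction, plus the fraction of events falling within that error) is straightforward to reproduce; the times below are hypothetical, not from the 182-event catalog:

```python
def arrival_time_errors(predicted_hours, observed_hours):
    """Mean absolute error and the fraction of events predicted
    within that error (the abstract's two summary figures)."""
    errors = [abs(p - o) for p, o in zip(predicted_hours, observed_hours)]
    mae = sum(errors) / len(errors)
    within = sum(e < mae for e in errors) / len(errors)
    return mae, within
```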

  18. Artificial Intelligence Systems as Prognostic and Predictive Tools in Ovarian Cancer.

    Science.gov (United States)

    Enshaei, A; Robson, C N; Edmondson, R J

    2015-11-01

    The ability to provide accurate prognostic and predictive information to patients is becoming increasingly important as clinicians enter an era of personalized medicine. For a disease as heterogeneous as epithelial ovarian cancer, conventional algorithms become too complex for routine clinical use. This study therefore investigated the potential for an artificial intelligence model to provide this information and compared it with conventional statistical approaches. The authors created a database comprising 668 cases of epithelial ovarian cancer during a 10-year period and collected data routinely available in a clinical environment. They also collected survival data for all the patients, then constructed an artificial intelligence model capable of comparing a variety of algorithms and classifiers alongside conventional statistical approaches such as logistic regression. The model was used to predict overall survival and demonstrated that an artificial neural network (ANN) algorithm was capable of predicting survival with high accuracy (93 %) and an area under the curve (AUC) of 0.74 and that this outperformed logistic regression. The model also was used to predict the outcome of surgery and again showed that ANN could predict outcome (complete/optimal cytoreduction vs. suboptimal cytoreduction) with 77 % accuracy and an AUC of 0.73. These data are encouraging and demonstrate that artificial intelligence systems may have a role in providing prognostic and predictive data for patients. The performance of these systems likely will improve with increasing data set size, and this needs further investigation.

  19. A Popularity Based Prediction and Data Redistribution Tool for ATLAS Distributed Data Management

    CERN Document Server

    Beermann, T; The ATLAS collaboration; Maettig, P

    2014-01-01

    This paper presents a system to predict future data popularity for data-intensive systems, such as ATLAS distributed data management (DDM). Using these predictions it is possible to make a better distribution of data, helping to reduce the waiting time for jobs using this data. This system is based on a tracer infrastructure that is able to monitor and store historical data accesses and which is used to create popularity reports. These reports provide detailed summaries about data accesses in the past, including information about the accessed files, the involved users and the sites. From this past data it is then possible to make near-term forecasts of data popularity. The prediction system introduced in this paper makes use of both simple prediction methods and predictions made by neural networks. The best prediction method depends on the type of data, and the data is carefully filtered for use in either system. The second part of the paper introduces a system that effectively ...

  20. ToxiM: A Toxicity Prediction Tool for Small Molecules Developed Using Machine Learning and Chemoinformatics Approaches

    Directory of Open Access Journals (Sweden)

    Ashok K. Sharma

    2017-11-01

    Full Text Available The experimental methods for the prediction of molecular toxicity are tedious and time-consuming tasks. Thus, computational approaches could be used to develop alternative methods for toxicity prediction. We have developed a tool for the prediction of molecular toxicity along with the aqueous solubility and permeability of any molecule/metabolite. Using a comprehensive and curated set of toxin molecules as a training set, different chemical and structural features such as descriptors and fingerprints were exploited for feature selection, optimization and development of machine learning based classification and regression models. The compositional differences in the distribution of atoms were apparent between toxins and non-toxins, and hence, the molecular features were used for the classification and regression. On 10-fold cross-validation, the descriptor-based, fingerprint-based and hybrid-based classification models showed similar accuracy (93%) and Matthews correlation coefficient (0.84). The performances of all three models were comparable (Matthews correlation coefficient = 0.84–0.87) on the blind dataset. In addition, the regression-based models using descriptors as input features were also compared and evaluated on the blind dataset. The random forest based regression model for the prediction of solubility performed better (R2 = 0.84) than the multi-linear regression (MLR) and partial least square regression (PLSR) models, whereas the partial least squares based regression model for the prediction of permeability (Caco-2) performed better (R2 = 0.68) in comparison to the random forest and MLR based regression models. The performance of the final classification and regression models was evaluated using two validation datasets including known toxins and commonly used constituents of health products, which attests to its accuracy. The ToxiM web server would be a highly useful and reliable tool for the prediction of toxicity.
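    The R2 values quoted for the regression models follow the standard coefficient-of-determination formula; a minimal sketch with toy vectors:

```python
def r_squared(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot
```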

  1. Evaluation of the efficacy of nutritional screening tools to predict malnutrition in the elderly at a geriatric care hospital.

    Science.gov (United States)

    Baek, Myoung-Ha; Heo, Young-Ran

    2015-12-01

    Malnutrition in the elderly is a serious problem, prevalent in both hospitals and care homes. Due to the absence of a gold standard for malnutrition, herein we evaluate the efficacy of five nutritional screening tools developed or used for the elderly. Selected medical records of 141 elderly patients (86 men and 55 women, aged 73.5 ± 5.2 years) hospitalized at a geriatric care hospital were analyzed. Nutritional screening was performed using the following tools: Mini Nutritional Assessment (MNA), Mini Nutritional Assessment-Short Form (MNA-SF), Geriatric Nutritional Risk Index (GNRI), Malnutrition Universal Screening Tool (MUST) and Nutritional Risk Screening 2002 (NRS 2002). A combined index for malnutrition was also calculated as a reference tool. Each patient evaluated as malnourished to any degree or at risk of malnutrition according to at least four of the five aforementioned tools was categorized as malnourished in the combined index classification. According to the combined index, 44.0% of the patients were at risk of malnutrition to some degree, while the prevalence of nutritional risk and/or malnutrition varied greatly depending on the tool applied, ranging from 36.2% (MUST) to 72.3% (MNA-SF). MUST showed good validity (sensitivity 80.6%, specificity 98.7%) and almost perfect agreement (k = 0.81) with the combined index. In contrast, MNA-SF showed poor validity (sensitivity 100%, specificity 49.4%) and only moderate agreement (k = 0.46) with the combined index. MNA-SF was found to overestimate the nutritional risk in the elderly. MUST appeared to be the most valid and useful screening tool to predict malnutrition in the elderly at a geriatric care hospital.
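    The agreement statistic quoted above (Cohen's kappa, the k values) corrects observed agreement between two classifications for the agreement expected by chance. A sketch with hypothetical screening labels (1 = at risk, 0 = well nourished):

```python
def cohens_kappa(ratings_a, ratings_b):
    """Chance-corrected agreement between two classifications."""
    n = len(ratings_a)
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    categories = set(ratings_a) | set(ratings_b)
    expected = sum((ratings_a.count(c) / n) * (ratings_b.count(c) / n)
                   for c in categories)
    return 1.0 if expected == 1 else (observed - expected) / (1 - expected)
```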

  2. SMART-COP: a tool for predicting the need for intensive respiratory or vasopressor support in community-acquired pneumonia.

    Science.gov (United States)

    Charles, Patrick G P; Wolfe, Rory; Whitby, Michael; Fine, Michael J; Fuller, Andrew J; Stirling, Robert; Wright, Alistair A; Ramirez, Julio A; Christiansen, Keryn J; Waterer, Grant W; Pierce, Robert J; Armstrong, John G; Korman, Tony M; Holmes, Peter; Obrosky, D Scott; Peyrani, Paula; Johnson, Barbara; Hooy, Michelle; Grayson, M Lindsay

    2008-08-01

    Existing severity assessment tools, such as the pneumonia severity index (PSI) and CURB-65 (tool based on confusion, urea level, respiratory rate, blood pressure, and age ≥65 years), predict 30-day mortality in community-acquired pneumonia (CAP) and have limited ability to predict which patients will require intensive respiratory or vasopressor support (IRVS). The Australian CAP Study (ACAPS) was a prospective study of 882 episodes in which each patient had a detailed assessment of severity features, etiology, and treatment outcomes. Multivariate logistic regression was performed to identify features at initial assessment that were associated with receipt of IRVS. These results were converted into a simple points-based severity tool that was validated in 5 external databases, totaling 7464 patients. In ACAPS, 10.3% of patients received IRVS, and the 30-day mortality rate was 5.7%. The features statistically significantly associated with receipt of IRVS were low systolic blood pressure (2 points), multilobar chest radiography involvement (1 point), low albumin level (1 point), high respiratory rate (1 point), tachycardia (1 point), confusion (1 point), poor oxygenation (2 points), and low arterial pH (2 points): SMART-COP. A SMART-COP score of ≥3 points identified 92% of patients who received IRVS, including 84% of patients who did not need immediate admission to the intensive care unit. Accuracy was also high in the 5 validation databases. Sensitivities of PSI and CURB-65 for identifying the need for IRVS were 74% and 39%, respectively. SMART-COP is a simple, practical clinical tool for accurately predicting the need for IRVS that is likely to assist clinicians in determining CAP severity.
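    Because the abstract lists the point weights in full, the SMART-COP score itself can be sketched directly. Note that the clinical cutoffs defining each finding (e.g. what counts as "low" systolic blood pressure) are not given in the abstract, so the inputs here are assumed to be pre-judged boolean findings:

```python
# Point weights exactly as listed in the abstract.
SMART_COP_POINTS = {
    "low_systolic_bp": 2,
    "multilobar_involvement": 1,
    "low_albumin": 1,
    "high_respiratory_rate": 1,
    "tachycardia": 1,
    "confusion": 1,
    "poor_oxygenation": 2,
    "low_arterial_ph": 2,
}

def smart_cop(findings):
    """Sum the points for each positive finding; a score of >= 3
    flags a likely need for intensive respiratory or vasopressor support."""
    score = sum(SMART_COP_POINTS[f] for f in findings)
    return score, score >= 3
```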

  3. A Simplified Tool for Predicting the Thermal Behavior and the Energy Saving Potential of Ventilated Windows

    DEFF Research Database (Denmark)

    Zhang, Chen; Heiselberg, Per Kvols; Larsen, Olena Kalyanova

    2016-01-01

    Currently, studies of ventilated windows mainly rely on complex fluid and thermal simulation software, which requires extensive information and data and is very time consuming. The aim of this paper is to develop a simplified tool to assess the thermal behavior and energy performance of ventilated windows.

  4. Exploring Predictability of Instructor Ratings Using a Quantitative Tool for Evaluating Soft Skills among MBA Students

    Science.gov (United States)

    Brill, Robert T.; Gilfoil, David M.; Doll, Kristen

    2014-01-01

    Academic researchers have often touted the growing importance of "soft skills" for modern day business leaders, especially leadership and communication skills. Despite this growing interest and attention, relatively little work has been done to develop and validate tools to assess soft skills. Forty graduate students from nine MBA…

  5. Prediction of surface roughness in turning of Ti-6Al-4V using cutting parameters, forces and tool vibration

    Science.gov (United States)

    Sahu, Neelesh Kumar; Andhare, Atul B.; Andhale, Sandip; Raju Abraham, Roja

    2018-04-01

    Present work deals with prediction of surface roughness using cutting parameters along with in-process measured cutting force and tool vibration (acceleration) during turning of Ti-6Al-4V with cubic boron nitride (CBN) inserts. A full factorial design is used for the design of experiments, with cutting speed, feed rate and depth of cut as design variables. The prediction model for surface roughness is developed using response surface methodology (RSM) with cutting speed, feed rate, depth of cut, resultant cutting force and acceleration as control variables. Analysis of variance (ANOVA) is performed to find the significant terms in the model; insignificant terms are removed after a statistical test using the backward elimination approach. The effect of each control variable on surface roughness is also studied. A prediction correlation coefficient (R2pred) of 99.4% shows that the model explains the experimental results well and behaves robustly when factors are adjusted, added or eliminated. The model is validated with five fresh experiments using measured force and acceleration values. The average absolute error between the RSM model and the experimentally measured surface roughness is found to be 10.2%. Additionally, an artificial neural network (ANN) model is also developed for prediction of surface roughness, and the prediction results of the modified regression model are compared with the ANN. It is found that the RSM model and the ANN (average absolute error 7.5%) predict roughness with more than 90% accuracy. From the results obtained, it is found that including cutting force and vibration in the prediction of surface roughness gives better predictions than considering cutting parameters alone. Also, the ANN gives better predictions than the RSM models.
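    The average absolute error figures (10.2% for RSM, 7.5% for ANN) correspond to a mean absolute percentage error between measured and predicted roughness; a minimal sketch with invented measurements:

```python
def mean_abs_percent_error(measured, predicted):
    """Average absolute error relative to the measured values, in percent."""
    terms = [abs(m - p) / m for m, p in zip(measured, predicted)]
    return 100.0 * sum(terms) / len(terms)
```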

  6. Insights into an original pocket-ligand pair classification: a promising tool for ligand profile prediction.

    Directory of Open Access Journals (Sweden)

    Stéphanie Pérot

    Full Text Available Pockets are today at the cornerstones of modern drug discovery projects and at the crossroad of several research fields, from structural biology to mathematical modeling. Being able to predict if a small molecule could bind to one or more protein targets or if a protein could bind to some given ligands is very useful for drug discovery endeavors, anticipation of binding to off- and anti-targets. To date, several studies explore such questions from chemogenomic approach to reverse docking methods. Most of these studies have been performed either from the viewpoint of ligands or targets. However it seems valuable to use information from both ligands and target binding pockets. Hence, we present a multivariate approach relating ligand properties with protein pocket properties from the analysis of known ligand-protein interactions. We explored and optimized the pocket-ligand pair space by combining pocket and ligand descriptors using Principal Component Analysis and developed a classification engine on this paired space, revealing five main clusters of pocket-ligand pairs sharing specific and similar structural or physico-chemical properties. These pocket-ligand pair clusters highlight correspondences between pocket and ligand topological and physico-chemical properties and capture relevant information with respect to protein-ligand interactions. Based on these pocket-ligand correspondences, a protocol of prediction of clusters sharing similarity in terms of recognition characteristics is developed for a given pocket-ligand complex and gives high performances. It is then extended to cluster prediction for a given pocket in order to acquire knowledge about its expected ligand profile or to cluster prediction for a given ligand in order to acquire knowledge about its expected pocket profile. This prediction approach shows promising results and could contribute to predict some ligand properties critical for binding to a given pocket, and conversely

  7. Insights into an original pocket-ligand pair classification: a promising tool for ligand profile prediction.

    Science.gov (United States)

    Pérot, Stéphanie; Regad, Leslie; Reynès, Christelle; Spérandio, Olivier; Miteva, Maria A; Villoutreix, Bruno O; Camproux, Anne-Claude

    2013-01-01

    Pockets are today at the cornerstones of modern drug discovery projects and at the crossroad of several research fields, from structural biology to mathematical modeling. Being able to predict if a small molecule could bind to one or more protein targets or if a protein could bind to some given ligands is very useful for drug discovery endeavors, anticipation of binding to off- and anti-targets. To date, several studies explore such questions from chemogenomic approach to reverse docking methods. Most of these studies have been performed either from the viewpoint of ligands or targets. However it seems valuable to use information from both ligands and target binding pockets. Hence, we present a multivariate approach relating ligand properties with protein pocket properties from the analysis of known ligand-protein interactions. We explored and optimized the pocket-ligand pair space by combining pocket and ligand descriptors using Principal Component Analysis and developed a classification engine on this paired space, revealing five main clusters of pocket-ligand pairs sharing specific and similar structural or physico-chemical properties. These pocket-ligand pair clusters highlight correspondences between pocket and ligand topological and physico-chemical properties and capture relevant information with respect to protein-ligand interactions. Based on these pocket-ligand correspondences, a protocol of prediction of clusters sharing similarity in terms of recognition characteristics is developed for a given pocket-ligand complex and gives high performances. It is then extended to cluster prediction for a given pocket in order to acquire knowledge about its expected ligand profile or to cluster prediction for a given ligand in order to acquire knowledge about its expected pocket profile. This prediction approach shows promising results and could contribute to predict some ligand properties critical for binding to a given pocket, and conversely, some key pocket
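
    The descriptor-pairing idea can be sketched as: concatenate pocket and ligand descriptors, project with PCA, and assign new pairs to the nearest cluster centroid. The data, the two-cluster count and the naive k-means stand-in below are all assumptions for illustration (the paper reports five clusters and a more elaborate classification engine):

```python
import numpy as np

# Invented descriptors: 60 pocket-ligand pairs, each 4 pocket + 4 ligand values,
# with two artificial groups standing in for the paper's clusters.
rng = np.random.default_rng(1)
pairs = rng.normal(size=(60, 8))
pairs[:30] += 3.0

# PCA by SVD on centered data, keeping two components
mean = pairs.mean(axis=0)
centered = pairs - mean
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
scores = centered @ Vt[:2].T

# Naive 2-means in PC space, a crude stand-in for the classification engine
centroids = scores[[0, -1]]
for _ in range(10):
    labels = np.argmin(((scores[:, None] - centroids[None]) ** 2).sum(-1), axis=1)
    centroids = np.array([scores[labels == k].mean(axis=0) for k in range(2)])

def predict_cluster(pair_vec):
    """Assign a new pocket-ligand descriptor pair to the nearest cluster."""
    z = (pair_vec - mean) @ Vt[:2].T
    return int(np.argmin(((z - centroids) ** 2).sum(-1)))

print(predict_cluster(pairs[0]), predict_cluster(pairs[-1]))
```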

  8. THE ROLE OF SELF-EFFICACY IN PREDICTING USE OF DISTANCE EDUCATION TOOLS AND LEARNING MANAGEMENT SYSTEMS

    Directory of Open Access Journals (Sweden)

    Ibrahim ARPACI

    2017-01-01

    Full Text Available This study aims to investigate the role of self-efficacy in predicting students’ use of distance education tools and learning management systems (LMSs. A total of 124 undergraduate students who enrolled in a course on Distance Education and selected using convenience sampling willingly participated in the study. The participants had little prior knowledge about distance education tools and LMSs. Therefore, they received instructions from the researcher over the course of a semester. The study proposed a research model based on the Technology Acceptance Model that has been widely used to predict user acceptance and use. Structural equation modelling was used to test the research model against the data collected through questionnaire surveys. Pretest-posttest results suggested that the students had significant learning by participating in the instruction. The results of the main analysis also suggested that self-efficacy positively affects perceived ease of use, while usefulness and ease of use perceptions positively affect attitudes toward using distance education tools and systems. Implications are provided along with limitations of the study discussed.
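
    The structural model can be approximated, very roughly, by two regression paths: self-efficacy predicting perceived ease of use, then ease of use and usefulness predicting attitude. The OLS sketch below is only a stand-in for full structural equation modelling, with synthetic data and invented path strengths:

```python
import numpy as np

# Synthetic data mimicking the study's sample size; path coefficients invented.
rng = np.random.default_rng(2)
n = 124
self_eff = rng.normal(size=n)
peou = 0.6 * self_eff + rng.normal(scale=0.5, size=n)          # ease of use
pu = rng.normal(size=n)                                        # usefulness
attitude = 0.5 * peou + 0.4 * pu + rng.normal(scale=0.5, size=n)

def ols(y, *xs):
    """Return OLS slope coefficients (intercept dropped)."""
    X = np.column_stack([np.ones(len(y)), *xs])
    return np.linalg.lstsq(X, y, rcond=None)[0][1:]

path_se_peou = ols(peou, self_eff)[0]
path_peou_att, path_pu_att = ols(attitude, peou, pu)
print(round(path_se_peou, 2), round(path_peou_att, 2), round(path_pu_att, 2))
```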

  9. Satellite-based hybrid drought monitoring tool for prediction of vegetation condition in Eastern Africa: A case study for Ethiopia

    Science.gov (United States)

    Tadesse, Tsegaye; Demisse, Getachew Berhan; Zaitchik, Ben; Dinku, Tufa

    2014-03-01

    An experimental drought monitoring tool has been developed that predicts the vegetation condition (Vegetation Outlook) using a regression-tree technique at a monthly time step during the growing season in Eastern Africa. This prediction tool (VegOut-Ethiopia) is demonstrated for Ethiopia as a case study. VegOut-Ethiopia predicts the standardized values of the Normalized Difference Vegetation Index (NDVI) at multiple time steps (weeks to months into the future) based on analysis of "historical patterns" of satellite, climate, and oceanic data over historical records. The model underlying VegOut-Ethiopia capitalizes on historical climate-vegetation interactions and ocean-climate teleconnections (such as El Niño and the Southern Oscillation (ENSO)) expressed over the 24 year data record and also considers several environmental characteristics (e.g., land cover and elevation) that influence vegetation's response to weather conditions to produce 8 km maps that depict future general vegetation conditions. VegOut-Ethiopia could provide vegetation monitoring capabilities at local, national, and regional levels that can complement more traditional remote sensing-based approaches that monitor "current" vegetation conditions. The preliminary results of this case study showed that the models were able to predict the vegetation stress (both spatial extent and severity) in drought years 1-3 months ahead during the growing season in Ethiopia. The correlation coefficients between the predicted and satellite-observed vegetation condition range from 0.50 to 0.90. Based on the lessons learned from past research activities and emerging experimental forecast models, future studies are recommended that could help Eastern Africa in advancing knowledge of climate, remote sensing, hydrology, and water resources.
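
    A regression-tree model of this kind reduces, at each node, to finding the split that minimises squared error. The toy below finds a single best split of standardized NDVI on one invented ENSO-like predictor; the threshold, step sizes and noise are assumptions for illustration:

```python
import numpy as np

# Invented data: vegetation stress (low NDVI z-score) when an ENSO-like
# sea-surface-temperature anomaly exceeds roughly +0.5.
rng = np.random.default_rng(3)
sst_anomaly = rng.uniform(-2, 2, 200)
ndvi_z = np.where(sst_anomaly > 0.5, -0.8, 0.4) + rng.normal(0, 0.1, 200)

def best_split(x, y):
    """Exhaustively pick the split point minimising total squared error."""
    order = np.argsort(x)
    x, y = x[order], y[order]
    best = (None, np.inf)
    for i in range(1, len(x)):
        left, right = y[:i], y[i:]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best[1]:
            best = ((x[i - 1] + x[i]) / 2, sse)
    return best[0]

threshold = best_split(sst_anomaly, ndvi_z)
print(f"split at SST anomaly of about {threshold:.2f}")
```

    A full regression tree repeats this split recursively and, as in VegOut-Ethiopia, would use many predictors (climate, oceanic indices, land cover, elevation) rather than one.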

  10. A Probabilistic Approach for Reliability and Life Prediction of Electronics in Drilling and Evaluation Tools

    Science.gov (United States)

    2014-12-23

    … measurement while drilling (MWD) and logging while drilling (LWD). The OnTrak tool takes measurements such as resistivity, gamma ray, pressure and vibration. Abbreviations: LVPS = low voltage power supply; LWD = logging while drilling; MaPS = maintenance and performance system; MLE = maximum likelihood estimation.

  11. Development of Prediction Tool for Sound Absorption and Sound Insulation for Sound Proof Properties

    OpenAIRE

    Yoshio Kurosawa; Takao Yamaguchi

    2015-01-01

    High frequency automotive interior noise above 500 Hz considerably affects automotive passenger comfort. To reduce this noise, sound insulation material is often laminated on body panels or interior trim panels. For a more effective noise reduction, the sound reduction properties of this laminated structure need to be estimated. We have developed a new calculation tool that can roughly calculate the sound absorption and insulation properties of laminate structure and handy ...

  12. SimUVEx v2 : a numeric tool to predict anatomical solar ultraviolet exposure

    OpenAIRE

    Religi, Arianna; Moccozet, Laurent; Farahmand, Meghdad; Vuilleumier, Laurent; Vernez, David; Milon, Antoine; Backes, Claudine; Bulliard, Jean-Luc

    2016-01-01

    Solar ultraviolet (UV) radiation has a dual effect on human health: low UV doses promote the photosynthesis of vitamin D and regulate calcium and phosphorus metabolism, while an excessive UV exposure is the main cause of skin cancer, along with eye diseases and premature skin ageing. The link between UV radiation levels and UV exposure is not fully understood since exposure data are limited and individual anatomical variations in UV doses are significant. SimUVEx is a numeric simulation tool ...

  13. The equivalent pore aspect ratio as a tool for pore type prediction in carbonate reservoirs

    OpenAIRE

    Fournier, François; Pellerin, Matthieu; Villeneuve, Quentin; Teillet, Thomas; Hong, Fei; Poli, Emmanuelle; Borgomano, Jean; Léonide, Philippe; Hairabian, Alex

    2018-01-01

    International audience; The equivalent pore aspect ratios (EPAR) provide a tool to detect pore types by combining P-and S-wave velocities, porosity, bulk density and mineralogical composition of carbonate rocks. The integration of laboratory measurements, well log data and petrographic analysis of 468 carbonate samples from various depositional and diagenetic settings (Lower Cretaceous pre-salt non-marine carbonates from offshore Brazil, Lower Cretaceous shallow-water platform carbonates from...

  14. PAH plant uptake prediction: Evaluation of combined availability tools and modeling approach

    OpenAIRE

    Ouvrard, Stéphanie; DUPUY, Joan; Leglize, Pierre; Sterckeman, Thibault

    2015-01-01

    Transfer to plant is one of the main human exposure pathways of polycyclic aromatic hydrocarbons (PAH) from contaminated soils. However existing models implemented in risk assessment tools mostly rely on i) total contaminant concentration and ii) plant uptake models based on hydroponics experiments established with pesticides (Briggs et al., 1982, 1983). Total concentrations of soil contaminants are useful to indicate pollution, however they do not necessarily indicate risk. Me...

  15. Does diagnosis affect the predictive accuracy of risk assessment tools for juvenile offenders: Conduct Disorder and Attention Deficit Hyperactivity Disorder.

    Science.gov (United States)

    Khanna, Dinesh; Shaw, Jenny; Dolan, Mairead; Lennox, Charlotte

    2014-10-01

    Studies have suggested an increased risk of criminality in juveniles if they suffer from co-morbid Attention Deficit Hyperactivity Disorder (ADHD) along with Conduct Disorder. The Structured Assessment of Violence Risk in Youth (SAVRY), the Psychopathy Checklist Youth Version (PCL:YV), and Youth Level of Service/Case Management Inventory (YLS/CMI) have been shown to be good predictors of violent and non-violent re-offending. The aim was to compare the accuracy of these tools to predict violent and non-violent re-offending in young people with co-morbid ADHD and Conduct Disorder and Conduct Disorder only. The sample included 109 White-British adolescent males in secure settings. Results revealed no significant differences between the groups for re-offending. SAVRY factors had better predictive values than PCL:YV or YLS/CMI. Tools generally had better predictive values for the Conduct Disorder only group than the co-morbid group. Possible reasons for these findings have been discussed along with limitations of the study. Copyright © 2014 The Foundation for Professionals in Services for Adolescents. Published by Elsevier Ltd. All rights reserved.
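
    Predictive accuracy of such tools is typically compared with the area under the ROC curve (AUC). A minimal sketch, with invented scores and outcomes rather than the study's data:

```python
# AUC via the Mann-Whitney formulation: the probability that a random
# reoffender outscores a random non-reoffender, counting ties as 0.5.

def auc(scores, labels):
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 1, 0, 0, 0, 0, 1]        # 1 = re-offended (invented)
tool_a = [8, 7, 6, 3, 2, 5, 1, 9]        # hypothetical tool-A totals
tool_b = [5, 4, 6, 5, 2, 6, 3, 4]        # hypothetical tool-B totals

print(auc(tool_a, labels), auc(tool_b, labels))  # → 1.0 0.625
```

    Comparing AUCs per subgroup (co-morbid versus Conduct Disorder only) is the kind of contrast the study reports.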

  16. Is The Ca + K + Mg/Al Ratio in the Soil Solution a Predictive Tool for Estimating Forest Damage?

    International Nuclear Information System (INIS)

    Goeransson, A.; Eldhuset, T. D.

    2001-01-01

    The ratio between (Ca + K + Mg) and Al in nutrient solution has been suggested as a predictive tool for estimating tree growth disturbance. However, the ratio is unspecific in the sense that it is based on several elements which are all essential for plant growth; each of these may be growth-limiting. Furthermore, aluminium retards growth at higher concentrations. It is therefore difficult to give causal and objective biological explanations for possible growth disturbances. The importance of the proportion of base cations to N, at a fixed base-cation/Al ratio, is evaluated with regard to growth of Picea abies. The uptake of elements was found to be selective; nutrients were taken up while most Al remained in solution. Biomass partitioning to the roots increased after aluminium addition with low proportions of base cations to nitrogen. We conclude that the low growth rates depend on nutrient limitation in these treatments. Low growth rates in the high proportion experiments may be explained by high internal Al concentrations. The results strongly suggest that growth rate is not correlated with the ratio in the rooting medium and question the validity of using ratios as predictive tools for estimating forest damage. We suggest that growth limitation of Picea abies in the field may depend on low proportions of base cations to nitrate. It is therefore important to know the nutritional status of the plant material in relation to the growth potential and environmental limitation to be able to predict and estimate forest damage

  17. The predictive and external validity of the STarT Back Tool in Danish primary care

    DEFF Research Database (Denmark)

    Morsø, Lars; Kent, Peter; Albert, Hanne B

    2013-01-01

    distinguished between low- and medium-risk subgroups with a similar predictive ability of the UK SBT. That distinction is useful information for informing patients about their expected prognosis and may help guiding clinicians' choice of treatment. However, cross-cultural differences in the SBT psychosocial...

  18. Efficacy of specific gravity as a tool for prediction of biodiesel-petroleum diesel blend ratio

    Science.gov (United States)

    Prediction of volumetric biodiesel/petrodiesel blend ratio (VBD) from specific gravity (SG) data was the subject of the current investigation. Fatty acid methyl esters obtained from soybean, palm, and rapeseed oils along with chicken fat (SME-1, SME-2, PME, RME, and CFME) were blended (0 to 20 volum...
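
    If specific gravity is assumed to mix linearly with volume fraction, the blend ratio follows from a measured SG by simple inversion. The SG values below are typical order-of-magnitude figures, not the study's measurements:

```python
# Linear blending assumption (illustrative, not the paper's calibration):
#   SG_blend = v * SG_biodiesel + (1 - v) * SG_petrodiesel
# so the volumetric biodiesel fraction v can be back-calculated.

def blend_ratio(sg_blend, sg_bio=0.885, sg_petro=0.850):
    """Return the volumetric biodiesel fraction implied by a measured blend SG."""
    return (sg_blend - sg_petro) / (sg_bio - sg_petro)

sg_measured = 0.857  # hypothetical measurement of a B20-like blend
v = blend_ratio(sg_measured)
print(f"estimated biodiesel fraction: {v:.0%}")  # → 20%
```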

  19. LoopIng: a template-based tool for predicting the structure of protein loops.

    KAUST Repository

    Messih, Mario Abdel

    2015-08-06

    Predicting the structure of protein loops is very challenging, mainly because they are not necessarily subject to strong evolutionary pressure. This implies that, unlike the rest of the protein, standard homology modeling techniques are not very effective in modeling their structure. However, loops are often involved in protein function, hence inferring their structure is important for predicting protein structure as well as function. We describe a method, LoopIng, based on the Random Forest automated learning technique, which, given a target loop, selects a structural template for it from a database of loop candidates. Compared to the most recently available methods, LoopIng is able to achieve similar accuracy for short loops (4-10 residues) and significant enhancements for long loops (11-20 residues). The quality of the predictions is robust to errors that unavoidably affect the stem regions when these are modeled. The method returns a confidence score for the predicted template loops and has the advantage of being very fast (on average: 1 min/loop). Availability: www.biocomputing.it/looping. Contact: anna.tramontano@uniroma1.it. Supplementary data are available at Bioinformatics online.

  20. BIPS: BIANA Interolog Prediction Server. A tool for protein-protein interaction inference.

    Science.gov (United States)

    Garcia-Garcia, Javier; Schleker, Sylvia; Klein-Seetharaman, Judith; Oliva, Baldo

    2012-07-01

    Protein-protein interactions (PPIs) play a crucial role in biology, and high-throughput experiments have greatly increased the coverage of known interactions. Still, identification of complete inter- and intraspecies interactomes is far from being complete. Experimental data can be complemented by the prediction of PPIs within an organism or between two organisms based on the known interactions of the orthologous genes of other organisms (interologs). Here, we present the BIANA (Biologic Interactions and Network Analysis) Interolog Prediction Server (BIPS), which offers a web-based interface to facilitate PPI predictions based on interolog information. BIPS benefits from the capabilities of the framework BIANA to integrate the several PPI-related databases. Additional metadata can be used to improve the reliability of the predicted interactions. Sensitivity and specificity of the server have been calculated using known PPIs from different interactomes using a leave-one-out approach. The specificity is between 72 and 98%, whereas sensitivity varies between 1 and 59%, depending on the sequence identity cut-off used to calculate similarities between sequences. BIPS is freely accessible at http://sbi.imim.es/BIPS.php.
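
    The interolog transfer rule itself is simple: a known interaction A-B in a source organism is mapped through orthology to a candidate A'-B' interaction in the target organism. A toy sketch with invented identifiers (not BIPS's actual data model):

```python
# Hypothetical source-organism interactions and source-to-target ortholog map.
known_ppis = {("yA", "yB"), ("yC", "yD")}
orthologs = {"yA": "hA", "yB": "hB", "yC": "hC"}   # yD has no ortholog

def predict_interologs(ppis, ortho):
    """Transfer each PPI whose two partners both have target-organism orthologs."""
    return {(ortho[a], ortho[b]) for a, b in ppis if a in ortho and b in ortho}

print(predict_interologs(known_ppis, orthologs))  # {('hA', 'hB')}
```

    BIPS additionally filters such candidates by sequence-identity cut-offs and metadata, which is what drives the sensitivity/specificity trade-off the abstract reports.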

  1. Discrete event simulation as an ergonomic tool to predict workload exposures during systems design

    NARCIS (Netherlands)

    Perez, J.; Looze, M.P. de; Bosch, T.; Neumann, W.P.

    2014-01-01

    This methodological paper presents a novel approach to predict operator's mechanical exposure and fatigue accumulation in discrete event simulations. A biomechanical model of work-cycle loading is combined with a discrete event simulation model which provides work cycle patterns over the shift
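
    The combination can be sketched as a discrete event queue whose work-cycle events add mechanical load while idle time between events allows recovery. All load and timing constants below are invented for illustration:

```python
import heapq

# Event queue of (time_s, load_units): one work cycle every 30 s adds load.
events = []
for i in range(8):
    heapq.heappush(events, (30.0 * (i + 1), 5.0))

fatigue, last_t, recovery_per_s = 0.0, 0.0, 0.1
while events:
    t, load = heapq.heappop(events)
    # Linear recovery during idle time, floored at zero, then the cycle's load.
    fatigue = max(0.0, fatigue - recovery_per_s * (t - last_t)) + load
    last_t = t

print(f"accumulated load after shift segment: {fatigue:.1f}")
```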

  2. Dialysis Malnutrition and Malnutrition Inflammation Scores: screening tools for prediction of dialysis-related protein-energy wasting in Malaysia.

    Science.gov (United States)

    Harvinder, Gilcharan Singh; Swee, Winnie Chee Siew; Karupaiah, Tilakavati; Sahathevan, Sharmela; Chinna, Karuthan; Ahmad, Ghazali; Bavanandan, Sunita; Goh, Bak Leong

    2016-01-01

    Malnutrition is highly prevalent in Malaysian dialysis patients and there is a need for a valid screening tool for early identification and management. This cross-sectional study aims to examine the sensitivity of the Dialysis Malnutrition Score (DMS) and Malnutrition Inflammation Score (MIS) tools in predicting protein-energy wasting (PEW) among Malaysian dialysis patients. A total of 155 haemodialysis (HD) and 90 peritoneal dialysis (PD) patients were screened for risk of malnutrition using DMS and MIS and comparisons were made with established guidelines by the International Society of Renal Nutrition and Metabolism (ISRNM) for PEW. An MIS cut-off score of >=5 indicated presence of malnutrition in all patients. A total of 59% of HD and 83% of PD patients had PEW by ISRNM criteria. Based on DMS, 73% of HD and 71% of PD patients exhibited moderate malnutrition, whilst using MIS, 88% and 90%, respectively, were malnourished. DMS and MIS correlated significantly in HD (r2 = 0.552, p ...); cut-off scores for malnutrition classification (score >=5) were established for use amongst Malaysian dialysis patients. Both DMS and MIS are valid tools to be used for nutrition screening of dialysis patients, especially those undergoing peritoneal dialysis. The DMS may be a more practical and simpler tool to be utilized in the Malaysian dialysis settings as it does not require laboratory markers.
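
    Validating a screening cut-off against a reference diagnosis reduces to a sensitivity/specificity computation. A sketch with an assumed MIS cut-off of 5 and invented patient data:

```python
# Sensitivity/specificity of a score cut-off against a reference diagnosis
# (here standing in for the ISRNM PEW criteria); all patient data invented.

def sens_spec(scores, reference, cutoff=5):
    tp = sum(s >= cutoff and r for s, r in zip(scores, reference))
    fn = sum(s < cutoff and r for s, r in zip(scores, reference))
    tn = sum(s < cutoff and not r for s, r in zip(scores, reference))
    fp = sum(s >= cutoff and not r for s, r in zip(scores, reference))
    return tp / (tp + fn), tn / (tn + fp)

mis = [7, 6, 2, 9, 4, 5, 1, 8]                                  # screening scores
pew = [True, True, False, True, False, True, False, False]      # reference PEW
sens, spec = sens_spec(mis, pew)
print(f"sensitivity={sens:.2f} specificity={spec:.2f}")
```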

  3. Flood Foresight: A near-real time flood monitoring and forecasting tool for rapid and predictive flood impact assessment

    Science.gov (United States)

    Revilla-Romero, Beatriz; Shelton, Kay; Wood, Elizabeth; Berry, Robert; Bevington, John; Hankin, Barry; Lewis, Gavin; Gubbin, Andrew; Griffiths, Samuel; Barnard, Paul; Pinnell, Marc; Huyck, Charles

    2017-04-01

    The hours and days immediately after a major flood event are often chaotic and confusing, with first responders rushing to mobilise emergency responders, provide alleviation assistance and assess loss to assets of interest (e.g., population, buildings or utilities). Preparations in advance of a forthcoming event are becoming increasingly important; early warning systems have been demonstrated to be useful tools for decision makers. The extent of damage, human casualties and economic loss estimates can vary greatly during an event, and the timely availability of an accurate flood extent allows emergency response and resources to be optimised, reduces impacts, and helps prioritise recovery. In the insurance sector, for example, insurers are under pressure to respond in a proactive manner to claims rather than waiting for policyholders to report losses. Even though there is a great demand for flood inundation extents and severity information in different sectors, generating flood footprints for large areas from hydraulic models in real time remains a challenge. While such footprints can be produced in real time using remote sensing, weather conditions and sensor availability limit their ability to capture every single flood event across the globe. In this session, we will present Flood Foresight (www.floodforesight.com), an operational tool developed to meet the universal requirement for rapid geographic information, before, during and after major riverine flood events. The tool provides spatial data with which users can measure their current or predicted impact from an event - at building, basin, national or continental scales. Within Flood Foresight, the Screening component uses global rainfall predictions to provide a regional- to continental-scale view of heavy rainfall events up to a week in advance, alerting the user to potentially hazardous situations relevant to them. The Forecasting component enhances the predictive suite of tools by providing a local

  4. Lipidome as a predictive tool in progression to type 2 diabetes in Finnish men

    DEFF Research Database (Denmark)

    Suvitaival, Tommi; Bondia-Pons, Isabel; Yetukuri, Laxman

    2018-01-01

    …are not helpful at distinguishing progressors from non-progressors. BACKGROUND: There is a need for early markers to track and predict the development of type 2 diabetes mellitus (T2DM) from the state of normal glucose tolerance through prediabetes. In this study we tested whether the plasma molecular lipidome has biomarker potential for predicting the onset of T2DM. METHODS: We applied global lipidomic profiling on plasma samples from well-phenotyped men (107 cases, 216 controls) participating in the longitudinal METSIM study at baseline and at five-year follow-up. To validate the lipid markers… RESULTS: A persistent lipid signature with higher levels of triacylglycerols and diacyl-phospholipids as well as lower levels of alkylacyl phosphatidylcholines was observed in progressors to T2DM. Lysophosphatidylcholine acyl C18:2 (LysoPC(18:2)), phosphatidylcholines PC(32:1), PC(34:2e) and PC(36

  5. Tools for Trustworthy Autonomy: Robust Predictions, Intuitive Control, and Optimized Interaction

    OpenAIRE

    Driggs Campbell, Katherine Rose

    2017-01-01

    In the near future, robotics will impact nearly every aspect of life. Yet for technology to smoothly integrate into society, we need interactive systems to be well modeled and predictable; have robust decision making and control; and be trustworthy to improve cooperation and interaction. To achieve these goals, we propose taking a human-centered approach to ease the transition into human-dominated fields. In this work, our modeling methods and control schemes are validated through user stu...

  6. A prediction tool for real-time application in the disruption protection system at JET

    International Nuclear Information System (INIS)

    Cannas, B.; Fanni, A.; Sonato, P.; Zedda, M.K.

    2007-01-01

    A disruption prediction system, based on neural networks, is presented in this paper. The system is ideally suitable for on-line application in the disruption avoidance and/or mitigation scheme at the JET tokamak. A multi-layer perceptron (MLP) predictor module has been trained on nine plasma diagnostic signals extracted from 86 disruptive pulses, selected from four years of JET experiments in the pulse range 47830-57346 (from 1999 to 2002). The disruption class of the disruptive pulses is available. In particular, the selected pulses belong to four classes (density limit/high radiated power, internal transport barrier, mode lock and h-mode/l-mode). A self-organizing map has been used to select the samples of the pulses to train the MLP predictor module and to determine its target, increasing the prediction capability of the system. The prediction performance has been tested over 86 disruptive and 102 non-disruptive pulses. The test has been performed presenting to the network all the samples of each pulse sampled every 20 ms. The missed alarm rate and the false alarm rate of the predictor, up to 100 ms prior to the disruption time, are 23% and 1%, respectively. Recent plasma configurations might present features different from those observed in the experiments used in the training set. This 'novelty' can lead to incorrect behaviour of the predictor. To improve the robustness and reliability of the system, a novelty detection module has been integrated in the prediction system, increasing the system performance and resulting in a missed alarm rate reduced to 7% and a false alarm rate reduced to 0%
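
    The reported alarm rates follow from a simple evaluation rule: a pulse counts as alarmed if any sample's predictor output crosses a threshold. A toy computation with invented outputs (not JET data):

```python
# Per-pulse sequences of network outputs sampled every 20 ms (all invented).
def alarmed(outputs, threshold=0.5):
    """A pulse raises an alarm if any sample's output exceeds the threshold."""
    return any(o > threshold for o in outputs)

disruptive = [[0.1, 0.4, 0.9], [0.2, 0.3, 0.8], [0.1, 0.2, 0.3]]
non_disruptive = [[0.1, 0.2], [0.1, 0.6], [0.0, 0.1], [0.2, 0.2]]

missed_rate = sum(not alarmed(p) for p in disruptive) / len(disruptive)
false_rate = sum(alarmed(p) for p in non_disruptive) / len(non_disruptive)
print(f"missed alarm rate {missed_rate:.0%}, false alarm rate {false_rate:.0%}")
```

    The paper's novelty-detection module then suppresses predictor outputs on plasma configurations unlike anything in the training set, which is what drives both rates down.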

  7. Monte Carlo simulation as a tool to predict blasting fragmentation based on the Kuz Ram model

    Science.gov (United States)

    Morin, Mario A.; Ficarazzo, Francesco

    2006-04-01

    Rock fragmentation is considered the most important aspect of production blasting because of its direct effects on the costs of drilling and blasting and on the economics of the subsequent operations of loading, hauling and crushing. Over the past three decades, significant progress has been made in the development of new technologies for blasting applications. These technologies include increasingly sophisticated computer models for blast design and blast performance prediction. Rock fragmentation depends on many variables such as rock mass properties, site geology, in situ fracturing and blasting parameters and as such has no complete theoretical solution for its prediction. However, empirical models for the estimation of size distribution of rock fragments have been developed. In this study, a blast fragmentation Monte Carlo-based simulator, based on the Kuz-Ram fragmentation model, has been developed to predict the entire fragmentation size distribution, taking into account intact and joints rock properties, the type and properties of explosives and the drilling pattern. Results produced by this simulator were quite favorable when compared with real fragmentation data obtained from a blast quarry. It is anticipated that the use of Monte Carlo simulation will increase our understanding of the effects of rock mass and explosive properties on the rock fragmentation by blasting, as well as increase our confidence in these empirical models. This understanding will translate into improvements in blasting operations, its corresponding costs and the overall economics of open pit mines and rock quarries.
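
    A Kuz-Ram style Monte Carlo can be sketched as: compute the mean fragment size from Kuznetsov's equation, then draw fragment sizes from a Rosin-Rammler distribution while sampling uncertain inputs. The rock factor, charge and uniformity values below are illustrative assumptions, not data from the study:

```python
import numpy as np

rng = np.random.default_rng(4)

def kuznetsov_x50(A, V0, Q, E=100.0):
    """Mean fragment size (cm): A rock factor, V0 rock volume per hole (m^3),
    Q explosive mass per hole (kg), E relative weight strength (ANFO = 100)."""
    return A * (V0 / Q) ** 0.8 * Q ** (1 / 6) * (115.0 / E) ** (19 / 30)

def sample_sizes(x50, n_uniformity, n_samples, rng):
    """Draw fragment sizes from the Rosin-Rammler CDF by inverse transform."""
    u = rng.uniform(size=n_samples)
    return x50 * (-np.log(1 - u) / 0.693) ** (1 / n_uniformity)

# Monte Carlo over uncertain rock factor and charge per hole (invented spreads).
x50s = kuznetsov_x50(A=rng.normal(7, 0.5, 2000), V0=100.0, Q=rng.normal(50, 2, 2000))
sizes = sample_sizes(x50s.mean(), n_uniformity=1.5, n_samples=5000, rng=rng)
print(f"median fragment size is about {np.median(sizes):.1f} cm")
```

    Repeating the whole chain per Monte Carlo draw (rather than averaging first, as this sketch does) would also propagate input uncertainty into the size distribution itself.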

  8. Prediction of retention time in reversed-phase liquid chromatography as a tool for steroid identification

    International Nuclear Information System (INIS)

    Randazzo, Giuseppe Marco; Tonoli, David; Hambye, Stephanie; Guillarme, Davy; Jeanneret, Fabienne; Nurisso, Alessandra; Goracci, Laura; Boccard, Julien; Rudaz, Serge

    2016-01-01

    The untargeted profiling of steroids constitutes a growing research field because of their importance as biomarkers of endocrine disruption. New technologies in analytical chemistry, such as ultra high-pressure liquid chromatography coupled with mass spectrometry (MS), offer the possibility of a fast and sensitive analysis. Nevertheless, difficulties regarding steroid identification are encountered when considering isotopomeric steroids. Thus, the use of retention times is of great help for the unambiguous identification of steroids. In this context, starting from the linear solvent strength (LSS) theory, quantitative structure retention relationship (QSRR) models, based on a dataset composed of 91 endogenous steroids and VolSurf+ descriptors combined with a new dedicated molecular fingerprint, were developed to predict retention times of steroid structures in any gradient mode conditions. Satisfactory performance was obtained during nested cross-validation with a predictive ability (Q²) of 0.92. The generalisation ability of the model was further confirmed by an average error of 4.4% in external prediction. This allowed the list of candidates associated with identical monoisotopic masses to be strongly reduced, facilitating definitive steroid identification. - Highlights: • Difficulties regarding steroid identification are encountered when considering isotopomeric steroids. • Quantitative structure retention relationship (QSRR) models were developed from the linear solvent strength theory. • A dataset composed of 91 steroids and VolSurf+ descriptors combined with a new dedicated molecular fingerprint, were used. • The list of candidates associated with identical monoisotopic masses was reduced, facilitating steroid identification.

  9. Prediction of retention time in reversed-phase liquid chromatography as a tool for steroid identification

    Energy Technology Data Exchange (ETDEWEB)

    Randazzo, Giuseppe Marco [School of Pharmaceutical Sciences, University of Geneva and University of Lausanne, Geneva (Switzerland); Tonoli, David [School of Pharmaceutical Sciences, University of Geneva and University of Lausanne, Geneva (Switzerland); Swiss Centre for Applied Human Toxicology (SCAHT), Universities of Basel and Geneva, Basel (Switzerland); Human Protein Sciences Department, University of Geneva, Geneva (Switzerland); Hambye, Stephanie; Guillarme, Davy [School of Pharmaceutical Sciences, University of Geneva and University of Lausanne, Geneva (Switzerland); Jeanneret, Fabienne [School of Pharmaceutical Sciences, University of Geneva and University of Lausanne, Geneva (Switzerland); Swiss Centre for Applied Human Toxicology (SCAHT), Universities of Basel and Geneva, Basel (Switzerland); Human Protein Sciences Department, University of Geneva, Geneva (Switzerland); Nurisso, Alessandra [School of Pharmaceutical Sciences, University of Geneva and University of Lausanne, Geneva (Switzerland); Goracci, Laura [Department of Chemistry, Biology and Biotechnology, University of Perugia, Perugia (Italy); Boccard, Julien [School of Pharmaceutical Sciences, University of Geneva and University of Lausanne, Geneva (Switzerland); Rudaz, Serge, E-mail: serge.rudaz@unige.ch [School of Pharmaceutical Sciences, University of Geneva and University of Lausanne, Geneva (Switzerland); Swiss Centre for Applied Human Toxicology (SCAHT), Universities of Basel and Geneva, Basel (Switzerland)

    2016-04-15

    The untargeted profiling of steroids constitutes a growing research field because of their importance as biomarkers of endocrine disruption. New technologies in analytical chemistry, such as ultra-high-pressure liquid chromatography coupled with mass spectrometry (MS), offer the possibility of fast and sensitive analysis. Nevertheless, difficulties regarding steroid identification are encountered when considering isotopomeric steroids. Thus, the use of retention times is of great help for the unambiguous identification of steroids. In this context, starting from the linear solvent strength (LSS) theory, quantitative structure-retention relationship (QSRR) models, based on a dataset composed of 91 endogenous steroids and VolSurf+ descriptors combined with a new dedicated molecular fingerprint, were developed to predict retention times of steroid structures under any gradient conditions. Satisfactory performance was obtained during nested cross-validation, with a predictive ability (Q²) of 0.92. The generalisation ability of the model was further confirmed by an average error of 4.4% in external prediction. This allowed the list of candidates associated with identical monoisotopic masses to be strongly reduced, facilitating definitive steroid identification. - Highlights: • Difficulties regarding steroid identification are encountered when considering isotopomeric steroids. • Quantitative structure-retention relationship (QSRR) models were developed from the linear solvent strength theory. • A dataset composed of 91 steroids and VolSurf+ descriptors combined with a new dedicated molecular fingerprint was used. • The list of candidates associated with identical monoisotopic masses was reduced, facilitating steroid identification.
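
    At its core, the QSRR approach above regresses retention time on molecular descriptors. The sketch below illustrates this with invented data: the descriptor matrix, weights, and Q² check are synthetic stand-ins, not the published VolSurf+ model.

```python
import numpy as np

# Minimal sketch of the QSRR idea: regress retention time on molecular
# descriptors. Descriptor values and weights below are invented stand-ins
# for the VolSurf+ descriptors used in the study.
rng = np.random.default_rng(0)
X = rng.normal(size=(91, 4))              # 91 "steroids" x 4 descriptors
true_w = np.array([2.0, -1.0, 0.5, 3.0])  # hypothetical structure-retention weights
t_r = X @ true_w + 10.0                   # synthetic retention times (min)

# Ordinary least squares with an intercept column.
A = np.hstack([X, np.ones((91, 1))])
w, *_ = np.linalg.lstsq(A, t_r, rcond=None)

pred = A @ w
q2 = 1.0 - np.sum((t_r - pred) ** 2) / np.sum((t_r - t_r.mean()) ** 2)
print(round(q2, 2))  # noiseless synthetic data, so the fit is essentially exact
```

In the study itself Q² was estimated by nested cross-validation rather than on the training data as here.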

  10. Energetic utilisation of biomass in Hungary

    International Nuclear Information System (INIS)

    Barotfi, I.

    1994-01-01

    Energetic utilisation of biomass has been known since prehistoric times and was only pushed into the background by the technological developments of the last century. The energy crisis and, more recently, environmental problems have now brought it back to the fore, and efforts are being made worldwide to find modern technical applications for biomass and contribute to its advance. (orig.) [de

  11. Missed Opportunities: Emergency Contraception Utilisation by ...

    African Journals Online (AJOL)

    Although contraceptives, including emergency contraceptives, are widely available free at public health facilities in South Africa, rates of teenage and unintended pregnancy are high. This paper analyses awareness and utilisation of emergency contraception amongst 193 young women (aged 15-24 years) attending public ...

  12. Application of Soft Computing Tools for Wave Prediction at Specific Locations in the Arabian Sea Using Moored Buoy Observations

    Directory of Open Access Journals (Sweden)

    J. Vimala

    2012-12-01

    Full Text Available The knowledge of design and operational values of significant wave heights is perhaps the single most important input needed in ocean engineering studies. Conventionally, such information is obtained using classical statistical analysis and stochastic methods. As the causative variables are innumerable and the underlying physics is too complicated, the results obtained from numerical models may not always be very satisfactory. Soft computing tools like Artificial Neural Networks (ANN) and Adaptive Network-based Fuzzy Inference Systems (ANFIS) may therefore be useful to predict significant wave heights in some situations. The study is aimed at forecasting significant wave height values in real time over a period of 24 hrs at certain locations in Indian seas using ANN and ANFIS models. The data for the work were collected by the National Institute of Ocean Technology, Chennai. It was found that the predictions of wave heights can be made by both methods with equal efficiency and satisfaction.
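
    The ANN side of the approach can be illustrated with a minimal one-hidden-layer network trained by gradient descent. Everything here is a synthetic stand-in: the "past Hs readings", architecture, and learning rate are invented, not the buoy data or network used in the study.

```python
import numpy as np

# One-hidden-layer network trained by plain gradient descent to map three
# recent significant wave height (Hs) readings to the next value.
rng = np.random.default_rng(1)
X = rng.uniform(0.5, 3.0, size=(200, 3))   # three past Hs readings (m)
y = X @ np.array([0.2, 0.3, 0.5])          # synthetic "next Hs" target

W1 = rng.normal(scale=0.1, size=(3, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.1, size=8);      b2 = 0.0

def forward(X):
    h = np.tanh(X @ W1 + b1)               # hidden layer
    return h, h @ W2 + b2                  # linear output

_, pred0 = forward(X)
loss0 = np.mean((pred0 - y) ** 2)          # MSE before training

lr = 0.01
for _ in range(2000):
    h, pred = forward(X)
    err = pred - y
    gW2 = h.T @ err / len(X); gb2 = err.mean()
    dh = np.outer(err, W2) * (1.0 - h ** 2)    # backprop through tanh
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred = forward(X)
loss = np.mean((pred - y) ** 2)
print(loss0, loss)  # training error drops from its initial value
```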

  13. Environmental assessment of incinerator residue utilisation

    Energy Technology Data Exchange (ETDEWEB)

    Toller, Susanna

    2008-10-15

    In Sweden, utilisation of incinerator residues outside disposal areas is restricted by environmental concerns, as such residues commonly contain greater amounts of potentially toxic trace elements than the natural materials they replace. On the other hand, utilisation can also provide environmental benefits by decreasing the need for landfill and reducing raw material extraction. This thesis provides increased knowledge and proposes better approaches for environmental assessment of incinerator residue utilisation, particularly bottom ash from municipal solid waste incineration (MSWI). A life cycle assessment (LCA) based approach was outlined for environmental assessment of incinerator residue utilisation, in which leaching of trace elements as well as other emissions to air and water and the use of resources were regarded as constituting the potential environmental impact from the system studied. Case studies were performed for i) road construction with or without MSWI bottom ash, ii) three management scenarios for MSWI bottom ash and iii) three management scenarios for wood ash. Different types of potential environmental impact predominated in the activities of the system and the scenarios differed in use of resources and energy. Utilising MSWI bottom ash in road construction and recycling of wood ash on forest land saved more natural resources and energy than when these materials were managed according to the other scenarios investigated, including dumping in landfill. There is a potential for trace element leaching regardless of how the ash is managed. Trace element leaching, particularly of copper (Cu), was identified as being relatively important for environmental assessment of MSWI bottom ash utilisation. CuO is suggested as the most important type of Cu-containing mineral in weathered MSWI bottom ash, whereas in the leachate Cu is mainly present in complexes with dissolved organic matter (DOM). 
The hydrophilic components of the DOM were more important for Cu

  14. CCTop: An Intuitive, Flexible and Reliable CRISPR/Cas9 Target Prediction Tool.

    Directory of Open Access Journals (Sweden)

    Manuel Stemmer

    Full Text Available Engineering of the CRISPR/Cas9 system has opened a plethora of new opportunities for site-directed mutagenesis and targeted genome modification. Fundamental to this is a stretch of twenty nucleotides at the 5' end of a guide RNA that provides specificity to the bound Cas9 endonuclease. Since a sequence of twenty nucleotides can occur multiple times in a given genome and some mismatches seem to be accepted by the CRISPR/Cas9 complex, an efficient and reliable in silico selection and evaluation of the targeting site is a key prerequisite for experimental success. Here we present the CRISPR/Cas9 target online predictor (CCTop, http://crispr.cos.uni-heidelberg.de) to overcome limitations of already available tools. CCTop provides an intuitive user interface with reasonable default parameters that can easily be tuned by the user. From a given query sequence, CCTop identifies and ranks all candidate sgRNA target sites according to their off-target quality and displays full documentation. CCTop was experimentally validated for gene inactivation, non-homologous end-joining as well as homology-directed repair. Thus, CCTop provides the bench biologist with a tool for the rapid and efficient identification of high-quality target sites.
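
    The core search such a tool automates can be sketched as enumerating 20-nt protospacers followed by an NGG PAM and flagging repeated genome occurrences as potential off-targets. The sequences below are invented for illustration; CCTop's actual scoring is more elaborate.

```python
import re

# Toy candidate-site search: 20-nt protospacer followed by an NGG PAM,
# ranked by how often the protospacer occurs in the "genome" (a crude
# off-target flag). Sequences are invented for illustration.
genome = "ATGCGTACCGGTTAGCTAGCTAGGCGTACCGGTTAGCTAGCTAGGACGT" * 2
query = genome[:30]  # region pasted by the user

def candidate_sites(seq):
    # lookahead keeps overlapping matches; group(1) is the protospacer
    return [(m.start(), m.group(1))
            for m in re.finditer(r"(?=([ACGT]{20})[ACGT]GG)", seq)]

sites = candidate_sites(query)
# rank candidates: fewer genome-wide occurrences = fewer likely off-targets
ranked = sorted(sites, key=lambda s: genome.count(s[1]))
for pos, proto in ranked:
    print(pos, proto, genome.count(proto))
```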

  15. The challenge of predicting problematic chemicals using a decision analysis tool: Triclosan as a case study.

    Science.gov (United States)

    Perez, Angela L; Gauthier, Alison M; Ferracini, Tyler; Cowan, Dallas M; Kingsbury, Tony; Panko, Julie

    2017-01-01

    Manufacturers lack a reliable means for determining whether a chemical will be targeted for deselection from their supply chain. In this analysis, 3 methods for determining whether a specific chemical (triclosan) would meet the criteria necessary for being targeted for deselection are presented. The methods included a list-based approach, use of a commercially available chemical assessment software tool run in 2 modes, and a public interest evaluation. Our results indicated that triclosan was included on only 6 of the lists reviewed, none of which were particularly influential in chemical selection decisions. The results from the chemical assessment tool evaluations indicated that human and ecological toxicity for triclosan is low and that the chemical received scores indicating it would be considered of low concern. However, triclosan's peak public interest tracked several years in advance of increased regulatory scrutiny of this chemical, suggesting that public pressure may have been influential in deselection decisions. Key data gaps and toxicity endpoints that are not yet regulated, such as endocrine disruption potential or phototoxicity, but that are important for estimating the trajectory for deselection of a chemical, are discussed. Integr Environ Assess Manag 2017;13:198-207. © 2016 SETAC.

  16. GENECODIS-Grid: An online grid-based tool to predict functional information in gene lists

    International Nuclear Information System (INIS)

    Nogales, R.; Mejia, E.; Vicente, C.; Montes, E.; Delgado, A.; Perez Griffo, F. J.; Tirado, F.; Pascual-Montano, A.

    2007-01-01

    In this work we introduce GeneCodis-Grid, a grid-based alternative to a bioinformatics tool named GeneCodis that integrates different sources of biological information to search for biological features (annotations) that frequently co-occur in a set of genes and rank them by statistical significance. GeneCodis-Grid is a web-based application that takes advantage of two independent grid networks and a computer cluster managed by a meta-scheduler, together with a web server that hosts the application. The mining of concurrent biological annotations provides significant information for the functional analysis of gene lists obtained by high-throughput experiments in biology. Due to the large popularity of this tool, which has registered more than 13,000 visits since its publication in January 2007, there is a strong need to allow users from different sites to access the system simultaneously. In addition, the complexity of some of the statistical tests used in this approach has made this technique a good candidate for implementation in an opportunistic Grid environment. (Author)
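
    The enrichment statistic behind tools of this kind is typically a hypergeometric tail probability: given a gene list, how surprising is it that k of its genes share an annotation? A minimal sketch with invented counts (not GeneCodis's actual test):

```python
from math import comb

# Hypergeometric enrichment p-value: probability of seeing at least k
# annotated genes in a random draw of n genes. Counts are invented.
def hypergeom_pval(k, n, K, N):
    """P(X >= k) when drawing n genes from N, of which K are annotated."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(n, K) + 1)) / comb(N, n)

N, K = 1000, 50   # genome size; genes carrying annotation "A" (hypothetical)
n, k = 20, 5      # query list size; annotated genes found in the list
p = hypergeom_pval(k, n, K, N)
print(p)
```

With only one annotated gene expected by chance (20 × 50/1000), seeing five is a small-probability event, which is what flags the annotation as enriched.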

  17. Screening strategies and predictive diagnostic tools for the development of new-onset diabetes mellitus after transplantation: an overview

    Directory of Open Access Journals (Sweden)

    Pham PT

    2012-10-01

    Full Text Available Phuong-Thu T Pham,1 Kari L Edling,2 Harini A Chakkera,3 Phuong-Chi T Pham,4 Phuong-Mai T Pham5. 1Department of Medicine, Nephrology Division, Kidney Transplant Program, David Geffen School of Medicine at UCLA, Los Angeles, CA, USA; 2Department of Medicine, Division of Endocrinology, Diabetes and Hypertension, David Geffen School of Medicine at UCLA, Los Angeles, CA, USA; 3Department of Medicine, Nephrology Division, Kidney Transplant Program, Mayo Clinic Hospital, Phoenix, AZ, USA; 4Department of Medicine, Nephrology Division, UCLA-Olive View Medical Center, Sylmar, CA, USA; 5Department of Medicine, Greater Los Angeles Veterans Administration Health Care System, CA, USA. Abstract: New-onset diabetes mellitus after transplantation (NODAT) is a serious and common complication following solid organ transplantation. NODAT has been reported in 2% to 53% of all solid organ transplants. Kidney transplant recipients who develop NODAT have variably been reported to be at increased risk of fatal and nonfatal cardiovascular events and other adverse outcomes including infection, reduced patient survival, graft rejection, and accelerated graft loss compared with those who do not develop diabetes. Limited clinical studies in liver, heart, and lung transplants similarly suggested that NODAT has an adverse impact on patient and graft outcomes. Early detection and management of NODAT must, therefore, be integrated into the treatment of transplant recipients. Studies investigating the best screening or predictive tool for identifying patients at risk for developing NODAT early after transplantation, however, are lacking. We review the clinical predictive values of fasting plasma glucose, oral glucose tolerance test, and A1C in assessing the risk for NODAT development and as a screening tool. Simple diabetes prediction models that incorporate clinical and/or metabolic risk factors (such as age, body mass index, hypertriglyceridemia, or metabolic syndrome) are also

  18. PROCARB: A Database of Known and Modelled Carbohydrate-Binding Protein Structures with Sequence-Based Prediction Tools

    Directory of Open Access Journals (Sweden)

    Adeel Malik

    2010-01-01

    Full Text Available Understanding of the three-dimensional structures of proteins that interact with carbohydrates covalently (glycoproteins) as well as noncovalently (protein-carbohydrate complexes) is essential to many biological processes and plays a significant role in normal and disease-associated functions. It is important to have a central repository of knowledge available about these protein-carbohydrate complexes as well as preprocessed data of predicted structures. This can be significantly enhanced by de novo tools that can predict carbohydrate-binding sites for proteins in the absence of an experimentally known binding-site structure. PROCARB is an open-access database comprising three independently working components, namely, (i) the Core PROCARB module, consisting of three-dimensional structures of protein-carbohydrate complexes taken from the Protein Data Bank (PDB), (ii) the Homology Models module, consisting of manually developed three-dimensional models of N-linked and O-linked glycoproteins of unknown three-dimensional structure, and (iii) the CBS-Pred prediction module, consisting of web servers to predict carbohydrate-binding sites using a single sequence or a server-generated PSSM. Several precomputed structural and functional properties of complexes are also included in the database for quick analysis. In particular, information about function, secondary structure, solvent accessibility, hydrogen bonds and literature references, and so forth, is included. In addition, each protein in the database is mapped to Uniprot, Pfam, PDB, and so forth.
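
    Sequence-based site prediction with a PSSM, as in the CBS-Pred module, reduces to sliding a scoring matrix along the sequence. The 3-column "PSSM" and sequence below are invented for illustration; real PSSMs cover all 20 residues and use longer windows.

```python
# Sliding-window PSSM scoring of the kind a sequence-based binding-site
# predictor performs. Scores and sequence are invented.
pssm = [                                   # one score column per window position
    {"W": 2.0, "Y": 1.5, "A": -1.0},
    {"N": 2.5, "D": 1.0, "A": -0.5},
    {"W": 2.0, "F": 1.0, "A": -1.0},
]

def best_window(seq):
    scores = []
    for i in range(len(seq) - len(pssm) + 1):
        window = seq[i:i + len(pssm)]
        s = sum(col.get(res, 0.0) for col, res in zip(pssm, window))
        scores.append((s, i))
    return max(scores)                     # (best score, start index)

score, start = best_window("AAWNWAA")
print(start, score)
```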

  19. CADRE-SS, an in Silico Tool for Predicting Skin Sensitization Potential Based on Modeling of Molecular Interactions.

    Science.gov (United States)

    Kostal, Jakub; Voutchkova-Kostal, Adelina

    2016-01-19

    Using computer models to accurately predict toxicity outcomes is considered to be a major challenge. However, state-of-the-art computational chemistry techniques can now be incorporated in predictive models, supported by advances in mechanistic toxicology and the exponential growth of computing resources witnessed over the past decade. The CADRE (Computer-Aided Discovery and REdesign) platform relies on quantum-mechanical modeling of molecular interactions that represent key biochemical triggers in toxicity pathways. Here, we present an external validation exercise for CADRE-SS, a variant developed to predict the skin sensitization potential of commercial chemicals. CADRE-SS is a hybrid model that evaluates skin permeability using Monte Carlo simulations, assigns reactive centers in a molecule and possible biotransformations via expert rules, and determines reactivity with skin proteins via quantum-mechanical modeling. The results were promising with an overall very good concordance of 93% between experimental and predicted values. Comparison to performance metrics yielded by other tools available for this endpoint suggests that CADRE-SS offers distinct advantages for first-round screenings of chemicals and could be used as an in silico alternative to animal tests where permissible by legislative programs.
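
    The 93% concordance reported above is plain agreement between predicted and experimental sensitizer calls; a minimal computation on invented labels ("S" = sensitizer, "N" = non-sensitizer):

```python
# Concordance as simple agreement between predicted and experimental
# sensitizer calls. Labels are invented for illustration.
pred  = ["S", "S", "N", "S", "N", "N", "S", "N", "S", "S"]
exptl = ["S", "S", "N", "N", "N", "N", "S", "N", "S", "S"]
concordance = sum(p == e for p, e in zip(pred, exptl)) / len(pred)
print(concordance)
```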

  20. Predicting the chance of live birth for women undergoing IVF: a novel pretreatment counselling tool.

    Science.gov (United States)

    Dhillon, R K; McLernon, D J; Smith, P P; Fishel, S; Dowell, K; Deeks, J J; Bhattacharya, S; Coomarasamy, A

    2016-01-01

    Which pretreatment patient variables have an effect on live birth rates following assisted conception? The predictors in the final multivariate logistic regression model found to be significantly associated with reduced chances of IVF/ICSI success were increasing age (particularly above 36 years), tubal factor infertility, unexplained infertility and Asian or Black ethnicity. The two most widely recognized prediction models for live birth following IVF were developed on data from 1991 to 2007, pre-dating significant changes in clinical practice. These existing IVF outcome prediction models do not incorporate key pretreatment predictors, such as BMI, ethnicity and ovarian reserve, which are readily available now. In this cohort study, a model to predict live birth was derived using data collected from 9915 women who underwent IVF/ICSI treatment at any CARE (Centres for Assisted Reproduction) clinic from 2008 to 2012. Model validation was performed on data collected from 2723 women who underwent treatment in 2013. The primary outcome for the model was live birth, which was defined as any birth event in which at least one baby was born alive and survived for more than 1 month. Data were collected from 12 fertility clinics within the CARE consortium in the UK. Multivariable logistic regression was used to develop the model. Discriminatory ability was assessed using the area under the receiver operating characteristic (AUROC) curve, and calibration was assessed using calibration-in-the-large and the calibration slope test. The predictors in the final model were female age, BMI, ethnicity, antral follicle count (AFC), previous live birth, previous miscarriage, and cause and duration of infertility. Upon assessing predictive ability, the AUROC values for the final model and the validation cohort were 0.62 (95% confidence interval (CI) 0.61-0.63) and 0.62 (95% CI 0.60-0.64), respectively.
Calibration-in-the-large showed a systematic over-estimation of the predicted probability of live
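
    The AUROC reported above equals the probability that a randomly chosen positive case (live birth) receives a higher model score than a randomly chosen negative case. A minimal pairwise computation on invented scores:

```python
# AUROC via its pairwise-ranking definition: fraction of positive/negative
# pairs where the positive scores higher (ties count half).
# Scores and labels are invented for illustration.
def auroc(scores, labels):
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

scores = [0.9, 0.8, 0.7, 0.6, 0.55, 0.4, 0.3, 0.2]
labels = [1,   1,   0,   1,   0,    0,   1,   0]
print(auroc(scores, labels))
```

An AUROC of 0.5 is chance-level ranking and 1.0 is perfect separation, which puts the reported 0.62 in context.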

  1. Predicting RNA hyper-editing with a novel tool when unambiguous alignment is impossible.

    Science.gov (United States)

    McKerrow, Wilson H; Savva, Yiannis A; Rezaei, Ali; Reenan, Robert A; Lawrence, Charles E

    2017-07-10

    Repetitive elements are now known to have relevant cellular functions, including self-complementary sequences that form double stranded (ds) RNA. There are numerous pathways that determine the fate of endogenous dsRNA, and misregulation of endogenous dsRNA is a driver of autoimmune disease, particularly in the brain. Unfortunately, the alignment of high-throughput, short-read sequences to repeat elements poses a dilemma: Such sequences may align equally well to multiple genomic locations. In order to differentiate repeat elements, current alignment methods depend on sequence variation in the reference genome. Reads are discarded when no such variations are present. However, RNA hyper-editing, a possible fate for dsRNA, introduces enough variation to distinguish between repeats that are otherwise identical. To take advantage of this variation, we developed a new algorithm, RepProfile, that simultaneously aligns reads and predicts novel variations. RepProfile accurately aligns hyper-edited reads that other methods discard. In particular we predict hyper-editing of Drosophila melanogaster repeat elements in vivo at levels previously described only in vitro, and provide validation by Sanger sequencing sixty-two individual cloned sequences. We find that hyper-editing is concentrated in genes involved in cell-cell communication at the synapse, including some that are associated with neurodegeneration. We also find that hyper-editing tends to occur in short runs. Previous studies of RNA hyper-editing discarded ambiguously aligned reads, ignoring hyper-editing in long, perfect dsRNA - the perfect substrate for hyper-editing. We provide a method that simulation and Sanger validation show accurately predicts such RNA editing, yielding a superior picture of hyper-editing.

  2. MODIFIED MALLAMPATI CLASSIFICATION SCORE- A SIMPLE TOOL FOR PREDICTING TOLERANCE IN UNSEDATED OESOPHAGOGASTRODUODENOSCOPY

    Directory of Open Access Journals (Sweden)

    Abishek Sasidharan

    2017-04-01

    Full Text Available BACKGROUND 40-47% of patients poorly tolerate esophagogastroduodenoscopy (EGD). Early identification of potentially intolerant patients improves procedural success and avoids patient discomfort. The Modified Mallampati Classification (MMC) score is a simple scoring system used to predict difficult tracheal intubation and laryngoscope insertion. As EGD involves the same level of patient discomfort during introduction, MMC may predict EGD tolerance. MATERIALS AND METHODS 100 patients with dyspeptic symptoms and no alarm features attending our department were recruited for unsedated EGD between January and July 2012. All patients had good performance status, and underlying anxiety disorder was excluded. Based on MMC, patients were placed into 4 classes: I: Soft palate, fauces, pillars and uvula visible. II: Soft palate, fauces and uvula visible. III: Soft palate and base of uvula visible. IV: Soft palate not visible. They were divided into good view (classes I and II) and poor view (classes III and IV). EGD was performed by the same consultant and MMC status assessed by two independent trained personnel. All received 2 doses of topical pharyngeal spray containing 10% lidocaine hydrochloride. Outcome measurements were gag reflex, endoscopist's assessment and patient feedback. RESULTS Of 100 patients, 52 were males; 58 were in group A (good view) and 42 in group B (poor view). Gag reflex was present in 32.7% of the good view group compared to 78.6% in the poor view group (p<0.001). From the endoscopist's view, good tolerability was observed in 72.4% of the good view group compared to 21% in the poor view group (p<0.001). 74.1% of patients reported satisfactory feedback in the good view group compared to 19% in the poor view group (p<0.001). CONCLUSION MMC is a good clinical indicator for predicting tolerance in unsedated EGD.
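
    The group comparisons above are standard 2x2 chi-square tests. As a sketch, the gag-reflex counts can be back-computed from the reported percentages and group sizes (approximately 19/58 good view vs 33/42 poor view; these reconstructed counts are an assumption, not taken from the paper):

```python
from math import erfc, sqrt

# Pearson chi-square on a 2x2 table, with the 1-df p-value obtained from
# the identity chi2(1 df) = Z^2. Counts are back-computed from the
# reported percentages and are therefore approximate.
def chi2_2x2(a, b, c, d):
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

x = chi2_2x2(19, 39, 33, 9)   # rows: good/poor view; cols: gag yes/no
p = erfc(sqrt(x / 2.0))       # survival function of chi2 with 1 df
print(round(x, 1), p < 0.001)
```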

  3. An Easy Tool to Predict Survival in Patients Receiving Radiation Therapy for Painful Bone Metastases

    Energy Technology Data Exchange (ETDEWEB)

    Westhoff, Paulien G., E-mail: p.g.westhoff@umcutrecht.nl [Department of Radiotherapy, University Medical Center Utrecht, Utrecht (Netherlands); Graeff, Alexander de [Department of Medical Oncology, University Medical Center Utrecht, Utrecht (Netherlands); Monninkhof, Evelyn M. [Julius Center for Health Sciences and Primary Care, University Medical Center Utrecht, Utrecht (Netherlands); Bollen, Laurens; Dijkstra, Sander P. [Department of Orthopedic Surgery, Leiden University Medical Center (Netherlands); Steen-Banasik, Elzbieta M. van der [ARTI Institute for Radiation Oncology Arnhem, Arnhem (Netherlands); Vulpen, Marco van [Department of Radiotherapy, University Medical Center Utrecht, Utrecht (Netherlands); Leer, Jan Willem H. [Department of Radiotherapy, University Medical Center Nijmegen, Nijmegen (Netherlands); Marijnen, Corrie A.; Linden, Yvette M. van der [Department of Clinical Oncology, Leiden University Medical Center, Leiden (Netherlands)

    2014-11-15

    Purpose: Patients with bone metastases have a widely varying survival. A reliable estimation of survival is needed for appropriate treatment strategies. Our goal was to assess the value of simple prognostic factors, namely, patient and tumor characteristics, Karnofsky performance status (KPS), and patient-reported scores of pain and quality of life, to predict survival in patients with painful bone metastases. Methods and Materials: In the Dutch Bone Metastasis Study, 1157 patients were treated with radiation therapy for painful bone metastases. At randomization, physicians determined the KPS; patients rated general health on a visual analogue scale (VAS-gh), valuation of life on a verbal rating scale (VRS-vl) and pain intensity. To assess the predictive value of the variables, we used multivariate Cox proportional hazard analyses and C-statistics for discriminative value. Of the final model, calibration was assessed. External validation was performed on a dataset of 934 patients who were treated with radiation therapy for vertebral metastases. Results: Patients had mainly breast (39%), prostate (23%), or lung cancer (25%). After a maximum of 142 weeks' follow-up, 74% of patients had died. The best predictive model included sex, primary tumor, visceral metastases, KPS, VAS-gh, and VRS-vl (C-statistic = 0.72, 95% CI = 0.70-0.74). A reduced model, with only KPS and primary tumor, showed comparable discriminative capacity (C-statistic = 0.71, 95% CI = 0.69-0.72). External validation showed a C-statistic of 0.72 (95% CI = 0.70-0.73). Calibration of the derivation and the validation dataset showed underestimation of survival. Conclusion: In predicting survival in patients with painful bone metastases, KPS combined with primary tumor was comparable to a more complex model. Considering the amount of variables in complex models and the additional burden on patients, the simple model is preferred for daily use. In addition, a risk table for survival is
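
    The C-statistic used above generalises the AUROC to survival data: among usable patient pairs, it is the fraction where the higher risk score goes to the patient who died earlier. A minimal sketch with invented times and scores:

```python
# Harrell-style C-statistic: a pair is usable when the first patient died,
# and earlier than the second patient's follow-up time; concordant pairs
# give the earlier death the higher risk score. Data are invented.
def c_statistic(times, events, risks):
    conc = pairs = 0.0
    for i in range(len(times)):
        for j in range(len(times)):
            if events[i] and times[i] < times[j]:   # usable pair
                pairs += 1
                if risks[i] > risks[j]:
                    conc += 1
                elif risks[i] == risks[j]:
                    conc += 0.5
    return conc / pairs

times  = [5, 10, 20, 30]       # weeks of follow-up
events = [1, 1, 1, 0]          # 1 = died, 0 = censored
risks  = [0.9, 0.4, 0.8, 0.1]  # model risk scores
c = c_statistic(times, events, risks)
print(round(c, 3))
```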

  4. Microgravity spheroids as a reliable, long-term tool for predictive toxicology

    DEFF Research Database (Denmark)

    Fey, S. J.; Wrzesinski, Krzysztof

    2013-01-01

    those seen in vivo. Studies with 5 common drugs (acetaminophen, amiodarone, metformin, phenformin, and valproic acid) have shown that they are more predictive of lethally-toxic plasma levels in vivo than published studies using primary human hepatocytes. Shotgun proteomics has revealed that the gain...... this time they are metabolically stable for at least 24 days more; grow slowly (a doubling time of >20 days); produce physiological levels of urea, cholesterol and ATP; exhibit stable gene expression (for selected liver relevant genes); and can post translationally modify proteins in a manner which mirrors...

  5. Diffusion Weighted MRI as a predictive tool for effect of radiotherapy in locally advanced cervical cancer

    DEFF Research Database (Denmark)

    Haack, Søren; Tanderup, Kari; Fokdal, Lars

    Diffusion weighted MRI has shown great potential in diagnostic cancer imaging and may also have value for monitoring tumor response during radiotherapy. Patients with advanced cervical cancer are treated with external beam radiotherapy followed by brachytherapy. This study evaluates the value of DW-MRI for predicting outcome of patients with advanced cervical cancer at the time of brachytherapy. The volume of hyper-intensity on highly diffusion-sensitive images and the resulting ADC value for treatment responders and non-responders are compared. The change of ADC and volume of hyper-intensity over the time of BT is also

  6. Numerical predicting of the structure and stresses state in hardened element made of tool steel

    Directory of Open Access Journals (Sweden)

    A. Bokota

    2008-03-01

    Full Text Available The paper presents a numerical model of the thermal phenomena, phase transformations and mechanical phenomena associated with the hardening of carbon tool steel. The model for evaluation of phase fractions and their kinetics is based on the continuous heating (CHT) and continuous cooling (CCT) diagrams. The stresses generated during hardening were assumed to result from thermal load, structural plastic deformations and transformation plasticity. The hardened material was assumed to be elastic-plastic, and in order to determine plastic strains the non-isothermal plastic flow law with isotropic hardening and the Huber-Mises plasticity condition were used. Thermophysical values of the mechanical phenomena depend on both the phase composition and temperature. In the numerical example, a simulated estimation of the phase fraction and stress distributions in a hardened axisymmetric element was performed.

  7. Lipidomic analysis of epidermal lipids: a tool to predict progression of inflammatory skin disease in humans.

    Science.gov (United States)

    Li, Shan; Ganguli-Indra, Gitali; Indra, Arup K

    2016-05-01

    Lipidomics is the large-scale profiling and characterization of lipid species in a biological system using mass spectrometry. The skin barrier is mainly composed of corneocytes and a lipid-enriched extracellular matrix. The major skin lipids are ceramides, cholesterol and free fatty acids (FFA). Lipid compositions are altered in inflammatory skin disorders with a disrupted skin barrier, such as atopic dermatitis (AD). Here we discuss some of the recent applications of lipidomics in human skin biology and in inflammatory skin diseases such as AD, psoriasis and Netherton syndrome. We also review applications of lipidomics in human skin equivalents and in pre-clinical animal models of skin diseases to gain insight into the pathogenesis of the skin disease. Expert commentary: Skin lipidomics analysis could be a fast, reliable and noninvasive tool to characterize the skin lipid profile and to monitor the progression of inflammatory skin diseases such as AD.

  8. Study of the stiffness for predicting the accuracy of machine tools

    International Nuclear Information System (INIS)

    Ortega, N.; Campa, F.J.; Fernandez Valdivielso, A.; Alonso, U.; Olvera, D.; Compean, F.I.

    2010-01-01

    Machining processes are frequently faced with the challenge of achieving ever greater precision and surface quality. These requirements are usually attained by taking into account some process variables, including the cutting parameters and the use or not of coolant, leaving aside the mechanical aspects associated with the machine tool itself. There are many sources of error linked with machine-workpiece interaction, but, in general, we can summarize them into two types: quasi-static and dynamic. This paper shows the influence of quasi-static error caused by low machine rigidity on accuracy, as applied to two very different processes: turning and grinding. For the study of the static stiffness of these two machines, two different methods are proposed, both of them equally valid. The first one is based on separated parameters and the second one on finite elements. (Author).

  9. A clinical risk stratification tool for predicting treatment resistance in major depressive disorder.

    Science.gov (United States)

    Perlis, Roy H

    2013-07-01

    Early identification of depressed individuals at high risk for treatment resistance could be helpful in selecting the optimal setting and intensity of care. At present, validated tools to facilitate this risk stratification are rarely used in psychiatric practice. Data were drawn from the first two treatment levels of a multicenter antidepressant effectiveness study in major depressive disorder, the STAR*D (Sequenced Treatment Alternatives to Relieve Depression) cohort. This cohort was divided into training, testing, and validation subsets. Only clinical or sociodemographic variables available by or readily amenable to self-report were considered. Multivariate models were developed to discriminate individuals reaching remission with a first or second pharmacological treatment trial from those not reaching remission despite two trials. A logistic regression model achieved an area under the receiver operating characteristic curve exceeding .71 in the training, testing, and validation cohorts and maintained good calibration across cohorts. Performance of three alternative machine learning models (a naïve Bayes classifier, a support vector machine, and a random forest) was less consistent. Similar performance was observed between more and less severe depression, men and women, and primary versus specialty care sites. A web-based calculator was developed that implements this tool and provides graphical estimates of risk. Risk for treatment resistance among outpatients with major depressive disorder can be estimated with a simple model incorporating baseline sociodemographic and clinical features. Future studies should examine the performance of this model in other clinical populations and its utility in treatment selection or clinical trial design. Copyright © 2013 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.
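
    A fitted logistic model of the kind described reduces to an intercept plus coefficient-weighted features passed through the logistic function, which is all a web-based risk calculator needs to store. The coefficients and feature names below are invented, not those of the STAR*D-derived model:

```python
from math import exp

# Hypothetical logistic risk calculator: stored coefficients plus the
# logistic link. Feature names and values are invented for illustration.
coef = {"intercept": -1.2, "baseline_severity": 0.8, "prior_episodes": 0.5}

def risk(baseline_severity, prior_episodes):
    z = (coef["intercept"]
         + coef["baseline_severity"] * baseline_severity
         + coef["prior_episodes"] * prior_episodes)
    return 1.0 / (1.0 + exp(-z))           # logistic link maps z to (0, 1)

print(round(risk(1.0, 2.0), 3))
```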

  10. Wing Leading Edge RCC Rapid Response Damage Prediction Tool (IMPACT2)

    Science.gov (United States)

    Clark, Robert; Cottter, Paul; Michalopoulos, Constantine

    2013-01-01

    This rapid response computer program predicts Orbiter Wing Leading Edge (WLE) damage caused by ice or foam impact during a Space Shuttle launch (Program "IMPACT2"). The program was developed after the Columbia accident in order to quickly assess WLE damage due to ice, foam, or metal impact (if any) during a Shuttle launch. IMPACT2 simulates an impact event in a few minutes for foam impactors, and in seconds for ice and metal impactors. The damage criterion is derived from results obtained from one sophisticated commercial program, which requires hours to carry out simulations of the same impact events. The program was designed to run much faster than the commercial program, with prediction of projectile threshold velocities within 10 to 15% of commercial-program values. The mathematical model involves coupling of Orbiter wing normal modes of vibration to nonlinear or linear spring-mass models. IMPACT2 solves nonlinear or linear impact problems using classical normal modes of vibration of a target, and nonlinear/linear time-domain equations for the projectile. Impact loads and stresses developed in the target are computed as functions of time. This model is novel because of its speed of execution. A typical model of foam, or another projectile characterized by material nonlinearities, impacting an RCC panel is executed in minutes instead of the hours needed by the commercial programs. Target damage due to impact can be assessed quickly, provided that target vibration modes and allowable stress are known.
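    The model class described, target vibration modes coupled to a projectile through time-domain equations, can be sketched with a single target mode and a contact spring that acts only while the bodies are in contact. All parameter values below are invented for illustration and are not IMPACT2's:

    ```python
    # One target mode (modal mass/stiffness) struck by a projectile through a
    # compression-only contact spring; semi-implicit Euler time integration.
    def simulate_impact(m_proj=0.1, v0=100.0, m_modal=5.0, k_modal=1.0e6,
                        k_contact=5.0e5, dt=1.0e-6, t_end=0.01):
        x_p, v_p = 0.0, -v0          # projectile position/velocity (approaching)
        q, qd = 0.0, 0.0             # modal coordinate and its velocity
        peak_force = 0.0
        for _ in range(int(t_end / dt)):
            pen = q - x_p            # penetration depth while in contact
            f = k_contact * pen if pen > 0.0 else 0.0  # compression only
            v_p += (f / m_proj) * dt               # force pushes projectile back
            x_p += v_p * dt
            qd += (-k_modal * q - f) / m_modal * dt  # reaction drives the mode
            q += qd * dt
            peak_force = max(peak_force, f)
        return peak_force, v_p

    peak_force, v_final = simulate_impact()
    ```

    The impact load history (here only its peak is kept) is what a damage criterion would be checked against; the projectile rebounds with reduced speed because part of the kinetic energy stays in the target mode.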

  11. Artificial neural network as the tool in prediction rheological features of raw minced meat.

    Science.gov (United States)

    Balejko, Jerzy A; Nowak, Zbigniew; Balejko, Edyta

    2012-01-01

    The aim of the study was to elaborate a method of modelling and forecasting rheological features which could be applied to raw minced meat at the stage of mixture preparation with a given ingredient composition. The investigated material contained pork and beef meat, pork fat, fat substitutes, ice and curing mixture in various proportions. Seven texture parameters were measured for each sample of raw minced meat. The data obtained were processed using the artificial neural network module in Statistica 9.0 software. The model that reached the lowest training error was a multi-layer perceptron (MLP) with three neural layers and architecture 7:7-11-7:7. Correlation coefficients between the experimental and calculated values in the training, verification and testing subsets were similar and rather high (around 0.65), which indicated good network performance. The high percentage of total variance explained in PCA (73.5%) indicated that the percentage composition of raw minced meat can be successfully used in the prediction of its rheological features. Statistical analysis of the results revealed that the artificial neural network model is able to predict rheological parameters and thus a complete texture profile of raw minced meat.
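    A multi-layer perceptron of this kind can be sketched in pure Python: one hidden sigmoid layer, a linear output, and backpropagation by stochastic gradient descent. The network size and the toy "recipe" data below are invented; the study used a 7:7-11-7:7 architecture in Statistica:

    ```python
    import math, random

    random.seed(0)

    def train_mlp(data, n_hidden=4, lr=0.5, epochs=2000):
        """Train a one-hidden-layer MLP by backpropagation; return final MSE."""
        n_in = len(data[0][0])
        w1 = [[random.uniform(-0.5, 0.5) for _ in range(n_in)]
              for _ in range(n_hidden)]
        w2 = [random.uniform(-0.5, 0.5) for _ in range(n_hidden)]
        sigmoid = lambda z: 1.0 / (1.0 + math.exp(-z))
        for _ in range(epochs):
            for x, target in data:
                h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in w1]
                y = sum(w * hi for w, hi in zip(w2, h))   # linear output unit
                err = y - target
                for j in range(n_hidden):
                    g = err * w2[j] * h[j] * (1.0 - h[j])  # backprop via sigmoid
                    w2[j] -= lr * err * h[j]
                    for i in range(n_in):
                        w1[j][i] -= lr * g * x[i]
        mse = sum((sum(w2[j] * sigmoid(sum(w1[j][i] * x[i] for i in range(n_in)))
                       for j in range(n_hidden)) - t) ** 2
                  for x, t in data) / len(data)
        return mse

    # toy "recipes": (fat fraction, water fraction) -> hardness-like value
    data = [((0.1, 0.5), 0.8), ((0.3, 0.4), 0.6),
            ((0.5, 0.3), 0.4), ((0.7, 0.2), 0.2)]
    final_mse = train_mlp(data)
    ```

    With real data, the seven measured texture parameters would form the output vector rather than a single value, as in the 7-output network of the paper.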

  12. Computational tools for experimental determination and theoretical prediction of protein structure

    Energy Technology Data Exchange (ETDEWEB)

    O'Donoghue, S.; Rost, B.

    1995-12-31

    This tutorial was one of eight tutorials selected to be presented at the Third International Conference on Intelligent Systems for Molecular Biology which was held in the United Kingdom from July 16 to 19, 1995. The authors intend to review the state of the art in the experimental determination of protein 3D structure (focus on nuclear magnetic resonance), and in the theoretical prediction of protein function and of protein structure in 1D, 2D and 3D from sequence. All the atomic resolution structures determined so far have been derived from either X-ray crystallography (the majority so far) or Nuclear Magnetic Resonance (NMR) Spectroscopy (becoming increasingly more important). The authors briefly describe the physical methods behind both of these techniques; the major computational methods involved will be covered in some detail. They highlight parallels and differences between the methods, and also the current limitations. Special emphasis will be given to techniques which have application to ab initio structure prediction. Large-scale sequencing techniques increase the gap between the number of known protein sequences and that of known protein structures. They describe the scope and principles of methods that contribute successfully to closing that gap. Emphasis will be given to the specification of adequate testing procedures to validate such methods.

  13. A Practical Framework Toward Prediction of Breaking Force and Disintegration of Tablet Formulations Using Machine Learning Tools.

    Science.gov (United States)

    Akseli, Ilgaz; Xie, Jingjin; Schultz, Leon; Ladyzhynsky, Nadia; Bramante, Tommasina; He, Xiaorong; Deanne, Rich; Horspool, Keith R; Schwabe, Robert

    2017-01-01

    Enabling the paradigm of quality by design requires the ability to quantitatively correlate material properties and process variables to measurable product performance attributes. Conventional, quality-by-test methods for determining tablet breaking force and disintegration time usually involve destructive tests, which consume a significant amount of time and labor and provide limited information. Recent advances in material characterization, statistical analysis, and machine learning have provided multiple tools that have the potential to develop nondestructive, fast, and accurate approaches in drug product development. In this work, a methodology to predict the breaking force and disintegration time of tablet formulations using nondestructive ultrasonics and machine learning tools was developed. The input variables to the model include intrinsic properties of the formulation and extrinsic process variables influencing the tablet during manufacturing. The model has been applied to predict breaking force and disintegration time using small quantities of active pharmaceutical ingredient and prototype formulation designs. The novel approach presented is a step forward toward rational design of a robust drug product based on insight into the performance of common materials during formulation and process development. It may also help expedite the drug product development timeline and reduce active pharmaceutical ingredient usage while improving efficiency of the overall process. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  14. Micro-Vibration Performance Prediction of SEPTA24 Using SMeSim (RUAG Space Mechanism Simulator Tool)

    Science.gov (United States)

    Omiciuolo, Manolo; Lang, Andreas; Wismer, Stefan; Barth, Stephan; Szekely, Gerhard

    2013-09-01

    Scientific space missions are currently challenging the performances of their payloads. The performances can be dramatically restricted by micro-vibration loads generated by any moving parts of the satellites, and thus by Solar Array Drive Assemblies too. Micro-vibration prediction of SADAs is therefore very important to support their design and optimization in the early stages of a programme. The Space Mechanism Simulator (SMeSim) tool, developed by RUAG, enhances the capability of analysing the micro-vibration emissivity of a Solar Array Drive Assembly (SADA) under a specified set of boundary conditions. The tool is developed in the Matlab/Simulink® environment through a library of blocks simulating the different components a SADA is made of. The modular architecture of the blocks, assembled by the user, and the set-up of the boundary conditions allow time-domain and frequency-domain analyses of a rigid multi-body model with concentrated flexibilities and coupled electronic control of the mechanism. SMeSim is used to model the SEPTA24 Solar Array Drive Mechanism and predict its micro-vibration emissivity. SMeSim and the return of experience earned throughout its development and use can now support activities like verification by analysis of micro-vibration emissivity requirements and/or design optimization to minimize the micro-vibration emissivity of a SADA.

  15. The Framework of a Coastal Hazards Model - A Tool for Predicting the Impact of Severe Storms

    Science.gov (United States)

    Barnard, Patrick L.; O'Reilly, Bill; van Ormondt, Maarten; Elias, Edwin; Ruggiero, Peter; Erikson, Li H.; Hapke, Cheryl; Collins, Brian D.; Guza, Robert T.; Adams, Peter N.; Thomas, Julie

    2009-01-01

    The U.S. Geological Survey (USGS) Multi-Hazards Demonstration Project in Southern California (Jones and others, 2007) is a five-year project (FY2007-FY2011) integrating multiple USGS research activities with the needs of external partners, such as emergency managers and land-use planners, to produce products and information that can be used to create more disaster-resilient communities. The hazards being evaluated include earthquakes, landslides, floods, tsunamis, wildfires, and coastal hazards. For the Coastal Hazards Task of the Multi-Hazards Demonstration Project in Southern California, the USGS is leading the development of a modeling system for forecasting the impact of winter storms threatening the entire Southern California shoreline from Pt. Conception to the Mexican border. The modeling system, run in real-time or with prescribed scenarios, will incorporate atmospheric information (that is, wind and pressure fields) with a suite of state-of-the-art physical process models (that is, tide, surge, and wave) to enable detailed prediction of currents, wave height, wave runup, and total water levels. Additional research-grade predictions of coastal flooding, inundation, erosion, and cliff failure will also be performed. Initial model testing, performance evaluation, and product development will be focused on a severe winter-storm scenario developed in collaboration with the Winter Storm Working Group of the USGS Multi-Hazards Demonstration Project in Southern California. Additional offline model runs and products will include coastal-hazard hindcasts of selected historical winter storms, as well as additional severe winter-storm simulations based on statistical analyses of historical wave and water-level data. The coastal-hazards model design will also be appropriate for simulating the impact of storms under various sea level rise and climate-change scenarios. The operational capabilities of this modeling system are designed to provide emergency planners with

  16. A design tool for predicting the capillary transport characteristics of fuel cell diffusion media using an artificial neural network

    Science.gov (United States)

    Kumbur, E. C.; Sharp, K. V.; Mench, M. M.

    Developing a robust, intelligent design tool for multivariate optimization of multi-phase transport in fuel cell diffusion media (DM) is of utmost importance to develop advanced DM materials. This study explores the development of a DM design algorithm based on an artificial neural network (ANN) that can be used as a powerful tool for predicting the capillary transport characteristics of fuel cell DM. Direct measurements of drainage capillary pressure-saturation curves of the differently engineered DMs (5, 10 and 20 wt.% PTFE) were performed at room temperature under three compressions (0, 0.6 and 1.4 MPa) [E.C. Kumbur, K.V. Sharp, M.M. Mench, J. Electrochem. Soc. 154(12) (2007) B1295-B1304; E.C. Kumbur, K.V. Sharp, M.M. Mench, J. Electrochem. Soc. 154(12) (2007) B1305-B1314; E.C. Kumbur, K.V. Sharp, M.M. Mench, J. Electrochem. Soc. 154(12) (2007) B1315-B1324]. The generated benchmark data were utilized to systematically train a three-layered ANN framework that employs the feed-forward error back-propagation methodology. The designed ANN successfully predicts the measured capillary pressures within an average uncertainty of ±5.1% of the measured data, confirming that the present ANN model can be used as a design tool within the range of tested parameters. The ANN simulations reveal that tailoring the DM with high PTFE loading and applying high compression pressure lead to a higher capillary pressure, therefore promoting the liquid water transport within the pores of the DM. Any increase in hydrophobicity of the DM is found to amplify the compression effect, thus yielding a higher capillary pressure for the same saturation level and compression.

  17. Argot2: a large scale function prediction tool relying on semantic similarity of weighted Gene Ontology terms.

    Science.gov (United States)

    Falda, Marco; Toppo, Stefano; Pescarolo, Alessandro; Lavezzo, Enrico; Di Camillo, Barbara; Facchinetti, Andrea; Cilia, Elisa; Velasco, Riccardo; Fontana, Paolo

    2012-03-28

    Predicting protein function has become increasingly demanding in the era of next generation sequencing technology. The task of assigning a curator-reviewed function to every single sequence is impracticable. Bioinformatics tools, easy to use and able to provide automatic and reliable annotations at a genomic scale, are necessary and urgent. In this scenario, the Gene Ontology has provided the means to standardize the annotation classification with a structured vocabulary which can be easily exploited by computational methods. Argot2 is a web-based function prediction tool able to annotate nucleic or protein sequences from small datasets up to entire genomes. It accepts as input a list of sequences in FASTA format, which are processed using BLAST and HMMER searches against the UniProtKB and Pfam databases, respectively; these sequences are then annotated with GO terms retrieved from the UniProtKB-GOA database and the terms are weighted using the e-values from BLAST and HMMER. The weighted GO terms are processed according to both their semantic similarity relations described by the Gene Ontology and their associated score. The algorithm is based on the original idea developed in a previous tool called Argot. The entire engine has been completely rewritten to improve both accuracy and computational efficiency, thus allowing for the annotation of complete genomes. The revised algorithm has already been employed and successfully tested during in-house genome projects of grape and apple, and has proven to have high precision and recall in all our benchmark conditions. It has also been successfully compared with Blast2GO, one of the methods most commonly employed for sequence annotation. The server is freely accessible at http://www.medcomp.medicina.unipd.it/Argot2.
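    The e-value weighting step can be sketched as follows. This shows only the weighting and accumulation of GO terms, not Argot2's semantic-similarity clustering; the hit data and the exact weighting function are assumptions for illustration:

    ```python
    import math

    def weight(e_value):
        """-log10 of the hit e-value: lower e-value -> stronger evidence.
        Capped so an e-value of 0 does not overflow."""
        return -math.log10(max(e_value, 1e-300))

    def score_go_terms(hits):
        """hits: list of (go_term, e_value) pairs gathered from
        BLAST/HMMER searches; accumulate the weight of identical terms."""
        scores = {}
        for term, e in hits:
            scores[term] = scores.get(term, 0.0) + weight(e)
        return scores

    hits = [("GO:0016301", 1e-50),   # kinase activity, strong hit
            ("GO:0016301", 1e-10),   # same term from a weaker hit
            ("GO:0005524", 1e-3)]    # ATP binding, marginal hit
    scores = score_go_terms(hits)
    ```

    In the full algorithm these accumulated scores are then propagated and grouped along the Gene Ontology graph using semantic similarity before the final ranked annotations are produced.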

  18. Numerical Simulations as Tool to Predict Chemical and Radiological Hazardous Diffusion in Case of Nonconventional Events

    Directory of Open Access Journals (Sweden)

    J.-F. Ciparisse

    2016-01-01

    CFD (Computational Fluid Dynamics) simulations are widely used nowadays to predict the behaviour of fluids in pure research and in industrial applications. This approach makes it possible to get quantitatively meaningful results, often in good agreement with the experimental ones. The aim of this paper is to show how CFD calculations can help to understand the time evolution of two possible CBRNe (Chemical, Biological, Radiological, Nuclear, explosive) events: (1) hazardous dust mobilization due to the interaction between a jet of air and a metallic powder in case of a LOVA (Loss Of Vacuum Accident), one of the possible accidents that can occur in experimental nuclear fusion plants; (2) toxic gas release into the atmosphere. The scenario analysed in the paper has consequences similar to those expected in case of a release of dangerous substances (chemical or radioactive) in enclosed or open environments during nonconventional events (such as accidents or man-made or natural disasters).
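    For orientation only, the second scenario class (a toxic gas release) has a well-known zeroth-order approximation that CFD refines: an instantaneous point release of Q kilograms dispersing in still air behaves like a Gaussian puff. The values of Q and the effective diffusivity D below are invented:

    ```python
    import math

    def puff_concentration(r, t, Q=1.0, D=5.0):
        """Concentration (kg/m^3) at distance r (m), time t (s) after an
        instantaneous point release: the 3-D diffusion Green's function,
        C = Q / (4*pi*D*t)^(3/2) * exp(-r^2 / (4*D*t))."""
        coef = Q / (4.0 * math.pi * D * t) ** 1.5
        return coef * math.exp(-r * r / (4.0 * D * t))

    # concentration decays with distance from the source and with time
    c_near = puff_concentration(r=1.0, t=10.0)
    c_far = puff_concentration(r=50.0, t=10.0)
    c_later = puff_concentration(r=1.0, t=100.0)
    ```

    Real hazard assessment of the kind discussed in the paper replaces this closed form with full CFD, since geometry, wind and buoyancy dominate in enclosed or built-up environments.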

  19. Predictive analytics tools to adjust and monitor performance metrics for the ATLAS Production System

    CERN Document Server

    Titov, Mikhail; The ATLAS collaboration

    2017-01-01

    Every scientific workflow involves an organizational part whose purpose is to plan the analysis process thoroughly according to a defined schedule and thus keep work progressing efficiently. Information such as an estimate of the processing time or the possibility of a system outage (abnormal behaviour) will improve the planning process, assist in monitoring system performance and help predict its next state. The ATLAS Production System is an automated scheduling system that is responsible for central production of Monte-Carlo data, highly specialized production for physics groups, as well as data pre-processing and analysis using such facilities as grid infrastructures, clouds and supercomputers. With its next generation (ProdSys2), the processing rate is around 2M tasks per year, i.e. more than 365M jobs per year. ProdSys2 evolves to accommodate a growing number of users and new requirements from the ATLAS Collaboration, physics groups and individual users. ATLAS Distributed Computing in its current stat...

  20. Predictive Engineering Tools for Injection-Molded Long-Carbon-Fiber Thermoplastic Composites. Topical Report

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Ba Nghiep [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fifield, Leonard S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wang, Jin [Autodesk, Inc., Ithaca, NY (United States); Costa, Franco [Autodesk, Inc., Ithaca, NY (United States); Lambert, Gregory [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Baird, Donald G. [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Sharma, Bhisham A. [Purdue Univ., West Lafayette, IN (United States); Kijewski, Seth A. [Purdue Univ., West Lafayette, IN (United States); Sangid, Michael D. [Purdue Univ., West Lafayette, IN (United States); Gandhi, Umesh N. [Toyota Research Inst. North America, Ann Arbor, MI (United States); Wollan, Eric J. [PlastiComp, Inc., Winona, MN (United States); Roland, Dale [PlastiComp, Inc., Winona, MN (United States); Mori, Steven [Magna Exteriors and Interiors Corporation, Aurora, ON (Canada); Tucker, III, Charles L. [Univ. of Illinois, Urbana-Champaign, IL (United States)

    2016-06-01

    This project aimed to integrate, optimize, and validate the fiber orientation and length distribution models previously developed and implemented in the Autodesk® Simulation Moldflow® Insight (ASMI) software package for injection-molded long-carbon-fiber (LCF) thermoplastic composite structures. The project was organized into two phases. Phase 1 demonstrated the ability of the advanced ASMI package to predict fiber orientation and length distributions in LCF/polypropylene (PP) and LCF/polyamide-6,6 (PA66) plaques within 15% of experimental results. Phase 2 validated the advanced ASMI package by predicting fiber orientation and length distributions within 15% of experimental results for a complex three-dimensional (3D) Toyota automotive part injection-molded from LCF/PP and LCF/PA66 materials. Work under Phase 2 also included estimates of weight savings and cost impacts for a vehicle system using ASMI and structural analyses of the complex part. The present report summarizes the completion of Phase 1 and Phase 2 work activities and accomplishments achieved by the team comprising Pacific Northwest National Laboratory (PNNL); Purdue University (Purdue); Virginia Polytechnic Institute and State University (Virginia Tech); Autodesk, Inc. (Autodesk); PlastiComp, Inc. (PlastiComp); Toyota Research Institute North America (Toyota); Magna Exteriors and Interiors Corp. (Magna); and University of Illinois. Figure 1 illustrates the technical approach adopted in this project, which progressed from compounding LCF/PP and LCF/PA66 materials, to process model improvement and implementation, to molding and modeling LCF/PP and LCF/PA66 plaques. The lessons learned from the plaque study and the successful validation of improved process models for fiber orientation and length distributions for these plaques enabled the project to proceed to Phase 2 to mold, model, and optimize the 3D complex part.

  1. Using open source computational tools for predicting human metabolic stability and additional absorption, distribution, metabolism, excretion, and toxicity properties.

    Science.gov (United States)

    Gupta, Rishi R; Gifford, Eric M; Liston, Ted; Waller, Chris L; Hohman, Moses; Bunin, Barry A; Ekins, Sean

    2010-11-01

    Ligand-based computational models could be more readily shared between researchers and organizations if they were generated with open source molecular descriptors [e.g., chemistry development kit (CDK)] and modeling algorithms, because this would negate the requirement for proprietary commercial software. We initially evaluated open source descriptors and model building algorithms using a training set of approximately 50,000 molecules and a test set of approximately 25,000 molecules with human liver microsomal metabolic stability data. A C5.0 decision tree model demonstrated that CDK descriptors together with a set of SMILES Arbitrary Target Specification (SMARTS) keys had good statistics [κ = 0.43, sensitivity = 0.57, specificity = 0.91, and positive predictive value (PPV) = 0.64], equivalent to those of models built with commercial Molecular Operating Environment 2D (MOE2D) and the same set of SMARTS keys (κ = 0.43, sensitivity = 0.58, specificity = 0.91, and PPV = 0.63). Extending the dataset to ∼193,000 molecules and generating a continuous model using Cubist with a combination of CDK and SMARTS keys or MOE2D and SMARTS keys confirmed this observation. When the continuous predictions and actual values were binned to get a categorical score, we observed a similar κ statistic (0.42). The same combination of descriptor set and modeling method was applied to passive permeability and P-glycoprotein efflux data with similar model testing statistics. In summary, open source tools demonstrated predictive results comparable to those of commercial software with attendant cost savings. We discuss the advantages and disadvantages of open source descriptors and the opportunity for their use as a tool for organizations to share data precompetitively, avoiding repetition and assisting drug discovery.
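    The evaluation statistics quoted above (κ, sensitivity, specificity, PPV) all come from a 2x2 confusion matrix and can be sketched directly. The counts in the example call are invented for illustration:

    ```python
    def classification_stats(tp, fn, fp, tn):
        """Binary-classification statistics from confusion-matrix counts."""
        total = tp + fn + fp + tn
        sensitivity = tp / (tp + fn)        # true positive rate
        specificity = tn / (tn + fp)        # true negative rate
        ppv = tp / (tp + fp)                # positive predictive value
        po = (tp + tn) / total              # observed agreement
        # expected agreement by chance, from the row/column marginals
        pe = ((tp + fn) * (tp + fp) + (fp + tn) * (fn + tn)) / total ** 2
        kappa = (po - pe) / (1 - pe)        # Cohen's kappa
        return {"kappa": kappa, "sensitivity": sensitivity,
                "specificity": specificity, "ppv": ppv}

    stats = classification_stats(tp=40, fn=10, fp=5, tn=45)
    ```

    Binning continuous predictions, as done with the Cubist model in the abstract, reduces them to exactly this categorical setting so that κ can be compared across model types.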

  2. The CHESS score: a simple tool for early prediction of shunt dependency after aneurysmal subarachnoid hemorrhage.

    Science.gov (United States)

    Jabbarli, R; Bohrer, A-M; Pierscianek, D; Müller, D; Wrede, K H; Dammann, P; El Hindy, N; Özkan, N; Sure, U; Müller, O

    2016-05-01

    Acute hydrocephalus is an early and common complication of aneurysmal subarachnoid hemorrhage (SAH). However, considerably fewer patients develop chronic hydrocephalus requiring shunt placement. Our aim was to develop a risk score for early identification of patients with shunt dependency after SAH. Two hundred and forty-two SAH individuals who were treated in our institution between January 2008 and December 2013 and survived the initial impact were retrospectively analyzed. Clinical parameters within 72 h after the ictus were correlated with shunt dependency. Independent predictors were summarized into a new risk score which was validated in a subsequent SAH cohort treated between January and December 2014. Seventy-five patients (31%) underwent shunt placement. Of 23 evaluated variables, only the following five showed independent associations with shunt dependency and were subsequently used to establish the Chronic Hydrocephalus Ensuing from SAH Score (CHESS, 0-8 points): Hunt and Hess grade ≥IV (1 point), location of the ruptured aneurysm in the posterior circulation (1 point), acute hydrocephalus (4 points), the presence of intraventricular hemorrhage (1 point) and early cerebral infarction on follow-up computed tomography scan (1 point). The CHESS showed strong correlation with shunt dependency (P = 0.0007) and could be successfully validated in both internal SAH cohorts tested. Patients scoring ≥6 CHESS points had a significantly higher risk of shunt dependency. CHESS may become a valuable diagnostic tool for early estimation of shunt dependency after SAH. Further evaluation and external validation will be required in prospective studies. © 2016 EAN.
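    The scoring rule as stated in the abstract is simple enough to write down directly; the sketch below assumes the five items and point values exactly as listed (0-8 points total):

    ```python
    def chess_score(hunt_hess_grade, posterior_circulation, acute_hydrocephalus,
                    intraventricular_hemorrhage, early_infarction):
        """Chronic Hydrocephalus Ensuing from SAH Score (CHESS), 0-8 points,
        from the five independent predictors named in the abstract."""
        score = 0
        if hunt_hess_grade >= 4:
            score += 1   # Hunt and Hess grade >= IV
        if posterior_circulation:
            score += 1   # ruptured aneurysm in the posterior circulation
        if acute_hydrocephalus:
            score += 4   # the dominant predictor
        if intraventricular_hemorrhage:
            score += 1
        if early_infarction:
            score += 1   # early cerebral infarction on follow-up CT
        return score

    # worst case sums to the full 8 points
    max_score = chess_score(5, True, True, True, True)
    ```

    A patient scoring 6 or more would fall into the high-risk group identified in the validation cohorts.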

  3. Cross-cultural adaptation of the STRATIFY tool in detecting and predicting risk of falling.

    Science.gov (United States)

    Enríquez de Luna-Rodríguez, Margarita; Aranda-Gallardo, Marta; Canca-Sánchez, José Carlos; Vazquez-Blanco, M José; Moya-Suárez, Ana Belén; Morales-Asencio, José Miguel

    To adapt the STRATIFY tool to the Spanish language for clinical use in the Spanish-speaking world. A multicenter cross-cultural adaptation study across 2 care settings (acute care hospitals and nursing homes) was performed in Andalusia during 2014. The adaptation process was divided into 4 stages: translation, back-translation, assessment of equivalence between the 2 back-translations, and piloting of the Spanish version, thus obtaining the final version. Face validity, content validity and the time required to complete the scale were taken into account. For analysis, the median and other measures of central tendency and dispersion of the scores, the interquartile range, and the interquartile deviation (for the possible variability in responses) were calculated. Content validity, measured by the content validity index, reached a value of 1. For face validity, the clarity and comprehensibility of the questions were taken into account. Of the 5 questions of the instrument, 2 showed a small disagreement, which was resolved by introducing an explanatory phrase to achieve conceptual equivalence. The median for both questions was 5 or higher. The average time for completion of the scale was less than 3 minutes. The process of adaptation of STRATIFY to Spanish has produced a semantically and culturally equivalent version of the original that is easy to complete and understand for use in the Spanish-speaking world. Copyright © 2016 Elsevier España, S.L.U. All rights reserved.

  4. Overview on How Data Mining Tools May Support Cardiovascular Disease Prediction

    Directory of Open Access Journals (Sweden)

    Dan-Andrei Sitar-Taut

    2010-01-01

    Terms such as Knowledge Discovery from Databases (KDD), Data Mining (DM), Artificial Intelligence (AI), Machine Learning (ML), Artificial Neural Networks (ANN), and decision tables and trees gain, from day to day, increasing significance in medical data analysis. They permit the identification, evaluation, and quantification of less visible, intuitively unpredictable relationships by using generally large sets of data. Cardiology is an extremely vast and important domain, with multiple and complex social and human implications. These are reasons enough to promote research in this area, which is rapidly becoming not just a national or European priority but a worldwide one. The profound and multiply interwoven relationships among cardiovascular risk factors and cardiovascular diseases, still far from completely discovered or understood, represent a niche for applying modern, multidisciplinary IT&C tools in order to close the existing knowledge gaps. This paper's aim is to present, by emphasizing their absolute or relative pros and cons, several opportunities for applying DM tools in cardiology, more precisely in diagnosing and quantifying endothelial dysfunction and its relationships with the so-called "classical" cardiovascular risk factors.

  5. Efficient thermal error prediction in a machine tool using finite element analysis

    International Nuclear Information System (INIS)

    Mian, Naeem S; Fletcher, Simon; Longstaff, Andrew P; Myers, Alan

    2011-01-01

    Thermally induced errors have a major significance on the positional accuracy of a machine tool. Heat generated during the machining process produces thermal gradients that flow through the machine structure, causing linear and nonlinear thermal expansions and distortions of the associated complex discrete structures and producing deformations that adversely affect structural stability. The heat passes through structural linkages and mechanical joints, where interfacial parameters such as the roughness and form of the contacting surfaces affect the thermal resistance and thus the heat transfer coefficients. This paper presents a novel offline technique using finite element analysis (FEA) to simulate the effects of the major internal heat sources, such as bearings, motors and belt drives, of a small vertical milling machine (VMC), and the effects of ambient temperature pockets that build up during machine operation. Simplified models of the machine were created offline using FEA software and evaluated against experimental results for offline simulation of the thermal behaviour of the full machine structure. The FEA-simulated results are in close agreement with the experimental results, ranging from 65% to 90% across a variety of testing regimes, and the technique reduced a maximum error of 70 µm to less than 10 µm.
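    The computation underlying such a simulation, heat from a localized source diffusing through a structure and setting up the gradients that drive expansion, can be illustrated in one dimension with an explicit finite-difference scheme. This is not the paper's FEA model; material values and geometry are generic:

    ```python
    # Explicit finite-difference solution of 1-D transient heat conduction,
    # dT/dt = alpha * d2T/dx2, for a bar with a hot source at one end.
    def heat_bar(n=50, alpha=1.0e-5, dx=0.01, dt=1.0, steps=3600,
                 t_hot=60.0, t_ambient=20.0):
        T = [t_ambient] * n
        T[0] = t_hot                       # heat source, e.g. a bearing housing
        r = alpha * dt / dx ** 2           # must be <= 0.5 for stability
        for _ in range(steps):
            Tn = T[:]
            for i in range(1, n - 1):
                Tn[i] = T[i] + r * (T[i + 1] - 2 * T[i] + T[i - 1])
            T = Tn                         # both end temperatures held fixed
        return T

    T = heat_bar()                         # temperature profile after 1 hour
    ```

    The resulting temperature field, here a smooth gradient away from the source, is what a thermo-elastic step would convert into expansions and tool-point displacement in the full 3-D FEA.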

  6. Comparison Of Human Modelling Tools For Efficiency Of Prediction Of EVA Tasks

    Science.gov (United States)

    Dischinger, H. Charles, Jr.; Loughead, Tomas E.

    1998-01-01

    Construction of the International Space Station (ISS) will require extensive extravehicular activity (EVA, spacewalks), and estimates of the actual time needed continue to rise. As recently as September 1996, the amount of time to be spent in EVA was believed to be about 400 hours, excluding spacewalks on the Russian segment. This estimate has recently risen to over 1100 hours, and it could go higher before assembly begins in the summer of 1998. These activities are extremely expensive and hazardous, so any design tools which help assure mission success and improve the efficiency of the astronaut in task completion can pay off in reduced design and EVA costs and increased astronaut safety. The tasks which astronauts can accomplish in EVA are limited by spacesuit mobility. They are therefore relatively simple, from an ergonomic standpoint, requiring gross movements rather than fine motor skills. The actual tasks include driving bolts, mating and demating electric and fluid connectors, and actuating levers; the important characteristics to be considered in design improvement include the ability of the astronaut to see and reach the item to be manipulated and the clearance required to accomplish the manipulation. This makes the tasks amenable to simulation in a Computer-Assisted Design (CAD) environment. For EVA, the spacesuited astronaut must have his or her feet attached to a work platform called a foot restraint to obtain a purchase against which work forces may be applied. An important component of the design is therefore the proper placement of foot restraints.

  7. Estimating Prediction Uncertainty from Geographical Information System Raster Processing: A User's Manual for the Raster Error Propagation Tool (REPTool)

    Science.gov (United States)

    Gurdak, Jason J.; Qi, Sharon L.; Geisler, Michael L.

    2009-01-01

    The U.S. Geological Survey Raster Error Propagation Tool (REPTool) is a custom tool for use with the Environmental System Research Institute (ESRI) ArcGIS Desktop application to estimate error propagation and prediction uncertainty in raster processing operations and geospatial modeling. REPTool is designed to introduce concepts of error and uncertainty in geospatial data and modeling and provide users of ArcGIS Desktop a geoprocessing tool and methodology to consider how error affects geospatial model output. Similar to other geoprocessing tools available in ArcGIS Desktop, REPTool can be run from a dialog window, from the ArcMap command line, or from a Python script. REPTool consists of public-domain, Python-based packages that implement Latin Hypercube Sampling within a probabilistic framework to track error propagation in geospatial models and quantitatively estimate the uncertainty of the model output. Users may specify error for each input raster or model coefficient represented in the geospatial model. The error for the input rasters may be specified as either spatially invariant or spatially variable across the spatial domain. Users may specify model output as a distribution of uncertainty for each raster cell. REPTool uses the Relative Variance Contribution method to quantify the relative error contribution from the two primary components in the geospatial model - errors in the model input data and coefficients of the model variables. REPTool is appropriate for many types of geospatial processing operations, modeling applications, and related research questions, including applications that consider spatially invariant or spatially variable error in geospatial data.
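    The core mechanism, Latin Hypercube Sampling of the specified input errors pushed through the geospatial model to yield an output distribution, can be sketched for a single raster cell. The model (output = coefficient x cell value) and the error magnitudes are invented for illustration, not REPTool defaults:

    ```python
    import random, statistics

    def latin_hypercube_normal(n, mu, sigma, rng):
        """n stratified samples from Normal(mu, sigma): one draw per
        equal-probability stratum, then shuffled to break the ordering."""
        dist = statistics.NormalDist(mu, sigma)
        samples = [dist.inv_cdf((k + rng.random()) / n) for k in range(n)]
        rng.shuffle(samples)
        return samples

    rng = random.Random(42)
    n = 1000
    cell_value = 100.0                 # raster cell value, with additive error
    cell_err = latin_hypercube_normal(n, 0.0, 5.0, rng)
    coef = 0.8                         # model coefficient, with additive error
    coef_err = latin_hypercube_normal(n, 0.0, 0.05, rng)

    # push each error sample through the (toy) geospatial model
    outputs = [(coef + ce) * (cell_value + ve)
               for ce, ve in zip(coef_err, cell_err)]
    mean_out = statistics.mean(outputs)
    std_out = statistics.stdev(outputs)
    ```

    Comparing the output variance obtained with only one error source active at a time against the total variance gives the Relative Variance Contribution figures that REPTool reports for input data versus model coefficients.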

  8. Integrated Predictive Tools for Customizing Microstructure and Material Properties of Additively Manufactured Aerospace Components

    Energy Technology Data Exchange (ETDEWEB)

    Radhakrishnan, Balasubramaniam [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Fattebert, Jean-Luc [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Gorti, Sarma B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Haxhimali, Timor [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); El-Wardany, Tahany [United Technologies Research Center (UTRC), East Hartford, CT (United States); Acharya, Ranadip [United Technologies Research Center (UTRC), East Hartford, CT (United States); Staroselsky, Alexander [United Technologies Research Center (UTRC), East Hartford, CT (United States)

    2017-12-01

    Additive Manufacturing (AM) refers to a process by which digital three-dimensional (3-D) design data is used to build up a component by depositing material layer-by-layer. United Technologies Corporation (UTC) is currently involved in fabrication and certification of several AM aerospace structural components made from aerospace materials. This is accomplished by using optimized process parameters determined through numerous design-of-experiments (DOE)-based studies. Certification of these components is broadly recognized as a significant challenge, with long lead times, very expensive new product development cycles and very high energy consumption. Because of these challenges, United Technologies Research Center (UTRC), together with UTC business units have been developing and validating an advanced physics-based process model. The specific goal is to develop a physics-based framework of an AM process and reliably predict fatigue properties of built-up structures as based on detailed solidification microstructures. Microstructures are predicted using process control parameters including energy source power, scan velocity, deposition pattern, and powder properties. The multi-scale multi-physics model requires solution and coupling of governing physics that will allow prediction of the thermal field and enable solution at the microstructural scale. The state-of-the-art approach to solve these problems requires a huge computational framework and this kind of resource is only available within academia and national laboratories. The project utilized the parallel phase-field codes at Oak Ridge National Laboratory (ORNL) and Lawrence Livermore National Laboratory (LLNL), along with the high-performance computing (HPC) capabilities existing at the two labs to demonstrate the simulation of multiple dendrite growth in three dimensions (3-D).
The LLNL code AMPE was used to implement the UTRC phase field model that was previously developed for a model binary alloy, and

  9. pH-dependent solubility and permeability profiles: A useful tool for prediction of oral bioavailability.

    Science.gov (United States)

    Sieger, P; Cui, Y; Scheuerer, S

    2017-07-15

    pH-dependent solubility and permeability profiles offer a simple way to predict bioavailability after oral application, if bioavailability is solely solubility and permeability driven. Combining pH-dependent solubility and pH-dependent permeability in one diagram provides a pH-window (ΔpHsol-perm) from which the conditions for optimal oral bioavailability can be read. The size of this window is directly proportional to the observed oral bioavailability. A set of 21 compounds with known absolute human oral bioavailability was used to establish this correlation. Compounds with ΔpHsol-perm bioavailability (bioavailability typically by approximately 25%. For compounds where ΔpHsol-perm ≥ 3 but still showing poor bioavailability, most probably other pharmacokinetic aspects (e.g. high clearance) are limiting exposure. Interestingly, the location of this pH-window seems to have a negligible influence on the observed oral bioavailability. In scenarios where bioavailability is impaired by certain factors, for example proton pump inhibitor co-medication or food intake, the exact position of this pH-window might be beneficial for understanding the root cause. Copyright © 2017 Elsevier B.V. All rights reserved.
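
    The pH-window concept can be illustrated numerically: compute a solubility profile and a permeability proxy over a pH grid and measure the width of the region where both exceed a threshold. The Henderson-Hasselbalch forms below are standard, but the compound parameters and thresholds are invented for illustration and are not taken from the paper's 21-compound set.

```python
import numpy as np

def solubility_acid(ph, s0, pka):
    """Henderson-Hasselbalch total solubility of a monoprotic acid:
    S(pH) = S0 * (1 + 10**(pH - pKa))."""
    return s0 * (1.0 + 10.0 ** (ph - pka))

def fraction_unionized_acid(ph, pka):
    """Unionized fraction of a monoprotic acid, used here as a crude
    proxy for passive permeability."""
    return 1.0 / (1.0 + 10.0 ** (ph - pka))

def ph_window(ph, sol, perm, sol_min, perm_min):
    """Width of the pH range where solubility and permeability both exceed
    their thresholds: the ΔpHsol-perm of the abstract."""
    ok = (sol >= sol_min) & (perm >= perm_min)
    return float(ok.sum()) * (ph[1] - ph[0])  # uniform grid spacing

ph = np.arange(1.0, 8.01, 0.01)
sol = solubility_acid(ph, s0=0.05, pka=4.5)   # mg/mL; invented compound
perm = fraction_unionized_acid(ph, pka=4.5)
window = ph_window(ph, sol, perm, sol_min=0.1, perm_min=0.1)
```

    For this invented acid the window spans roughly one pH unit around the pKa: below it the compound is too insoluble, above it too ionised to permeate.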

  10. Prediction and mastering of wine acidity and tartaric precipitations: the Mextar® software tool

    Directory of Open Access Journals (Sweden)

    Audrey Devatine

    2002-06-01

    The article describes the theoretical background upon which MEXTAR® has been developed and details the computation procedures associated with the simulation of the operations described above. Iterative processes are used to determine the various parameters mentioned above; they aim in particular at verifying the electroneutrality of the sample. At first, however, electroneutrality is rarely satisfied, because the chemical analysis of a wine is never exhaustive. Therefore, the concept of "vinic acid" is introduced to compensate for the negative-ion deficit of the usual wine chemical analysis. This "vinic acid" is described as a diacid with a fixed second dissociation constant (pK2 = 5) and a first dissociation constant determined automatically by MEXTAR®. This enables MEXTAR® to simulate accurately the experimental tartaric precipitations of several samples. Thanks to the predictive capacity of MEXTAR®, the correlation between the total polyphenol content (IPT) of red wines and the difficulty of stabilising high-IPT wines with respect to tartaric salts is confirmed. Finally, malolactic fermentations are also well simulated. A validation of MEXTAR® for acid addition or removal remains to be done once reliable experimental data are available.
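
    The "vinic acid" fitting step can be sketched as a one-dimensional root search: with pK2 fixed at 5, find the pK1 at which the fictitious diacid carries exactly the anion deficit left by the incomplete analysis. This is not MEXTAR®'s code; the speciation formulas are standard, and the concentrations below are hypothetical.

```python
PK2 = 5.0  # second dissociation constant fixed by the "vinic acid" concept

def diacid_charge(c_total, ph, pk1, pk2=PK2):
    """Negative charge (eq/L) carried by c_total mol/L of a diacid H2A at a
    given pH, from the standard speciation fractions."""
    h = 10.0 ** (-ph)
    k1, k2 = 10.0 ** (-pk1), 10.0 ** (-pk2)
    denom = h * h + k1 * h + k1 * k2
    f_ha = k1 * h / denom    # fraction present as HA-
    f_a2 = k1 * k2 / denom   # fraction present as A2-
    return c_total * (f_ha + 2.0 * f_a2)

def fit_pk1(c_total, ph, deficit, lo=0.0, hi=14.0, tol=1e-9):
    """Bisection for the pK1 that makes the fictitious acid carry exactly the
    anion deficit; the charge decreases monotonically as pK1 rises."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if diacid_charge(c_total, ph, mid) > deficit:
            lo = mid   # acid dissociates too much: raise pK1
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# hypothetical wine: 0.010 mol/L "vinic acid" closing a 0.004 eq/L anion deficit at pH 3.5
pk1 = fit_pk1(c_total=0.010, ph=3.5, deficit=0.004)
```

    Once pK1 is fixed this way, the charge balance closes and the full speciation (and hence tartrate saturation) can be computed consistently.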

  11. Improvement of the Prediction of Drugs Demand Using Spatial Data Mining Tools.

    Science.gov (United States)

    Ramos, M Isabel; Cubillas, Juan José; Feito, Francisco R

    2016-01-01

    The continued availability of products in any store is a key requirement for good customer service. If the store is a drugstore, this matter takes on greater importance, as a stock-out of a drug during a period of high demand causes problems and tensions in the healthcare system. There are numerous studies of the impact this issue has on patients. The lack of a drug in a pharmacy in certain seasons is very common, especially when external factors favour the occurrence of certain diseases. This study focuses on a particular drug consumed in the city of Jaén, in southern Andalucía, Spain. Our goal is to determine the demand for Salbutamol in advance. Advanced data mining techniques have been used together with spatial variables; the latter play a key role in generating an effective model. Using the attributes associated with Salbutamol demand, we generated an accurate prediction model with a mean absolute error of 5.78%. This is very encouraging, considering that the consumption of this drug in Jaén varies by 500% from one period to another.

  12. CAMS as a tool for identifying and predicting abnormal plant states using real-time simulation

    International Nuclear Information System (INIS)

    Fantoni, P.F.; Soerenssen, A.; Meyer, G.

    1999-01-01

    CAMS (Computerised Accident Management Support) is a system that provides assistance to the staff in a nuclear power plant control room, in the technical support centre and in the national safety centre. Support is offered in identification of the current plant state, in assessment of the future development of the accident and in planning mitigation strategies. CAMS is a modular system, where several modules perform different tasks under the control and supervision of a central knowledge-based system, which is responsible for the synchronisation and the flow of information through the activated modules. A CAMS prototype was tested by the Swedish Nuclear Inspectorate during a safety exercise in Sweden in 1995, with satisfactory results. Future developments include automatic control of the Predictive Simulator by the State Identification module, for the generation of possible mitigation strategies, and the development of an improved user interface which considers the integration of the system into an advanced control room. CAMS is developed as a joint research activity at the Halden Reactor Project in close cooperation with member organisations. The project, started in 1993, has now reached its second prototype version, which has been presented and demonstrated at several seminars and workshops around the world. (author)

  13. Status epilepticus severity score (STESS): A useful tool to predict outcome of status epilepticus.

    Science.gov (United States)

    Goyal, Manoj Kumar; Chakravarthi, Sudheer; Modi, Manish; Bhalla, Ashish; Lal, Vivek

    2015-12-01

    The treatment protocols for status epilepticus (SE) range from small doses of intravenous benzodiazepines to induction of coma. The pros and cons of more aggressive treatment regimens remain debatable. The importance of an index that can predict the outcome of SE and guide the intensity of treatment cannot be overemphasized. We evaluated the utility of one such index, the Status Epilepticus Severity Score (STESS). 44 consecutive patients with SE were enrolled in the study. STESS results were compared with various outcome measures: (a) mortality, (b) final neurological outcome at discharge as defined by the functional independence measure (FIM) (good outcome: FIM score 5-7; bad outcome: FIM score 1-4), (c) control of SE within 1 h of the start of treatment and (d) need for coma induction. A higher STESS score correlated significantly with poor neurological outcome at discharge (p=0.0001), need for coma induction (p=0.0001) and lack of response to treatment within 1 h (p=0.001). A STESS of status epilepticus. Further studies on a STESS-based treatment approach may help in designing better therapeutic regimens for SE. Copyright © 2015 Elsevier B.V. All rights reserved.
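
    For reference, the STESS itself is a simple additive score. The sketch below uses the four item weights as published by Rossetti and colleagues (level of consciousness, worst seizure type, age, and seizure history); those weights are not given in this abstract, so they are an assumption here and should be verified against the original publication before any use.

```python
def stess(consciousness, seizure_type, age, prior_seizures):
    """Status Epilepticus Severity Score (0-6). Item weights follow the
    published score (Rossetti et al.), not this abstract; verify before use.

    consciousness: 'alert' | 'somnolent' | 'stuporous' | 'comatose'
    seizure_type: 'simple-partial' | 'complex-partial' | 'absence' |
                  'myoclonic' | 'generalized-convulsive' | 'nonconvulsive-in-coma'
    prior_seizures: True if there is a history of previous seizures
    """
    score = 1 if consciousness in ('stuporous', 'comatose') else 0
    score += {'generalized-convulsive': 1, 'nonconvulsive-in-coma': 2}.get(seizure_type, 0)
    score += 2 if age >= 65 else 0
    score += 0 if prior_seizures else 1  # no (or unknown) seizure history scores 1
    return score
```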

  14. Use of "feed algae" ("algues-fourrage") in aquaculture

    OpenAIRE

    Chretiennot-dinet, Marie-josèphe; Robert, Rene; His, Edouard

    1986-01-01

    Work on the use of unicellular algae for the nutrition of larvae and juveniles of bivalves of commercial interest is reviewed. Of the fifty or so algal species tested, only about ten are produced in large quantities in commercial hatcheries under the name of "feed algae" ("algues fourrage"). The main species employed are described and their major characteristics illustrated. The criteria for selecting a species for its use in aquacultur...

  15. Pattern of Smartphones Utilisation among Engineering Undergraduates

    OpenAIRE

    Muliati Sedek

    2014-01-01

    Smartphone ownership among undergraduates in Malaysia was recorded as high. However, little was known about their utilisation patterns; thus, the focus of this research was to determine the utilisation patterns of smartphones, based on the National Education Technology Standard for Students (NETS.S), among engineering undergraduates in Malaysia. This study was based on quantitative research, and the population comprised undergraduates from four Malaysian Technical Universities. A total ...

  16. Environmental assessment of incinerator residue utilisation

    OpenAIRE

    Toller, Susanna; Kärrman, Erik; Gustafsson, Jon Petter; Magnusson, Y.

    2009-01-01

    Incineration ashes may be treated either as a waste to be dumped in landfill, or as a resource that is suitable for re-use. In order to choose the best management scenario, knowledge is needed on the potential environmental impact that may be expected, including not only local, but also regional and global impact. In this study, a life cycle assessment (LCA) based approach was outlined for environmental assessment of incinerator residue utilisation, in which leaching of trace elements as wel...

  17. Char characterization and DTF assays as tools to predict burnout of coal blends in power plants

    Energy Technology Data Exchange (ETDEWEB)

    C. Ulloa; A.G. Borrego; S. Helle; A.L. Gordon; X. Garcia [Universidad de Concepcion, Concepcion (Chile). Departamento de Ingenieria Quimica

    2005-02-01

    The aim of this study is to predict efficiency deviations in the combustion of coal blends in power plants. Compared with the combustion of the individual coals, the combustion of some blends is non-additive in nature. Samples of coal feed and fly ashes from the combustion of blends at two power plants were used, plus chars of the parent coals generated in a drop-tube furnace (DTF) at temperatures and heating rates similar to those found in industrial boilers. Intrinsic kinetic parameters, burning profiles and petrographic characteristics of these chars correlated well with the burnout in power plants and DTF experiments. The blend combustion in a DTF reproduces both positive and negative burnout deviations from the expected weighted average. These burnout deviations have been previously attributed to parallel or parallel-series pathways of competition for oxygen. No deviations were found for blends of low-rank coals of similar characteristics yielding chars close in morphology, optical texture and reactivity. Negative deviations were found for blends of coals differing moderately in rank and were interpreted as associated with long periods of competition. In this case, fly ashes were enriched in material derived from the least reactive char, but unburnt material attributed to the most reactive char was also identified. Improved burnout compared with the weighted average was observed for blends of coals very different in rank, and interpreted as the result of a short interaction period, followed by a period in which the less reactive char burns under conditions more favourable to its combustion. In this case, only unburned material from the least reactive char was identified in the fly ashes. 20 refs., 9 figs., 5 tabs.

  18. DEWEPS - Development and Evaluation of new Wind forecasting tools with an Ensemble Prediction System

    Energy Technology Data Exchange (ETDEWEB)

    Moehrlen, C.; Joergensen, Jess

    2012-02-15

    There is an ongoing trend of increased privatization in the handling of renewable energy. This trend is required to ensure an efficient energy system, where improvements that make economic sense are prioritised. The reason why centralized forecasting can be a challenge in that matter is that the TSOs tend to optimize on physical error rather than cost. Consequently, the market is likely to speculate against the TSO, which in turn increases the cost of balancing. A privatized pool of wind and/or solar power is more difficult to speculate against, because the optimization criterion is unpredictable due to subjective risk considerations that may be taken into account at any time. Although there is an additional level of cost for the trading of the private volume, it can be argued that competition will accelerate efficiency from an economic perspective. The amount of power put into the market will become less predictable when the wind power spot market bid takes place on the basis of a risk consideration in addition to the forecast information itself. The scope of this project is to contribute to more efficient wind power integration targeted both to centralised and decentralised cost-efficient IT solutions, which will complement each other in market-based energy systems. The DEWEPS project resulted in an extension of the number of ensemble forecasts, an incremental trade strategy for balancing unpredictable power production, and an IT platform for efficient handling of power generation units. Together, these three elements contribute to less need for reserves, more capacity in the market, and thus more competition. (LN)

  19. Predicting the mean cycle time as a function of throughput and product mix for cluster tool workstations using EPT-based aggregate modeling

    NARCIS (Netherlands)

    Veeger, C.P.L.; Etman, L.F.P.; Herk, van J.; Rooda, J.E.

    2009-01-01

    Predicting the mean cycle time as a function of throughput and product mix is helpful in making the production planning for cluster tools. To predict the mean cycle time, detailed simulation models may be used. However, detailed models require much development time, and it may not be possible to

  20. Development of a simple tool to predict the risk of postpartum diabetes in women with gestational diabetes mellitus.

    Science.gov (United States)

    Köhler, M; Ziegler, A G; Beyerlein, A

    2016-06-01

    Women with gestational diabetes mellitus (GDM) have an increased risk of diabetes postpartum. We developed a score to predict the long-term risk of postpartum diabetes using clinical and anamnestic variables recorded during or shortly after delivery. Data from 257 GDM women who were prospectively followed for diabetes outcome over 20 years of follow-up were used to develop and validate the risk score. Participants were divided into training and test sets. The risk score was calculated using Lasso Cox regression and divided into four risk categories, and its prediction performance was assessed in the test set. Postpartum diabetes developed in 110 women. The computed training set risk score of 5 × body mass index in early pregnancy (per kg/m²) + 132 if GDM was treated with insulin (otherwise 0) + 44 if the woman had a family history of diabetes (otherwise 0) - 35 if the woman lactated (otherwise 0) had R² values of 0.23, 0.25, and 0.33 at 5, 10, and 15 years postpartum, respectively, and a C-index of 0.75. Application of the risk score in the test set resulted in observed risk of postpartum diabetes at 5 years of 11% for low risk scores ≤140, 29% for scores 141-220, 64% for scores 221-300, and 80% for scores >300. The derived risk score is easy to calculate, allows accurate prediction of GDM-related postpartum diabetes, and may thus be a useful prediction tool for clinicians and general practitioners.
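
    The published score and risk bands quoted above translate directly into code; the following sketch implements exactly the formula and the test-set 5-year risk categories reported in the abstract.

```python
def gdm_risk_score(bmi_early_pregnancy, insulin_treated, family_history, lactated):
    """Postpartum-diabetes risk score exactly as given in the abstract:
    5 x BMI (kg/m2) + 132 if insulin-treated + 44 if family history of
    diabetes - 35 if the woman lactated."""
    score = 5.0 * bmi_early_pregnancy
    score += 132 if insulin_treated else 0
    score += 44 if family_history else 0
    score -= 35 if lactated else 0
    return score

def risk_category(score):
    """Observed 5-year postpartum-diabetes risk in the test set, per band."""
    if score <= 140:
        return 'low (11% at 5 years)'
    if score <= 220:
        return 'moderate (29% at 5 years)'
    if score <= 300:
        return 'high (64% at 5 years)'
    return 'very high (80% at 5 years)'
```

    For example, an insulin-treated woman with early-pregnancy BMI 30 and a family history of diabetes who did not lactate scores 5 × 30 + 132 + 44 = 326, placing her in the highest band.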

  1. Developing a computational tool for predicting physical parameters of a typical VVER-1000 core based on artificial neural network

    International Nuclear Information System (INIS)

    Mirvakili, S.M.; Faghihi, F.; Khalafi, H.

    2012-01-01

    Highlights: (1) Thermal–hydraulic parameters of a VVER-1000 core are predicted using an artificial neural network (ANN). (2) The data required for ANN training are generated with a modified COBRA-EN code and linked using MATLAB. (3) The average and maximum temperatures of fuel and clad, as well as the MDNBR of each FA, are predicted by the ANN. Abstract: The main goal of the present article is to design a computational tool to predict physical parameters of the VVER-1000 nuclear reactor core based on an artificial neural network (ANN), taking into account a detailed physical model of the fuel rods and coolant channels in a fuel assembly. Predictions of the thermal characteristics of fuel, clad and coolant are performed using a cascade feed-forward ANN based on the linear fission power distribution, the power peaking factors of FAs, and the hot channel factors (found in our previous neutronic calculations). A software package has been developed to prepare the required data for ANN training; it applies a modified COBRA-EN code for sub-channel analysis and links the codes using MATLAB. Based on the current estimation system, five main core TH parameters are predicted: the average and maximum temperatures of fuel and clad, as well as the minimum departure from nucleate boiling ratio (MDNBR) for each FA. To obtain the best conditions for training the considered ANNs, a comprehensive sensitivity study has been performed to examine the effects of varying the hidden neurons, hidden layers, transfer functions, and learning algorithms on the training and simulation results. Performance evaluation results show that the developed ANN can be trained to estimate the core TH parameters of a typical VVER-1000 reactor quickly without loss of accuracy.

  2. Development of a tool for prediction of ovarian cancer in patients with adnexal masses: Value of plasma fibrinogen.

    Directory of Open Access Journals (Sweden)

    Veronika Seebacher

    To develop a tool for individualized risk estimation of the presence of cancer in women with adnexal masses, and to assess the added value of plasma fibrinogen. We performed a retrospective analysis of a prospectively maintained database of 906 patients with adnexal masses who underwent cystectomy or oophorectomy. Uni- and multivariate logistic regression analyses including pre-operative plasma fibrinogen levels and established predictors were performed. A nomogram was generated to predict the probability of ovarian cancer. Internal validation with split-sample analysis was performed. Decision curve analysis (DCA) was then used to evaluate the clinical net benefit of the prediction model. Ovarian cancer, including borderline tumours, was found in 241 (26.6%) patients. In multivariate analysis, elevated plasma fibrinogen, elevated CA-125, suspicion for malignancy on ultrasound, and postmenopausal status were associated with ovarian cancer and formed the basis for the nomogram. The overall predictive accuracy of the model, as measured by AUC, was 0.91 (95% CI 0.87-0.94). DCA revealed a net benefit for using this model for predicting the presence of ovarian cancer compared to a strategy of treat-all or treat-none. We confirmed the value of plasma fibrinogen as a strong predictor for ovarian cancer in a large cohort of patients with adnexal masses. We developed a highly accurate multivariable model to help in the clinical decision-making regarding the presence of ovarian cancer. This model provided net benefit for a wide range of threshold probabilities. External validation is needed before a recommendation for its use in routine practice can be given.

  3. Exonic Splicing Mutations Are More Prevalent than Currently Estimated and Can Be Predicted by Using In Silico Tools

    Science.gov (United States)

    Soukarieh, Omar; Gaildrat, Pascaline; Hamieh, Mohamad; Drouet, Aurélie; Baert-Desurmont, Stéphanie; Frébourg, Thierry; Tosi, Mario; Martins, Alexandra

    2016-01-01

    The identification of a causal mutation is essential for molecular diagnosis and clinical management of many genetic disorders. However, even if next-generation exome sequencing has greatly improved the detection of nucleotide changes, the biological interpretation of most exonic variants remains challenging. Moreover, particular attention is typically given to protein-coding changes often neglecting the potential impact of exonic variants on RNA splicing. Here, we used the exon 10 of MLH1, a gene implicated in hereditary cancer, as a model system to assess the prevalence of RNA splicing mutations among all single-nucleotide variants identified in a given exon. We performed comprehensive minigene assays and analyzed patient’s RNA when available. Our study revealed a staggering number of splicing mutations in MLH1 exon 10 (77% of the 22 analyzed variants), including mutations directly affecting splice sites and, particularly, mutations altering potential splicing regulatory elements (ESRs). We then used this thoroughly characterized dataset, together with experimental data derived from previous studies on BRCA1, BRCA2, CFTR and NF1, to evaluate the predictive power of 3 in silico approaches recently described as promising tools for pinpointing ESR-mutations. Our results indicate that ΔtESRseq and ΔHZEI-based approaches not only discriminate which variants affect splicing, but also predict the direction and severity of the induced splicing defects. In contrast, the ΔΨ-based approach did not show a compelling predictive power. Our data indicates that exonic splicing mutations are more prevalent than currently appreciated and that they can now be predicted by using bioinformatics methods. These findings have implications for all genetically-caused diseases. PMID:26761715

  4. Exonic Splicing Mutations Are More Prevalent than Currently Estimated and Can Be Predicted by Using In Silico Tools.

    Directory of Open Access Journals (Sweden)

    Omar Soukarieh

    2016-01-01

    The identification of a causal mutation is essential for molecular diagnosis and clinical management of many genetic disorders. However, even if next-generation exome sequencing has greatly improved the detection of nucleotide changes, the biological interpretation of most exonic variants remains challenging. Moreover, particular attention is typically given to protein-coding changes often neglecting the potential impact of exonic variants on RNA splicing. Here, we used the exon 10 of MLH1, a gene implicated in hereditary cancer, as a model system to assess the prevalence of RNA splicing mutations among all single-nucleotide variants identified in a given exon. We performed comprehensive minigene assays and analyzed patient's RNA when available. Our study revealed a staggering number of splicing mutations in MLH1 exon 10 (77% of the 22 analyzed variants), including mutations directly affecting splice sites and, particularly, mutations altering potential splicing regulatory elements (ESRs). We then used this thoroughly characterized dataset, together with experimental data derived from previous studies on BRCA1, BRCA2, CFTR and NF1, to evaluate the predictive power of 3 in silico approaches recently described as promising tools for pinpointing ESR-mutations. Our results indicate that ΔtESRseq and ΔHZEI-based approaches not only discriminate which variants affect splicing, but also predict the direction and severity of the induced splicing defects. In contrast, the ΔΨ-based approach did not show a compelling predictive power. Our data indicates that exonic splicing mutations are more prevalent than currently appreciated and that they can now be predicted by using bioinformatics methods. These findings have implications for all genetically-caused diseases.

  5. Analysis and prediction of flow from local source in a river basin using a Neuro-fuzzy modeling tool.

    Science.gov (United States)

    Aqil, Muhammad; Kita, Ichiro; Yano, Akira; Nishiyama, Soichi

    2007-10-01

    Traditionally, the multiple linear regression technique has been one of the most widely used models in simulating hydrological time series. However, when the nonlinear phenomenon is significant, multiple linear regression will fail to develop an appropriate predictive model. Recently, neuro-fuzzy systems have gained much popularity for calibrating the nonlinear relationships. This study evaluated the potential of a neuro-fuzzy system as an alternative to the traditional statistical regression technique for the purpose of predicting flow from a local source in a river basin. The effectiveness of the proposed identification technique was demonstrated through a simulation study of the river flow time series of the Citarum River in Indonesia. Furthermore, to quantify the uncertainty associated with the estimation of river flow, a Monte Carlo simulation was performed. As a comparison, a multiple linear regression analysis that was being used by the Citarum River Authority was also examined using various statistical indices. The simulation results using 95% confidence intervals indicated that the neuro-fuzzy model consistently underestimated the magnitude of high flow while the low and medium flow magnitudes were estimated closer to the observed data. The comparison of the prediction accuracy of the neuro-fuzzy and linear regression methods indicated that the neuro-fuzzy approach was more accurate in predicting river flow dynamics. The neuro-fuzzy model was able to improve the root mean square error (RMSE) and mean absolute percentage error (MAPE) values of the multiple linear regression forecasts by about 13.52% and 10.73%, respectively. Considering its simplicity and efficiency, the neuro-fuzzy model is recommended as an alternative tool for modeling of flow dynamics in the study area.
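
    The two error measures used in the comparison are straightforward to compute. The flow series below are invented for illustration and are not the Citarum data; they merely show how a percentage RMSE improvement like the 13.52% quoted above would be derived.

```python
import numpy as np

def rmse(obs, pred):
    """Root mean square error."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return float(np.sqrt(np.mean((obs - pred) ** 2)))

def mape(obs, pred):
    """Mean absolute percentage error; observations must be non-zero."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return float(np.mean(np.abs((obs - pred) / obs)) * 100.0)

# invented daily flows (m3/s): observed vs. two competing forecasts
obs = [120.0, 95.0, 230.0, 180.0, 60.0]
neuro_fuzzy = [118.0, 97.0, 210.0, 176.0, 62.0]
linear_reg = [110.0, 105.0, 190.0, 165.0, 70.0]

# percentage RMSE improvement of one forecast over the other
rmse_gain = 100.0 * (1.0 - rmse(obs, neuro_fuzzy) / rmse(obs, linear_reg))
```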

  6. Primary Sclerosing Cholangitis Risk Estimate Tool (PREsTo) Predicts Outcomes in PSC: A Derivation & Validation Study Using Machine Learning.

    Science.gov (United States)

    Eaton, John E; Vesterhus, Mette; McCauley, Bryan M; Atkinson, Elizabeth J; Schlicht, Erik M; Juran, Brian D; Gossard, Andrea A; LaRusso, Nicholas F; Gores, Gregory J; Karlsen, Tom H; Lazaridis, Konstantinos N

    2018-05-09

    Improved methods are needed to risk stratify and predict outcomes in patients with primary sclerosing cholangitis (PSC). Therefore, we sought to derive and validate a new prediction model and compare its performance to existing surrogate markers. The model was derived using 509 subjects from a multicenter North American cohort and validated in an international multicenter cohort (n=278). Gradient boosting, a machine-learning technique, was used to create the model. The endpoint was hepatic decompensation (ascites, variceal hemorrhage or encephalopathy). Subjects with advanced PSC or cholangiocarcinoma at baseline were excluded. The PSC risk estimate tool (PREsTo) consists of 9 variables: bilirubin, albumin, serum alkaline phosphatase (SAP) times the upper limit of normal (ULN), platelets, AST, hemoglobin, sodium, patient age and the number of years since PSC was diagnosed. Validation in an independent cohort confirms PREsTo accurately predicts decompensation (C statistic 0.90, 95% confidence interval (CI) 0.84-0.95) and performed well compared to MELD score (C statistic 0.72, 95% CI 0.57-0.84), Mayo PSC risk score (C statistic 0.85, 95% CI 0.77-0.92) and SAP statistic 0.65, 95% CI 0.55-0.73). PREsTo continued to be accurate among individuals with a bilirubin statistic 0.90, 95% CI 0.82-0.96) and when the score was re-applied at a later course in the disease (C statistic 0.82, 95% CI 0.64-0.95). PREsTo accurately predicts hepatic decompensation in PSC and exceeds the performance among other widely available, noninvasive prognostic scoring systems. This article is protected by copyright. All rights reserved. © 2018 by the American Association for the Study of Liver Diseases.
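
    Gradient boosting, the technique behind PREsTo, fits each new weak learner to the current residuals and adds it with a shrinkage factor. The self-contained sketch below (least-squares boosting with depth-1 stumps on synthetic data) illustrates the mechanism only; it is not the PREsTo model, and the random features stand in for the nine clinical variables.

```python
import numpy as np

def fit_stump(x, residual):
    """Best single-feature threshold split (a depth-1 regression tree)."""
    best = (np.inf, 0, 0.0, 0.0, 0.0)  # (sse, feature, threshold, left, right)
    for j in range(x.shape[1]):
        for t in np.unique(x[:, j]):
            mask = x[:, j] <= t
            if mask.all() or not mask.any():
                continue
            left, right = residual[mask].mean(), residual[~mask].mean()
            sse = ((residual[mask] - left) ** 2).sum() + ((residual[~mask] - right) ** 2).sum()
            if sse < best[0]:
                best = (sse, j, t, left, right)
    return best[1:]

def gradient_boost(x, y, n_trees=50, lr=0.1):
    """Least-squares gradient boosting: every round fits a stump to the
    current residuals and adds it with shrinkage factor lr."""
    pred = np.full(len(y), y.mean())
    stumps = []
    for _ in range(n_trees):
        j, t, left, right = fit_stump(x, y - pred)
        pred += lr * np.where(x[:, j] <= t, left, right)
        stumps.append((j, t, left, right))
    return pred, stumps

# synthetic stand-ins for clinical predictors; the outcome depends on feature 0
rng = np.random.default_rng(1)
x = rng.random((80, 3))
y = np.where(x[:, 0] > 0.5, 2.0, 0.0) + 0.1 * rng.standard_normal(80)
pred, stumps = gradient_boost(x, y)
```

    Production models of this kind would of course use a dedicated library and a proper survival or classification loss rather than least squares; the residual-fitting loop is the part all variants share.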

  7. Phylogeny is a powerful tool for predicting plant biomass responses to nitrogen enrichment.

    Science.gov (United States)

    Wooliver, Rachel C; Marion, Zachary H; Peterson, Christopher R; Potts, Brad M; Senior, John K; Bailey, Joseph K; Schweitzer, Jennifer A

    2017-08-01

    Increasing rates of anthropogenic nitrogen (N) enrichment to soils often lead to the dominance of nitrophilic plant species and reduce plant diversity in natural ecosystems. Yet, we lack a framework to predict which species will be winners or losers in soil N enrichment scenarios, a framework that current literature suggests should integrate plant phylogeny, functional tradeoffs, and nutrient co-limitation. Using a controlled fertilization experiment, we quantified biomass responses to N enrichment for 23 forest tree species within the genus Eucalyptus that are native to Tasmania, Australia. Based on previous work with these species' responses to global change factors and theory on the evolution of plant resource-use strategies, we hypothesized that (1) growth responses to N enrichment are phylogenetically structured, (2) species with more resource-acquisitive functional traits have greater growth responses to N enrichment, and (3) phosphorus (P) limits growth responses to N enrichment differentially across species, wherein P enrichment increases growth responses to N enrichment more in some species than others. We built a hierarchical Bayesian model estimating effects of functional traits (specific leaf area, specific stem density, and specific root length) and P fertilization on species' biomass responses to N, which we then compared between lineages to determine whether phylogeny explains variation in responses to N. In concordance with literature on N limitation, a majority of species responded strongly and positively to N enrichment. Mean responses ranged three-fold, from 6.21 (E. pulchella) to 16.87 (E. delegatensis) percent increases in biomass per g N·m⁻²·yr⁻¹ added. We identified a strong difference in responses to N between two phylogenetic lineages in the Eucalyptus subgenus Symphyomyrtus, suggesting that shared ancestry explains variation in N limitation. However, our model indicated that after controlling for phylogenetic non

  8. Toward a coupled Hazard-Vulnerability Tool for Flash Flood Impacts Prediction

    Science.gov (United States)

    Terti, Galateia; Ruin, Isabelle; Anquetin, Sandrine; Gourley, Jonathan J.

    2015-04-01

    Flash floods (FF) are high-impact, catastrophic events that result from the intersection of hydrometeorological extremes and society at small space-time scales, generally on the order of minutes to hours. Because FF events are generally localized in space and time, they are very difficult to forecast with precision and can subsequently leave people uninformed and subject to surprise in the midst of their daily activities (e.g., commuting to work). In Europe, FFs are the main source of natural hazard fatalities, although they affect smaller areas than riverine flooding. In the US as well, flash flooding is the leading cause of weather-related deaths in most years, with some 200 annual fatalities. There were 954 fatalities and approximately 31 billion U.S. dollars of property damage due to floods and flash floods from 1995 to 2012 in the US. For forecasters and emergency managers, the prediction of, and subsequent response to, the impacts of such sudden-onset, localized events remains a challenge. This research is motivated by the hypothesis that the intersection of the spatio-temporal context of the hazard with the distribution of people and their characteristics across space and time reveals different paths of vulnerability. We argue that vulnerability and the dominant impact type vary dynamically throughout the day and week according to the location under concern. Thus, it is appropriate to develop indices that capture, for example, vehicle-related impacts on the active population concentrated on the road network during morning or evening rush hours. This study describes the methodological developments of our approach and applies our hypothesis to the case of the June 14th, 2010 flash flood event in the Oklahoma City area (Oklahoma, US). Social (i.e. population socio-economic profile), exposure (i.e. population distribution, land use), and physical (i.e. built and natural environment) data are used to compose different vulnerability products based on the forecast location

  9. Faculty Decisions on Serials Subscriptions Differ Significantly from Decisions Predicted by a Bibliometric Tool.

    Directory of Open Access Journals (Sweden)

    Sue F. Phelps

    2016-03-01

    Full Text Available Objective – To compare faculty choices of serials subscription cancellations to the scores of a bibliometric tool. Design – Natural experiment. Data was collected about faculty valuations of serials. The California Digital Library Weighted Value Algorithm (CDL-WVA) was used to measure the value of journals to a particular library. These two sets of scores were then compared. Setting – A public research university in the United States of America. Subjects – Teaching and research faculty, as well as serials data. Methods – Experimental methodology was used to compare faculty valuations of serials (based on their journal cancellation choices) to bibliometric valuations of the same journal titles (determined by CDL-WVA scores) to identify the match rate between the faculty choices and the bibliometric data. Faculty were asked to select titles to cancel that totaled approximately 30% of the budget for their disciplinary fund code. This “keep” or “cancel” choice was the binary variable for the study. Usage data was gathered for articles downloaded through the link resolver for titles in each disciplinary dataset, and the CDL-WVA scores were determined for each journal title based on utility, quality, and cost effectiveness. Titles within each dataset were ranked highest to lowest using the CDL-WVA scores within each fund code, and then by subscription cost for titles with the same CDL-WVA score. The journal titles selected for comparison were those that ranked above the approximate 30% of titles chosen for cancellation by faculty and CDL-WVA scores. Researchers estimated the odds ratio of faculty choosing to keep a title against the CDL-WVA score indicating that the title should be kept. The p-value for that result was less than 0.0001, indicating a negligible probability that the results occurred by chance. They also applied logistic regression to quantify the association between the numeric score of CDL-WVA and the binary variable
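
    The reported association can be illustrated with a basic 2x2 odds-ratio calculation. The counts below are invented for illustration and are not the study's data; the Wald interval on the log odds ratio is one common way to attach a confidence bound.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table [[a, b], [c, d]] with a Wald 95% CI.
    The log-OR standard error is sqrt(1/a + 1/b + 1/c + 1/d)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: faculty keep/cancel choices cross-tabulated against
# the CDL-WVA keep/cancel recommendation (numbers are made up).
or_, lo, hi = odds_ratio_ci(40, 60, 55, 45)
print(or_, lo, hi)
```

An odds ratio below 1 in such a table would indicate that faculty keep decisions tend not to line up with the tool's keep recommendation, which is the direction of disagreement the study reports.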

  10. Inflation and capacity utilisation in Nigeria's manufacturing sector ...

    African Journals Online (AJOL)

    This study analysed the relationship between inflation and capacity utilisation, leaning empirically on the model employed by Baylor (2001). It utilised time-series secondary data and a least-squares multiple regression technique. The quarterly data utilised were tested for stationarity using the ADF test. The multiple regression ...
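
    A least-squares regression of this kind can be sketched as follows. The quarterly data here are synthetic, standing in for the Nigerian series, and the ADF stationarity pre-test is omitted; only the estimation step is shown.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 80  # e.g. 20 years of quarterly observations
capacity = rng.uniform(40, 90, n)      # capacity utilisation (%), synthetic
inflation = 12.0 - 0.08 * capacity     # noise-free synthetic relation

# Least-squares multiple regression: inflation on a constant + capacity
X = np.column_stack([np.ones(n), capacity])
beta, *_ = np.linalg.lstsq(X, inflation, rcond=None)
print(beta)  # ≈ [12.0, -0.08]
```

Because the synthetic relation is noise-free, the estimator recovers the intercept and slope almost exactly; with real data the residual diagnostics and the ADF test would come first.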

  11. Comparison between frailty index of deficit accumulation and fracture risk assessment tool (FRAX) in prediction of risk of fractures.

    Science.gov (United States)

    Li, Guowei; Thabane, Lehana; Papaioannou, Alexandra; Adachi, Jonathan D

    2015-08-01

    A frailty index (FI) of deficit accumulation could quantify and predict the risk of fractures based on the degree of frailty in the elderly. We aimed to compare the predictive powers of the FI and the fracture risk assessment tool (FRAX) in predicting risk of major osteoporotic fracture (hip, upper arm or shoulder, spine, or wrist) and hip fracture, using the data from the Global Longitudinal Study of Osteoporosis in Women (GLOW) 3-year Hamilton cohort. There were 3985 women included in the study, with a mean age of 69.4 years (standard deviation [SD] = 8.89). During the follow-up, there were 149 (3.98%) incident major osteoporotic fractures and 18 (0.48%) hip fractures reported. The FRAX and FI were significantly related to each other. Both FRAX and FI significantly predicted risk of major osteoporotic fracture, with a hazard ratio (HR) of 1.03 (95% confidence interval [CI]: 1.02-1.05) and 1.02 (95% CI: 1.01-1.04) per 0.01 increment for the FRAX and FI respectively. The HRs were 1.37 (95% CI: 1.19-1.58) and 1.26 (95% CI: 1.12-1.42) for an increase of 0.10 (approximately one SD) in the FRAX and FI respectively. Similar discriminative ability of the models was found: c-index = 0.62 for the FRAX and c-index = 0.61 for the FI. When cut-points were chosen to trichotomize participants into low-risk, medium-risk and high-risk groups, a significant increase in fracture risk was found in the high-risk group (HR = 2.04, 95% CI: 1.36-3.07) but not in the medium-risk group (HR = 1.23, 95% CI: 0.82-1.84) compared with the low-risk women for the FI, while for FRAX the medium-risk (HR = 2.00, 95% CI: 1.09-3.68) and high-risk groups (HR = 2.61, 95% CI: 1.48-4.58) predicted risk of major osteoporotic fracture significantly only when survival time exceeded 18 months (550 days). Similar findings were observed for hip fracture and in sensitivity analyses. In conclusion, the FI is comparable with FRAX in the prediction of risk of future fractures, indicating that
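
    As a quick sanity check on the reported effect sizes: in a Cox model, a hazard ratio quoted per 0.01 increment compounds multiplicatively, so the ratio per 0.10 (roughly one SD) is the per-0.01 ratio raised to the tenth power. A minimal sketch follows; note the compounded values (about 1.34 and 1.22) land near, but not exactly on, the reported 1.37 and 1.26, presumably because the study refit the model at the larger increment rather than compounding.

```python
def rescale_hr(hr_per_unit: float, units: float) -> float:
    """Rescale a hazard ratio quoted per one increment of a covariate to a
    larger increment; HRs compound multiplicatively because the Cox model
    is linear on the log-hazard scale."""
    return hr_per_unit ** units

# Reported per-0.01 HRs for major osteoporotic fracture
hr_frax = rescale_hr(1.03, 10)  # per 0.10 (~one SD) of FRAX
hr_fi = rescale_hr(1.02, 10)    # per 0.10 (~one SD) of the FI
print(f"FRAX per-0.10 HR ~ {hr_frax:.2f}, FI per-0.10 HR ~ {hr_fi:.2f}")
```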

  12. Computational Immunology Meets Bioinformatics: The Use of Prediction Tools for Molecular Binding in the Simulation of the Immune System

    DEFF Research Database (Denmark)

    Rapin, N.; Lund, Ole; Bernaschi, M.

    2010-01-01

    We present a new approach to the study of the immune system that combines techniques of systems biology with information provided by data-driven prediction methods. To this end, we have extended an agent-based simulator of the immune response, C-IMMSIM, such that it represents pathogens, as well...... potential measurements, for assessing molecular binding in the context of immune complexes. We benchmark the resulting model by simulating a classical immunization experiment that reproduces the development of immune memory. We also investigate the role of major histocompatibility complex (MHC) haplotype...... proliferate more than any other. These results show that the simulator produces dynamics that are stable and consistent with basic immunological knowledge. We believe that the combination of genomic information and simulation of the dynamics of the immune system, in one single tool, can offer new perspectives...

  13. In 'big bang' major incidents do triage tools accurately predict clinical priority?: a systematic review of the literature.

    Science.gov (United States)

    Kilner, T M; Brace, S J; Cooke, M W; Stallard, N; Bleetman, A; Perkins, G D

    2011-05-01

    The term "big bang" major incident is used to describe sudden, usually traumatic, catastrophic events, involving relatively large numbers of injured individuals, where demands on clinical services rapidly outstrip the available resources. Triage tools support the pre-hospital provider in prioritising which patients to treat and/or transport first based upon clinical need. The aim of this review is to identify existing triage tools and to determine the extent to which their reliability and validity have been assessed. A systematic review of the literature was conducted to identify and evaluate published data validating the efficacy of the triage tools. Studies using data from trauma patients that report on the derivation, validation and/or reliability of specific pre-hospital triage tools were eligible for inclusion. Purely descriptive studies, reviews, exercises or reports (without supporting data) were excluded. The search yielded 1982 papers. After initial scrutiny of title and abstract, 181 papers were deemed potentially applicable and from these 11 were identified as relevant to this review (see the first figure). There were two level-of-evidence-one studies, three level-of-evidence-two studies and six level-of-evidence-three studies. The two level-of-evidence-one studies were prospective validations of clinical decision rules (CDRs) in children in South Africa; all the other studies were retrospective CDR derivation, validation or cohort studies. The quality of the papers was rated as good (n=3), fair (n=7) or poor (n=1). There is limited evidence for the validity of existing triage tools in big bang major incidents. Where evidence does exist it focuses on sensitivity and specificity in relation to prediction of trauma death or severity of injury, based on data from incidents involving single patients or small numbers of patients. The Sacco system is unique in combining survivability modelling with the degree by which the system is overwhelmed in the triage decision system. The

  14. Software tool for portal dosimetry research.

    Science.gov (United States)

    Vial, P; Hunt, P; Greer, P B; Oliver, L; Baldock, C

    2008-09-01

    This paper describes a software tool developed for research into the use of an electronic portal imaging device (EPID) to verify dose for intensity modulated radiation therapy (IMRT) beams. A portal dose image prediction (PDIP) model that predicts the EPID response to IMRT beams has been implemented into a commercially available treatment planning system (TPS). The software tool described in this work was developed to modify the TPS PDIP model by incorporating correction factors into the predicted EPID image to account for the difference in EPID response to open beam radiation and multileaf collimator (MLC) transmitted radiation. The processes performed by the software tool include: i) read the MLC file and the PDIP from the TPS, ii) calculate the fraction of beam-on time that each point in the IMRT beam is shielded by MLC leaves, iii) interpolate correction factors from look-up tables, iv) create a corrected PDIP image from the product of the original PDIP and the correction factors and write the corrected image to file, and v) display, analyse, and export various image datasets. The software tool was developed using the Microsoft Visual Studio.NET framework with the C# compiler, and its operation was validated. This software provided useful tools for EPID dosimetry research, and it is being utilised and further developed in ongoing EPID dosimetry and IMRT dosimetry projects.
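
    The interpolation and multiplication steps (iii and iv) might be sketched roughly as below. The look-up table values, array shapes, and function names here are invented for illustration and are not the tool's actual data or API; the real tool also reads MLC files and computes the shielded-time fraction per pixel.

```python
import numpy as np

# Hypothetical look-up table: correction factor as a function of the
# fraction of beam-on time a pixel is shielded by MLC leaves.
SHIELD_FRACTION_AXIS = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
CORRECTION_AXIS = np.array([1.00, 1.02, 1.05, 1.09, 1.14])

def correct_pdip(pdip: np.ndarray, shielded_fraction: np.ndarray) -> np.ndarray:
    """Interpolate a per-pixel correction factor from the look-up table
    and multiply it into the predicted EPID image."""
    factors = np.interp(shielded_fraction.ravel(),
                        SHIELD_FRACTION_AXIS,
                        CORRECTION_AXIS).reshape(shielded_fraction.shape)
    return pdip * factors

pdip = np.ones((4, 4))           # toy predicted image
shielded = np.full((4, 4), 0.5)  # every pixel shielded half the beam-on time
corrected = correct_pdip(pdip, shielded)
```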

  15. Feasibility of an Assessment Tool for Children's Competence to Consent to Predictive Genetic Testing: a Pilot Study.

    Science.gov (United States)

    Hein, Irma M; Troost, Pieter W; Lindeboom, Robert; Christiaans, Imke; Grisso, Thomas; van Goudoever, Johannes B; Lindauer, Ramón J L

    2015-12-01

    Knowledge of children's capacities to consent to medical treatment is limited. Also, age limits for asking children's consent vary considerably between countries. Decision-making on predictive genetic testing (PGT) is especially complicated, considering the ongoing ethical debate. In order to examine fair age limits for alleged competence to consent in children, we evaluated the feasibility of a standardized assessment tool and investigated cutoff ages for children's competence to consent to PGT. We performed a pilot study including 17 pediatric outpatients between 6 and 18 years at risk for an autosomal dominantly inherited cardiac disease, eligible for predictive genetic testing. The reference standard for competence was established by experts trained in the relevant criteria for competent decision-making. The MacArthur Competence Assessment Tool for Treatment (MacCAT-T) served as the index test. Data analysis included raw agreement between competence classifications, the difference in mean ages between children judged competent and judged incompetent, and estimation of cutoff ages for judgments of competence. Twelve (71 %) children were considered competent by the reference standard, and 16 (94 %) by the MacCAT-T, with an overall agreement of 76 %. The expert judgments disagreed in most cases, while the MacCAT-T judgments agreed in 65 %. Mean age of children judged incompetent was 9.3 years and of children judged competent 12.1 years (p = .035). With 90 % sensitivity, children younger than 10.0 years were judged incompetent; with 90 % specificity, children older than 11.8 years were judged competent. The feasibility of the MacCAT-T in children was confirmed. Initial findings on age cutoffs suggest that children between the ages of 12 and 18 can be judged competent to be involved in the informed consent process. Future research on appropriate age limits for children's alleged competence to consent is needed.

  16. Effect-based tools for monitoring and predicting the ecotoxicological effects of chemicals in the aquatic environment.

    Science.gov (United States)

    Connon, Richard E; Geist, Juergen; Werner, Inge

    2012-01-01

    Ecotoxicology faces the challenge of assessing and predicting the effects of an increasing number of chemical stressors on aquatic species and ecosystems. Herein we review currently applied tools in ecological risk assessment, combining information on exposure with expected biological effects or environmental water quality standards; currently applied effect-based tools are presented based on whether exposure occurs in a controlled laboratory environment or in the field. With increasing ecological relevance the reproducibility, specificity and thus suitability for standardisation of methods tends to diminish. We discuss the use of biomarkers in ecotoxicology including ecotoxicogenomics-based endpoints, which are becoming increasingly important for the detection of sublethal effects. Carefully selected sets of biomarkers allow an assessment of exposure to and effects of toxic chemicals, as well as the health status of organisms and, when combined with chemical analysis, identification of toxicant(s). The promising concept of "adverse outcome pathways (AOP)" links mechanistic responses on the cellular level with whole organism, population, community and potentially ecosystem effects and services. For most toxic mechanisms, however, practical application of AOPs will require more information and the identification of key links between responses, as well as key indicators, at different levels of biological organization, ecosystem functioning and ecosystem services.

  17. Effect-Based Tools for Monitoring and Predicting the Ecotoxicological Effects of Chemicals in the Aquatic Environment

    Directory of Open Access Journals (Sweden)

    Richard E. Connon

    2012-09-01

    Full Text Available Ecotoxicology faces the challenge of assessing and predicting the effects of an increasing number of chemical stressors on aquatic species and ecosystems. Herein we review currently applied tools in ecological risk assessment, combining information on exposure with expected biological effects or environmental water quality standards; currently applied effect-based tools are presented based on whether exposure occurs in a controlled laboratory environment or in the field. With increasing ecological relevance the reproducibility, specificity and thus suitability for standardisation of methods tends to diminish. We discuss the use of biomarkers in ecotoxicology including ecotoxicogenomics-based endpoints, which are becoming increasingly important for the detection of sublethal effects. Carefully selected sets of biomarkers allow an assessment of exposure to and effects of toxic chemicals, as well as the health status of organisms and, when combined with chemical analysis, identification of toxicant(s). The promising concept of “adverse outcome pathways” (AOP) links mechanistic responses on the cellular level with whole organism, population, community and potentially ecosystem effects and services. For most toxic mechanisms, however, practical application of AOPs will require more information and the identification of key links between responses, as well as key indicators, at different levels of biological organization, ecosystem functioning and ecosystem services.

  18. Results prediction in industrial processes. A knowledge-based control tool; La prediccion de resultados en procesos industriales

    Energy Technology Data Exchange (ETDEWEB)

    Zabala-Uriarte, A.; Suarez-Creo, R.; Izaga-Gmaguregi, J.

    2009-07-01

    The difficulties involved in most metallurgical processes are well known, especially when the number of factors at play is very high. The problems are even greater when we want to forecast process behaviour, because it is not easy to build a framework of links between the critical variables from the available information. This work takes advantage of the availability of several generic computer tools which, with suitable adaptation and endowed with specific knowledge, are capable of learning the process, connecting many facts, and forecasting product quality, while keeping the process under control. These tools manage the plant information, help to achieve a robust process, increase knowledge of it, and improve its performance with respect to the reject level in ppm. The development of this type of tool was considered utopian only a few years ago. The analytical method used is based on an initial selection of the defect to be studied and of the factors or characteristics governing the process. Afterwards, the most likely potential causes of the studied defect are described, ranked, and prioritised using probabilistic criteria, searching for the root causes of each. During the running of the industrial process, the computer program links the experimental measurements of the selected factors with the actual results, so that the system learns; at the same time, the less significant variables can be discarded, improving the reliability of the prediction. The conclusions are based on real applications, put into practice in different production lines, to validate the system and test its efficiency using the corresponding success index. (Author)

  19. Methods Developed by the Tools for Engine Diagnostics Task to Monitor and Predict Rotor Damage in Real Time

    Science.gov (United States)

    Baaklini, George Y.; Smith, Kevin; Raulerson, David; Gyekenyesi, Andrew L.; Sawicki, Jerzy T.; Brasche, Lisa

    2003-01-01

    Tools for Engine Diagnostics is a major task in the Propulsion System Health Management area of the Single Aircraft Accident Prevention project under NASA's Aviation Safety Program. The major goal of the Aviation Safety Program is to reduce fatal aircraft accidents by 80 percent within 10 years and by 90 percent within 25 years. The goal of the Propulsion System Health Management area is to eliminate propulsion system malfunctions as a primary or contributing factor to the cause of aircraft accidents. The purpose of Tools for Engine Diagnostics, a 2-yr-old task, is to establish and improve tools for engine diagnostics and prognostics that measure the deformation and damage of rotating engine components at the ground level and that perform intermittent or continuous monitoring on-wing. In this work, nondestructive-evaluation- (NDE-) based technology is combined with model-dependent disk spin experimental simulation systems, like finite element modeling (FEM) and modal norms, to monitor and predict rotor damage in real time. Fracture-mechanics-based time-dependent fatigue crack growth and damage-mechanics-based life estimation are being developed, and their potential use investigated. In addition, wireless eddy current and advanced acoustics are being developed for on-wing and just-in-time NDE engine inspection to provide deeper access and higher sensitivity, to extend on-wing capabilities, and to improve inspection readiness. In the long run, these methods could establish a base for prognostic sensing while an engine is running, without any overt actions, like inspections. This damage-detection strategy includes experimentally acquired vibration-, eddy-current- and capacitance-based displacement measurements and analytically computed FEM-, modal-norms-, and conventional-rotordynamics-based models of well-defined damages and critical mass imbalances in rotating disks and rotors.

  20. Detecting Human Hydrologic Alteration from Diversion Hydropower Requires Universal Flow Prediction Tools: A Proposed Framework for Flow Prediction in Poorly-gauged, Regulated Rivers

    Science.gov (United States)

    Kibler, K. M.; Alipour, M.

    2016-12-01

    Achieving the universal energy access Sustainable Development Goal will require great investment in renewable energy infrastructure in the developing world. Much growth in the renewable sector will come from new hydropower projects, including small and diversion hydropower in remote and mountainous regions. Yet, human impacts to hydrological systems from diversion hydropower are poorly described. Diversion hydropower is often implemented in ungauged rivers, thus detection of impact requires flow analysis tools suited to prediction in poorly-gauged and human-altered catchments. We conduct a comprehensive analysis of hydrologic alteration in 32 rivers developed with diversion hydropower in southwestern China. As flow data are sparse, we devise an approach for estimating streamflow during pre- and post-development periods, drawing upon a decade of research into prediction in ungauged basins. We apply a rainfall-runoff model, parameterized and forced exclusively with global-scale data, in hydrologically-similar gauged and ungauged catchments. Uncertain "soft" data are incorporated through fuzzy numbers and confidence-based weighting, and a multi-criteria objective function is applied to evaluate model performance. Testing indicates that the proposed framework returns superior performance (NSE = 0.77) as compared to models parameterized by rote calibration (NSE = 0.62). Confident that the models are providing "the right answer for the right reasons", our analysis of hydrologic alteration based on simulated flows indicates statistically significant hydrologic effects of diversion hydropower across many rivers. Mean annual flows, 7-day minimum and 7-day maximum flows decreased. Frequency and duration of flow exceeding Q25 decreased while duration of flows sustained below the Q75 increased substantially. Hydrograph rise and fall rates and flow constancy increased. The proposed methodology may be applied to improve diversion hydropower design in data-limited regions.
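
    The NSE scores used to compare the two parameterizations are straightforward to compute; a minimal sketch of the Nash-Sutcliffe efficiency follows, with illustrative numbers rather than the study's flow series.

```python
def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / total variance of observations.
    1.0 is a perfect fit; 0.0 means no better than the observed mean."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / ss_tot

obs = [2.0, 4.0, 6.0, 8.0]        # toy observed flows
print(nse(obs, obs))               # → 1.0 (perfect simulation)
print(nse(obs, [3.0, 3.0, 7.0, 7.0]))  # → 0.8
```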

  1. lncRScan-SVM: A Tool for Predicting Long Non-Coding RNAs Using Support Vector Machine.

    Science.gov (United States)

    Sun, Lei; Liu, Hui; Zhang, Lin; Meng, Jia

    2015-01-01

    Functional long non-coding RNAs (lncRNAs) have been bringing novel insight into biological study; however, it is still not trivial to accurately distinguish lncRNA transcripts (LNCTs) from protein-coding ones (PCTs). As a wide range of information and data about lncRNAs has been accumulated by previous studies, it is appealing to develop novel methods to identify lncRNAs more accurately. Our method lncRScan-SVM aims at classifying PCTs and LNCTs using a support vector machine (SVM). The gold-standard datasets for lncRScan-SVM model training, lncRNA prediction and method comparison were constructed according to the GENCODE gene annotations of human and mouse respectively. By integrating features derived from gene structure, transcript sequence, potential codon sequence and conservation, lncRScan-SVM outperforms other approaches, as evaluated by several criteria such as sensitivity, specificity, accuracy, Matthews correlation coefficient (MCC) and area under curve (AUC). In addition, several known human lncRNA datasets were assessed using lncRScan-SVM. LncRScan-SVM is an efficient tool for predicting lncRNAs, and it is quite useful for current lncRNA study.
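
    Several of the evaluation criteria named above reduce to arithmetic on a 2x2 confusion matrix. A sketch of MCC, sensitivity, and specificity follows; the counts are invented for illustration and are not lncRScan-SVM's actual results.

```python
import math

def mcc(tp, fp, fn, tn):
    """Matthews correlation coefficient from a 2x2 confusion matrix."""
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0

def sensitivity(tp, fn):
    return tp / (tp + fn)

def specificity(tn, fp):
    return tn / (tn + fp)

# Toy confusion matrix for LNCT-vs-PCT classification (counts are made up)
tp, fp, fn, tn = 90, 10, 15, 85
print(mcc(tp, fp, fn, tn), sensitivity(tp, fn), specificity(tn, fp))
```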

  2. Predicting recidivism among adult male child pornography offenders: Development of the Child Pornography Offender Risk Tool (CPORT).

    Science.gov (United States)

    Seto, Michael C; Eke, Angela W

    2015-08-01

    In this study, we developed a structured risk checklist, the Child Pornography Offender Risk Tool (CPORT), to predict any sexual recidivism among adult male offenders with a conviction for child pornography offenses. We identified predictors of sexual recidivism using a 5-year fixed follow-up analysis from a police case file sample of 266 adult male child pornography offenders in the community after their index offense. In our 5-year follow-up, 29% committed a new offense, and 11% committed a new sexual offense, with 3% committing a new contact sexual offense against a child and 9% committing a new child pornography offense. The CPORT items comprised younger offender age, any prior criminal history, any contact sexual offending, any failure on conditional release, indication of sexual interest in child pornography material or prepubescent or pubescent children, more boy than girl content in child pornography, and more boy than girl content in other child depictions. The CPORT was significantly associated with any sexual recidivism, with moderate predictive accuracy, and thus has promise in the risk assessment of adult male child pornography offenders with further cross-validation.
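
    Structured checklists of this kind are typically scored by summing binary items. The sketch below works under that assumption; the item names are paraphrases of the seven items listed above, not the official CPORT wording or scoring rules.

```python
# Hypothetical item keys paraphrasing the seven CPORT predictors,
# each assumed to be scored 0 or 1 and summed.
CPORT_ITEMS = [
    "younger_offender_age",
    "any_prior_criminal_history",
    "any_contact_sexual_offending",
    "any_failure_on_conditional_release",
    "indicated_sexual_interest",
    "more_boy_than_girl_cp_content",
    "more_boy_than_girl_other_depictions",
]

def cport_score(case: dict) -> int:
    """Sum the binary checklist items; missing items count as 0."""
    return sum(int(bool(case.get(item, 0))) for item in CPORT_ITEMS)

case = {"any_prior_criminal_history": 1, "more_boy_than_girl_cp_content": 1}
print(cport_score(case))  # → 2
```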

  3. The nematode Caenorhabditis elegans as a tool to predict chemical activity on mammalian development and identify mechanisms influencing toxicological outcome.

    Science.gov (United States)

    Harlow, Philippa H; Perry, Simon J; Widdison, Stephanie; Daniels, Shannon; Bondo, Eddie; Lamberth, Clemens; Currie, Richard A; Flemming, Anthony J

    2016-03-18

    To determine whether a C. elegans bioassay could predict mammalian developmental activity, we selected diverse compounds known and known not to elicit such activity and measured their effect on C. elegans egg viability. 89% of compounds that reduced C. elegans egg viability also had mammalian developmental activity. Conversely only 25% of compounds found not to reduce egg viability in C. elegans were also inactive in mammals. We conclude that the C. elegans egg viability assay is an accurate positive predictor, but an inaccurate negative predictor, of mammalian developmental activity. We then evaluated C. elegans as a tool to identify mechanisms affecting toxicological outcomes among related compounds. The difference in developmental activity of structurally related fungicides in C. elegans correlated with their rate of metabolism. Knockdown of the cytochrome P450s cyp-35A3 and cyp-35A4 increased the toxicity to C. elegans of the least developmentally active compounds to the level of the most developmentally active. This indicated that these P450s were involved in the greater rate of metabolism of the less toxic of these compounds. We conclude that C. elegans based approaches can predict mammalian developmental activity and can yield plausible hypotheses for factors affecting the biological potency of compounds in mammals.
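
    The two quoted percentages are, in effect, the positive and negative predictive values of the egg-viability assay. A sketch follows, using toy counts scaled to match the reported rates rather than the study's actual compound numbers.

```python
def predictive_values(tp, fp, fn, tn):
    """PPV = P(condition | positive test); NPV = P(no condition | negative test)."""
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return ppv, npv

# Illustrative counts scaled to the reported rates: of 100 compounds that
# reduced egg viability, 89 had mammalian developmental activity (PPV = 0.89);
# of 100 that did not, only 25 were inactive in mammals (NPV = 0.25).
ppv, npv = predictive_values(tp=89, fp=11, fn=75, tn=25)
print(ppv, npv)  # → 0.89 0.25
```

The asymmetry between the two values is exactly the paper's conclusion: a positive result is informative, a negative one much less so.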

  4. Using Bayesian Network as a tool for coastal storm flood impact prediction at Varna Bay (Bulgaria, Western Black Sea)

    Science.gov (United States)

    Valchev, Nikolay; Eftimova, Petya; Andreeva, Nataliya; Prodanov, Bogdan

    2017-04-01

    The coastal zone is among the fastest evolving areas worldwide. The ever-increasing population inhabiting coastal settlements develops often-conflicting economic and societal activities. The existing imbalance between the expansion of these activities, on one hand, and the potential to accommodate them in a sustainable manner, on the other, becomes a critical problem. Concurrently, coasts are affected by various hydro-meteorological phenomena such as storm surges, heavy seas, strong winds and flash floods, whose intensities and occurrence frequencies are likely to increase due to climate change. This calls for tools capable of quickly predicting the impact of those phenomena on the coast and providing solutions in terms of disaster risk reduction measures. One such tool is the Bayesian network (BN). This paper describes the set-up of such a network for Varna Bay (Bulgaria, Western Black Sea). It relates near-shore storm conditions to their onshore flood potential and ultimately to the relevant impact, expressed as relative damage to the coastal and man-made environment. The methodology for set-up and training of the Bayesian network was developed within the RISC-KIT project (Resilience-Increasing Strategies for Coasts - toolKIT). The proposed BN reflects the interaction between boundary conditions, receptors, hazard, and consequences. Storm boundary conditions - maximum significant wave height and peak surge level - were determined on the basis of their historical and projected occurrence. The only hazard considered in this study is flooding, characterized by maximum inundation depth. The BN was trained with synthetic events created by combining estimated boundary conditions. Flood impact was modeled with the process-based morphodynamical model XBeach. Restaurants, sport and leisure facilities, administrative buildings, and car parks were introduced in the network as receptors. Consequences (impact) are estimated in terms of the relative damage caused by a given inundation depth. National depth
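
    At its core, a discrete Bayesian network of this kind chains conditional probability tables from boundary conditions through hazard to impact. A minimal sketch by enumeration follows; all node names and probabilities are invented for illustration and bear no relation to the Varna Bay network's actual tables.

```python
from itertools import product

# Toy conditional probability tables (all probabilities are made up)
P_SURGE = {"high": 0.2, "low": 0.8}
P_DEPTH_GIVEN_SURGE = {           # P(inundation depth | surge level)
    "high": {"deep": 0.7, "shallow": 0.3},
    "low":  {"deep": 0.1, "shallow": 0.9},
}
P_DAMAGE_GIVEN_DEPTH = {          # P(relative damage | inundation depth)
    "deep":    {"severe": 0.6, "minor": 0.4},
    "shallow": {"severe": 0.1, "minor": 0.9},
}

def p_damage(damage: str) -> float:
    """Marginal P(damage) by enumerating over the upstream nodes."""
    total = 0.0
    for surge, depth in product(P_SURGE, ("deep", "shallow")):
        total += (P_SURGE[surge]
                  * P_DEPTH_GIVEN_SURGE[surge][depth]
                  * P_DAMAGE_GIVEN_DEPTH[depth][damage])
    return total

print(p_damage("severe"))  # → 0.21
```

Training such a network from synthetic XBeach scenarios amounts to estimating these tables from simulated (surge, depth, damage) tuples.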

  5. Computational immunology meets bioinformatics: the use of prediction tools for molecular binding in the simulation of the immune system.

    Directory of Open Access Journals (Sweden)

    Nicolas Rapin

    Full Text Available We present a new approach to the study of the immune system that combines techniques of systems biology with information provided by data-driven prediction methods. To this end, we have extended an agent-based simulator of the immune response, C-ImmSim, such that it represents pathogens, as well as lymphocyte receptors, by means of their amino acid sequences and makes use of bioinformatics methods for T and B cell epitope prediction. This is a key step for the simulation of the immune response, because it determines immunogenicity. The binding of the epitope, which is the immunogenic part of an invading pathogen, together with activation and cooperation from T helper cells, is required to trigger an immune response in the affected host. To determine a pathogen's epitopes, we use existing prediction methods. In addition, we propose a novel method, which uses Miyazawa and Jernigan protein-protein potential measurements, for assessing molecular binding in the context of immune complexes. We benchmark the resulting model by simulating a classical immunization experiment that reproduces the development of immune memory. We also investigate the role of major histocompatibility complex (MHC) haplotype heterozygosity and homozygosity with respect to the influenza virus and show that there is an advantage to heterozygosity. Finally, we investigate the emergence of one or more dominating clones of lymphocytes in the situation of chronic exposure to the same immunogenic molecule and show that high affinity clones proliferate more than any other. These results show that the simulator produces dynamics that are stable and consistent with basic immunological knowledge. We believe that the combination of genomic information and simulation of the dynamics of the immune system, in one single tool, can offer new perspectives for a better understanding of the immune system.

  6. Substrate utilisation by plant-cell cultures

    Energy Technology Data Exchange (ETDEWEB)

    Fowler, M W

    1982-01-01

    Plant cell cultures have been grown on a wide range of carbon sources in addition to the traditional ones of sucrose and glucose. Biomass yields and growth rates vary greatly between the different carbon sources and there is a variation in response between different cell cultures to individual carbon sources. Some attempts have been made to grow cell cultures on 'waste' and related carbon sources, such as lactose, maltose, starch, molasses and milk whey. Only maltose was found to support growth to anything near the levels observed with glucose and sucrose. With molasses as the carbon source, cell growth was either non-existent or only just measurable. All the data point to glucose as being the most suitable carbon source, principally on the grounds of biomass yield and growth rate. It should be noted, however, that other carbon sources do appear to have a major (positive) influence on natural product synthesis. Uptake into the cell is an important aspect of carbohydrate utilisation. There is strong evidence that from disaccharides upwards, major degradation to smaller units occurs before uptake. In some cases the necessary enzymes appear to be excreted into the culture broth, in others they may be located within the cell wall; invertase, which hydrolyses sucrose, is a good example. Once the products of carbohydrate degradation and mobilisation enter the cell they may suffer one of two fates: oxidation or utilisation for biosynthesis. The precise split between these two varies depending on such factors as cell growth rate, cell size, nutrient broth composition and carbohydrate status of the cells. In general, rapidly growing cells have a high rate of oxidation, whereas cells growing more slowly tend to be more directed towards biosynthesis. Carbohydrate utilisation is a key area of study, underpinning as it does both biomass yield and natural product synthesis. (Refs. 13).

  7. Mutagenesis Objective Search and Selection Tool (MOSST): an algorithm to predict structure-function related mutations in proteins

    Directory of Open Access Journals (Sweden)

    Asenjo Juan A

    2011-04-01

    's primary sequence into a group of functionally non-disruptive amino acids and a second group of functionally deleterious amino acids. Conclusions With this approach, not only conserved amino acid positions in a protein family can be labeled as functionally relevant, but also non-conserved amino acid positions can be identified to have a physicochemically meaningful functional effect. These results become a discriminative tool in the selection and elaboration of rational mutagenesis strategies for the protein. They can also be used to predict if a given nsSNP, identified, for instance, in a genomic-scale analysis, can have a functional implication for a particular protein and which nsSNPs are most likely to be functionally silent for a protein. This analytical tool could be used to rapidly and automatically discard any irrelevant nsSNP and guide the research focus toward functionally significant mutations. Based on preliminary results and applications, this technique shows promising performance as a valuable bioinformatics tool to aid in the development of new protein variants and in the understanding of function-structure relationships in proteins.

  8. Utilising UDT to push the bandwidth envelope

    Science.gov (United States)

    Garrett, B.; Davies, B.

    eScience applications, in particular High Energy Physics, often involve large amounts of data and/or computing and often require secure resource sharing across organizational boundaries, and are thus not easily handled by today's networking infrastructures. By utilising the switched lightpath connections provided by the UKLight network, it has been possible to research the use of alternate protocols for data transport. While the HEP projects make use of a number of middleware solutions for data storage and transport, they all rely on GridFTP for WAN transport. The GridFTP protocol runs over TCP as its transport protocol by default; however, with the latest release of the Globus Toolkit it is possible to substitute alternate transport protocols. One of the alternatives is UDT, a reliable transport protocol built on UDP. This report presents the results of tests measuring the performance of single-threaded file transfers using GridFTP running over both TCP and UDT.

  9. Energy analysis of various grassland utilisation systems

    Directory of Open Access Journals (Sweden)

    Jozef Ržonca

    2005-01-01

    In 2003 and 2004, an energy analysis of different types of permanent grassland utilisation was carried out at the Hrubý Jeseník locality, estimating the values of the individual inputs of additional energy. Energy inputs ranged, according to the pratotechnology used, from 2.17 GJ.ha–1 to 22.70 GJ.ha–1, with fertilisers accounting for the largest share (84.93% under nitrogen fertilisation). The greatest yield of gross and net energy was obtained with low-intensity utilisation (33.40 GJ.ha–1 NEL and 32.40 GJ.ha–1 NEV on average). The highest energy efficiency (13.23%) was likewise found for low-intensity utilisation of permanent grassland, and energy efficiency decreased as higher doses of industrial fertilisers were applied. Considering both the energy yield and the intensity of energy inputs, the most suitable regimes appear to be utilisation of permanent grassland with three cuts per year (first cut by May 31st at the latest, each subsequent cut after 60 days) or two cuts per year (first cut on July 15th, next cut after 90 days).

  10. Which neuromuscular or cognitive test is the optimal screening tool to predict falls in frail community-dwelling older people?

    Science.gov (United States)

    Shimada, Hiroyuki; Suzukawa, Megumi; Tiedemann, Anne; Kobayashi, Kumiko; Yoshida, Hideyo; Suzuki, Takao

    2009-01-01

    The use of falls risk screening tools may aid in targeting fall prevention interventions in older individuals most likely to benefit. To determine the optimal physical or cognitive test to screen for falls risk in frail older people. This prospective cohort study involved recruitment from 213 day-care centers in Japan. The feasibility study included 3,340 ambulatory individuals aged 65 years or older enrolled in the Tsukui Ordered Useful Care for Health (TOUCH) program. The external validation study included a subsample of 455 individuals who completed all tests. Physical tests included grip strength (GS), chair stand test (CST), one-leg standing test (OLS), functional reach test (FRT), tandem walking test (TWT), 6-meter walking speed at a comfortable pace (CWS) and at maximum pace (MWS), and timed up-and-go test (TUG). The mental status questionnaire (MSQ) was used to measure cognitive function. The incidence of falls during 1 year was investigated by self-report or an interview with the participant's family and care staff. The most practicable tests were the GS and MSQ, which could be administered to more than 90% of the participants regardless of the activities of daily living status. The FRT and TWT had lower feasibility than other lower limb function tests. During the 1-year retrospective analysis of falls, 99 (21.8%) of the 455 validation study participants had fallen at least once. Fallers showed significantly poorer performance than non-fallers in the OLS (p = 0.003), TWT (p = 0.001), CWS (p = 0.013), MWS (p = 0.007), and TUG (p = 0.011). The OLS, CWS, and MWS remained significantly associated with falls when performance cut-points were determined. Logistic regression analysis revealed that the TWT was a significant and independent, yet weak predictor of falls. A weighting system which considered feasibility and validity scored the CWS (at a cut-point of 0.7 m/s) as the best test to predict risk of falls. Clinical tests of neuromuscular function can predict

  11. Dementia Population Risk Tool (DemPoRT): study protocol for a predictive algorithm assessing dementia risk in the community.

    Science.gov (United States)

    Fisher, Stacey; Hsu, Amy; Mojaverian, Nassim; Taljaard, Monica; Huyer, Gregory; Manuel, Douglas G; Tanuseputro, Peter

    2017-10-24

    The burden of disease from dementia is a growing global concern as incidence increases dramatically with age, and average life expectancy has been increasing around the world. Planning for an ageing population requires reliable projections of dementia prevalence; however, existing population projections are simple and have poor predictive accuracy. The Dementia Population Risk Tool (DemPoRT) will predict incidence of dementia in the population setting using multivariable modelling techniques and will be used to project dementia prevalence. The derivation cohort will consist of elderly Ontario respondents of the Canadian Community Health Survey (CCHS) (2001, 2003, 2005 and 2007; 18 764 males and 25 288 females). Prespecified predictors include sociodemographic, general health, behavioural, functional and health condition variables. Incident dementia will be identified through individual linkage of survey respondents to population-level administrative healthcare databases (1797 and 3281 events, and 117 795 and 166 573 person-years of follow-up, for males and females, respectively, until 31 March 2014). Using time of first dementia capture as the primary outcome and death as a competing risk, sex-specific proportional hazards regression models will be estimated. The 2008/2009 CCHS survey will be used for validation (approximately 4600 males and 6300 females). Overall calibration and discrimination will be assessed as well as calibration within predefined subgroups of importance to clinicians and policy makers. Research ethics approval has been granted by the Ottawa Health Science Network Research Ethics Board. DemPoRT results will be submitted for publication in peer-review journals and presented at scientific meetings. The algorithm will be accessible online for both population and individual uses. ClinicalTrials.gov NCT03155815, pre-results.

  12. Simulation for Prediction of Entry Article Demise (SPEAD): An Analysis Tool for Spacecraft Safety Analysis and Ascent/Reentry Risk Assessment

    Science.gov (United States)

    Ling, Lisa

    2014-01-01

    For the purpose of performing safety analysis and risk assessment for a potential off-nominal atmospheric reentry resulting in vehicle breakup, a synthesis of trajectory propagation coupled with thermal analysis and the evaluation of node failure is required to predict the sequence of events, the timeline, and the progressive demise of spacecraft components. To provide this capability, the Simulation for Prediction of Entry Article Demise (SPEAD) analysis tool was developed. The software and methodology have been validated against actual flights, telemetry data, and validated software, and safety/risk analyses were performed for various programs using SPEAD. This report discusses the capabilities, modeling, validation, and application of the SPEAD analysis tool.

  13. The modified high-density survival assay is a useful tool to predict the effectiveness of fractionated radiation exposure

    International Nuclear Information System (INIS)

    Kuwahara, Yoshikazu; Mori, Miyuki; Oikawa, Toshiyuki; Shimura, Tsutomu; Fukumoto, Manabu; Ohtake, Yosuke; Ohkubo, Yasuhito; Mori, Shiro

    2010-01-01

    The high-density survival (HDS) assay was originally developed to assess cancer cell responses to therapeutic agents under the influence of intercellular communication. Here, we simplified the original HDS assay and studied its applicability for the detection of cellular radioresistance. We have recently defined clinically relevant radioresistant (CRR) cells, which continue to proliferate with daily exposure to 2 gray (Gy) of X-rays for more than 30 days in vitro. We established the human CRR cell lines HepG2-8960-R from HepG2, and SAS-R1 and SAS-R2 from SAS. In an attempt to apply the HDS assay to detect radioresistance with clinical relevance, we simplified the original HDS assay by scoring the total number of surviving cells after exposure to X-rays. The modified HDS assay successfully detected radioresistance with clinical relevance, including the CRR phenotype, which is not always detectable by clonogenic assay. Therefore, we believe that the modified HDS assay presented in this study is a powerful tool to predict the effectiveness of fractionated radiotherapy against malignant tumors. (author)

  14. Spatiotemporal floodplain mapping and prediction using HEC-RAS - GIS tools: Case of the Mejerda river, Tunisia

    Science.gov (United States)

    Ben Khalfallah, C.; Saidi, S.

    2018-06-01

    Floods have become increasingly frequent and devastating in recent years (e.g., the floods of 2003, 2006, 2009, 2011 and 2012), and Tunisia does not escape these problems. Flood management requires, above all, a better knowledge of the phenomenon and the use of predictive methods. In order to limit this risk, we undertook hydrodynamic modelling of the Medjerda basin. To this end, the rainfall distribution was studied and mapped using GIS tools, and flood discharges and rainfall return periods were estimated with Hyfran. Simulations of recent floods, including the most recent event of February-March 2015 in the Medjerda basin, were computed and mapped using HEC-RAS and HEC-GeoRAS. The analysis shows a good correlation between simulated and measured parameters: a river flood exceeding 240 m3/s (DGRE, 2015) was reproduced, and larger flow sections are observed in the simulations for return periods of 10, 20 and 50 years.
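
    The return-period estimation step (performed with Hyfran in the study) typically amounts to fitting an extreme-value distribution to annual maximum flows. A minimal sketch using a Gumbel (EV1) fit by the method of moments; the discharge series below is invented for illustration:

```python
import math

def gumbel_return_level(annual_maxima, T):
    """Estimate the T-year flood discharge by fitting a Gumbel (EV1)
    distribution to annual maximum flows via the method of moments."""
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    var = sum((x - mean) ** 2 for x in annual_maxima) / (n - 1)
    alpha = math.sqrt(6) * math.sqrt(var) / math.pi   # scale parameter
    u = mean - 0.5772 * alpha                         # location (Euler-Mascheroni)
    # T-year return level: u - alpha * ln(-ln(1 - 1/T))
    return u - alpha * math.log(-math.log(1 - 1.0 / T))

flows = [120, 95, 240, 160, 130, 210, 180, 100, 150, 175]  # m3/s, illustrative
q10 = gumbel_return_level(flows, 10)
```

    Dedicated frequency-analysis software additionally compares candidate distributions and reports confidence intervals, which this sketch omits.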

  15. Use of pH as a tool to predict salinity of groundwater for irrigation purposes using an artificial neural network

    Directory of Open Access Journals (Sweden)

    Mahmoud Nasr

    2014-01-01

    Monitoring of groundwater quality is one of the important tools to provide adequate information about water management. In the present study, an artificial neural network (ANN) with feed-forward back-propagation was designed to predict groundwater salinity, expressed by total dissolved solids (TDS), using pH as an input parameter. Groundwater samples were collected from a 36 m depth well located in the experimental farm of the City of Scientific Research and Technological Applications (SRTA-City), New Borg El-Arab City, Alexandria, Egypt. The network structure was 1–5–3–1 and used the default Levenberg–Marquardt algorithm for training. It was observed that the best validation performance, based on the mean square error, was 14819 at epoch 0, and no major problems or over-fitting occurred with the training step. The simulated output tracked the measured data with correlation coefficients (R-values) of 0.64, 0.67 and 0.90 for training, validation and test, respectively. In this case, the network response was acceptable, and the simulation could be used for entering new inputs.
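
    The 1–5–3–1 feed-forward structure described above can be sketched as a plain forward pass. The weights here are random and untrained (the study trained with Levenberg–Marquardt), so this only illustrates the architecture, not the reported accuracy:

```python
import math
import random

random.seed(42)

# Layer sizes from the abstract's 1-5-3-1 architecture: one input (pH),
# two hidden layers (5 and 3 neurons, tanh), one linear output (TDS estimate).
SIZES = [1, 5, 3, 1]

# Random initial weights and biases; a trained model would replace these.
weights = [
    [[random.uniform(-1, 1) for _ in range(SIZES[i])] for _ in range(SIZES[i + 1])]
    for i in range(len(SIZES) - 1)
]
biases = [[random.uniform(-1, 1) for _ in range(SIZES[i + 1])]
          for i in range(len(SIZES) - 1)]

def forward(x):
    """Propagate a single pH value through the 1-5-3-1 network."""
    activations = [x]
    for layer, (w, b) in enumerate(zip(weights, biases)):
        z = [sum(wi * ai for wi, ai in zip(row, activations)) + bi
             for row, bi in zip(w, b)]
        # tanh on hidden layers, linear on the output layer
        activations = z if layer == len(weights) - 1 else [math.tanh(v) for v in z]
    return activations[0]

tds_estimate = forward(7.2)  # untrained output: structure demo only
```

    In practice one would normalise inputs and fit the weights to measured pH/TDS pairs before using the output.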

  16. The Glasgow Prognostic Score: a useful tool to predict survival in patients with advanced esophageal squamous cell carcinoma.

    Science.gov (United States)

    Henry, Maria Aparecida Coelho de Arruda; Lerco, Mauro Masson; de Oliveira, Walmar Kerche; Guerra, Anderson Roberto; Rodrigues, Maria Aparecida Marchesan

    2015-08-01

    To evaluate the usefulness of the Glasgow Prognostic Score (GPS) in patients with esophageal carcinoma (EC). A total of 50 patients with EC were analyzed for GPS, nutritional and clinicopathologic parameters. Patients with CRP ≤ 1.0mg/L and albumin ≥ 3.5mg/L were considered as GPS = 0. Patients with only CRP increased or albumin decreased were classified as GPS = 1, and patients with CRP > 1.0mg/L and albumin < 3.5mg/L were considered as GPS = 2. GPS of 0, 1 and 2 were observed in seven, 23 and 20 patients, respectively. A significant inverse relationship was observed between GPS scores and the survival rate. The survival rate was greatest in patients with GPS = 0 and significantly higher than in patients with GPS = 1 and GPS = 2. A minimum 12-month survival was observed in 71% of patients with GPS = 0 and in 30% of patients with GPS = 1. None of the patients with GPS = 2 survived for 12 months. A significant relationship between CRP or albumin individually and the survival rate was observed. No significant relationship among nutritional, clinicopathologic parameters and survival was found. The Glasgow Prognostic Score is a useful tool to predict survival in patients with esophageal carcinoma.
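
    The GPS assignment rule reduces to counting abnormal markers. A minimal sketch using the thresholds reported in the abstract (CRP ≤ 1.0 mg/L normal, albumin ≥ 3.5 normal, in the units as reported there):

```python
def glasgow_prognostic_score(crp: float, albumin: float) -> int:
    """Glasgow Prognostic Score per the study's definitions:
    both markers normal -> 0; one abnormal -> 1; both abnormal -> 2."""
    crp_high = crp > 1.0        # abnormal if CRP exceeds 1.0 mg/L
    albumin_low = albumin < 3.5  # abnormal if albumin is below 3.5
    return int(crp_high) + int(albumin_low)

score = glasgow_prognostic_score(2.4, 3.1)  # both abnormal -> GPS = 2
```

    Counting the two boolean flags reproduces exactly the three categories defined in the study.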

  17. 2B-Alert Web: An Open-Access Tool for Predicting the Effects of Sleep/Wake Schedules and Caffeine Consumption on Neurobehavioral Performance.

    Science.gov (United States)

    Reifman, Jaques; Kumar, Kamal; Wesensten, Nancy J; Tountas, Nikolaos A; Balkin, Thomas J; Ramakrishnan, Sridhar

    2016-12-01

    Computational tools that predict the effects of daily sleep/wake amounts on neurobehavioral performance are critical components of fatigue management systems, allowing for the identification of periods during which individuals are at increased risk for performance errors. However, none of the existing computational tools is publicly available, and the commercially available tools do not account for the beneficial effects of caffeine on performance, limiting their practical utility. Here, we introduce 2B-Alert Web, an open-access tool for predicting neurobehavioral performance, which accounts for the effects of sleep/wake schedules, time of day, and caffeine consumption, while incorporating the latest scientific findings in sleep restriction, sleep extension, and recovery sleep. We combined our validated Unified Model of Performance and our validated caffeine model to form a single, integrated modeling framework instantiated as a Web-enabled tool. 2B-Alert Web allows users to input daily sleep/wake schedules and caffeine consumption (dosage and time) to obtain group-average predictions of neurobehavioral performance based on psychomotor vigilance tasks. 2B-Alert Web is accessible at: https://2b-alert-web.bhsai.org. The 2B-Alert Web tool allows users to obtain predictions for mean response time, mean reciprocal response time, and number of lapses. The graphing tool allows for simultaneous display of up to seven different sleep/wake and caffeine schedules. The schedules and corresponding predicted outputs can be saved as a Microsoft Excel file; the corresponding plots can be saved as an image file. The schedules and predictions are erased when the user logs off, thereby maintaining privacy and confidentiality. The publicly accessible 2B-Alert Web tool is available for operators, schedulers, and neurobehavioral scientists as well as the general public to determine the impact of any given sleep/wake schedule, caffeine consumption, and time of day on performance of a

  18. Climate impact from peat utilisation in Sweden

    Energy Technology Data Exchange (ETDEWEB)

    Uppenberg, S.; Zetterberg, L.; Aahman, M.

    2001-08-01

    The climate impact from the use of peat for energy production in Sweden has been evaluated in terms of contribution to atmospheric radiative forcing. This was done by attempting to answer the question 'What will be the climate impact of using 1 m² of mire for peat extraction during 20 years?'. Two different methods of after-treatment were studied: afforestation and restoration of wetland. The climate impact from a peatland - wetland energy scenario and a peatland - forestry energy scenario was compared to the climate impact from coal, natural gas and forest residues. Sensitivity analyses were performed to evaluate which parameters are important to take into consideration in order to minimize the climate impact from peat utilisation.

  19. Design of neutron detectors utilising luminescent glass

    International Nuclear Information System (INIS)

    Spowart, A.R.

    1983-01-01

    Impetus for the development of new neutron detector designs has derived from the worldwide commissioning of neutron spallation sources. The design concepts, and principal methods of utilisation of these major installations, have been recently reviewed. Their principal feature of interest is their broadband neutron emission, allowing neutron investigations of all types of structure in materials, from biological molecules to steels. Conventional neutron detectors are gas-filled devices, based on BF₃ or ³He gas. Their major advantage is their intrinsically low background count. Their principal disadvantages are their slow response time (10-100 μs), high cost and relative lack of design flexibility to cope with large-area or complex-geometry detection. They are, however, long established, and the research facilities around the world have a heavy investment in the interpretative hardware for gas detectors.

  20. Waste and dust utilisation in shaft furnaces

    Energy Technology Data Exchange (ETDEWEB)

    Senk, D.; Babich, A.; Gudenau, H.W. [Rhein Westfal TH Aachen, Aachen (Germany)

    2005-07-01

    Wastes and dusts from the steel industry, non-ferrous metallurgy and other branches can be utilised e.g. in agglomeration processes (sintering, pelletising or briquetting) and by injection into shaft furnaces. This paper deals with the second route. The combustion and reduction behaviour of iron- and carbon-rich metallurgical dusts and sludges containing lead, zinc and alkali, as well as other wastes with and without pulverised coal (PC), has been studied when injecting into shaft furnaces. The following shaft furnaces have been examined: blast furnace, cupola furnace, OxiCup furnace and imperial-smelting furnace. Investigations have been done at laboratory and industrial scale. Under certain conditions, some dusts and wastes can not only be reused but can also improve combustion efficiency at the tuyeres as well as furnace performance and productivity.

  1. Climate impact from peat utilisation in Sweden

    International Nuclear Information System (INIS)

    Uppenberg, S.; Zetterberg, L.; Aahman, M.

    2001-08-01

    The climate impact from the use of peat for energy production in Sweden has been evaluated in terms of contribution to atmospheric radiative forcing. This was done by attempting to answer the question 'What will be the climate impact of using 1 m² of mire for peat extraction during 20 years?'. Two different methods of after-treatment were studied: afforestation and restoration of wetland. The climate impact from a peatland - wetland energy scenario and a peatland - forestry energy scenario was compared to the climate impact from coal, natural gas and forest residues. Sensitivity analyses were performed to evaluate which parameters are important to take into consideration in order to minimize the climate impact from peat utilisation.

  2. Optimum utilisation of the uranium resource

    International Nuclear Information System (INIS)

    Ion, S. E.; Wilson, P.D.

    1998-01-01

    The nuclear industry faces many challenges, notably to maximise safety, secure an adequate energy supply, manage wastes satisfactorily and achieve political acceptability. One way forward is to optimise together the various interdependent stages of the fuel cycle - the now familiar 'holistic approach'. Many of the issues will demand large R and D expenditure, most effectively met through international collaboration. Sustainable development requires optimum utilisation of energy potential, to which the most accessible key is recycling uranium and the plutonium bred from it. Realising anything like this full potential requires fast-neutron reactors, and therefore BNFL continues to sustain the UK involvement in their international development. Meanwhile, current R and D programmes must aim to make the nuclear option more competitive against fossil resources, while maintaining and developing the necessary skills for more advanced technologies. The paper outlines the strategies being pursued and highlights BNFL's programmes. (author)

  3. Increased health care utilisation in international adoptees

    DEFF Research Database (Denmark)

    Graff, Heidi Jeannet; Siersma, Volkert Dirk; Kragstrup, Jakob

    2015-01-01

    Introduction: Several studies have documented that international adoptees have an increased occurrence of health problems and contacts to the health-care system after arriving to their new country of residence. This may be explained by pre-adoption adversities, especially for the period immediately after adoption. Our study aimed to assess the health-care utilisation of international adoptees in primary and secondary care for somatic and psychiatric diagnoses in a late post-adoption period. Is there an increased use of the health-care system in this period, even when increased morbidity in the group ... The study population comprised internationally adopted children (n = 6,820), adopted between 1994 and 2005, and all non-adopted children (n = 492,374) who could be matched with the adopted children on sex, age, municipality and family constellation at the time of adoption. Results: International adoption increased the use ...

  4. Accurate approximation method for prediction of class I MHC affinities for peptides of length 8, 10 and 11 using prediction tools trained on 9mers

    DEFF Research Database (Denmark)

    Lundegaard, Claus; Lund, Ole; Nielsen, Morten

    2008-01-01

    Several accurate prediction systems have been developed for prediction of class I major histocompatibility complex (MHC):peptide binding. Most of these are trained on binding affinity data of primarily 9mer peptides. Here, we show how prediction methods trained on 9mer data can be used for accurate...

  5. The Utilisation of Facebook for Knowledge Sharing in Selected Local Government Councils in Delta State, Nigeria

    OpenAIRE

    Uzoma Heman Ononye; Anthony Igwe

    2017-01-01

    Aim/Purpose: Facebook has made it possible for organisations to embrace social and network-centric knowledge processes by creating opportunities to connect, interact, and collaborate with stakeholders. We have witnessed a significant increase in the popularity and use of this tool in many organisations, especially in the private sector. But the utilisation of Facebook in public organisations is in its infancy, with many also believing that the use of Facebook is not a common practice in many p...

  6. Environmental assessment of incinerator residue utilisation.

    Science.gov (United States)

    Toller, S; Kärrman, E; Gustafsson, J P; Magnusson, Y

    2009-07-01

    Incineration ashes may be treated either as a waste to be dumped in landfill, or as a resource that is suitable for re-use. In order to choose the best management scenario, knowledge is needed on the potential environmental impact that may be expected, including not only local, but also regional and global impact. In this study, a life cycle assessment (LCA) based approach was outlined for environmental assessment of incinerator residue utilisation, in which leaching of trace elements, as well as other emissions to air and water and the use of resources, was regarded as constituting the potential environmental impact from the system studied. Case studies were performed for two selected ash types: bottom ash from municipal solid waste incineration (MSWI) and wood fly ash. The MSWI bottom ash was assumed to be suitable for road construction or as drainage material in landfill, whereas the wood fly ash was assumed to be suitable for road construction or as a nutrient resource to be recycled on forest land after biofuel harvesting. Different types of potential environmental impact predominated in different activities of the system; the use of natural resources and trace element leaching were identified as relatively important for the scenarios compared. The scenarios differed in their use of resources and energy, whereas there is a potential for trace element leaching regardless of how the material is managed. Utilising MSWI bottom ash in road construction and recycling wood ash on forest land saved more natural resources and energy than managing these materials according to the other scenarios investigated, including dumping in landfill.

  7. Fuels planning: science synthesis and integration; environmental consequences fact sheet 12: Water Erosion Prediction Project (WEPP) Fuel Management (FuMe) tool

    Science.gov (United States)

    William Elliot; David Hall

    2005-01-01

    The Water Erosion Prediction Project (WEPP) Fuel Management (FuMe) tool was developed to estimate sediment generated by fuel management activities. WEPP FuMe estimates sediment generated for 12 fuel-related conditions from a single input. This fact sheet identifies the intended users and uses, required inputs, what the model does, and tells the user how to obtain the...

  8. Eco-morphological Real-time Forecasting tool to predict hydrodynamic, sediment and nutrient dynamic in Coastal Louisiana

    Science.gov (United States)

    Messina, F.; Meselhe, E. A.; Buckman, L.; Twight, D.

    2017-12-01

    The Louisiana coastal zone is one of the most productive and dynamic eco-geomorphic systems in the world. This unique natural environment has been altered by human activities and natural processes such as sea level rise, subsidence, the dredging of canals for oil and gas production, and the Mississippi River levees, which block the natural delivery of river sediment. As a result of these alterations, land loss, erosion and flood risk have become pressing issues for Louisiana. Coastal authorities have been studying the benefits and effects of several restoration projects, e.g. freshwater and sediment diversions. The protection of communities, wildlife and this unique environment is a high priority in the region. The Water Institute of the Gulf, together with Deltares, has developed a forecasting and information system for a pilot location in Coastal Louisiana, specifically the Barataria Bay and Breton Sound Basins in the Mississippi River Deltaic Plain. The system provides a 7-day forecast of water level, salinity, and temperature under forecasted atmospheric and coastal conditions, such as freshwater riverine inflow, rainfall, evaporation, wind, and tide. The system also forecasts nutrient distribution (e.g., Chla and dissolved oxygen) and sediment transport. The Flood Early Warning System (FEWS) is used as a platform to import multivariate data from several sources, use them to monitor the pilot location, and provide boundary conditions to the model. A hindcast model is applied to compare the model results to the observed data and to provide the initial condition for the forecast model. This system represents a unique tool which provides valuable information regarding the overall conditions of the basins. It offers the opportunity to adaptively manage existing and planned diversions to meet certain salinity and water level targets or thresholds while maximizing land-building goals. Moreover, water quality predictions provide valuable information on the current ecological

  9. Volumetric response analysis during chemoradiation as predictive tool for optimizing treatment strategy in locally advanced unresectable NSCLC

    International Nuclear Information System (INIS)

    Bral, Samuel; Duchateau, Michael; De Ridder, Mark; Everaert, Hendrik; Tournel, Koen; Schallier, Denis; Verellen, Dirk; Storme, Guy

    2009-01-01

    Purpose: To study the feasibility of measuring volumetric changes in the primary tumor on megavoltage computed tomography (MVCT) during chemoradiation and to examine the correlation with local response. Patients and methods: Fifteen consecutive patients with stage III, inoperable, locally advanced non-small cell lung cancer (NSCLC) were treated in a prospective dose escalation study protocol of concurrent chemoradiation. They were monitored for acute toxicity and evaluated with daily MVCT imaging. The volumetric changes were fitted to a negative exponential, yielding a regression coefficient (RC). Local response evaluation was done with positron emission tomography using the radio-labeled glucose analogue F18-fluorodeoxyglucose (FDG-PET). Results: The mean volume decrease (±standard deviation) was 73% (±18%). With a mean treatment time of 42 days, this treatment schedule resulted in a mean decrease of 1.74%/day. Of the 13 evaluable patients, seven developed a metabolic complete remission (MCR). The mean RC of the patients with MCR was 0.050 versus a mean RC of 0.023 in non-responders (p = 0.0074). Using a proposed RC cut-off value of 0.03, 80% of the non-responders will be detected correctly while misclassifying 16.4% of patients who will eventually achieve an MCR. The total cumulative percentage of esophageal grade 3 or higher toxicity was 46.7%. Conclusion: The RC derived from volumetric analysis of daily MVCT is prognostic and predictive for local response in patients treated with chemoradiation for locally advanced NSCLC. Because this treatment schedule is toxic in nearly half of the patient population, MVCT is a tool in the implementation of patient-individualized treatment strategies.
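
    The regression coefficient (RC) from a negative-exponential fit can be recovered by ordinary least squares on log-volume. A sketch with synthetic daily volumes shrinking at 5%/day; the series is invented, chosen only so the fitted RC lands near the responder mean (0.050) and above the proposed 0.03 cut-off:

```python
import math

def regression_coefficient(days, volumes):
    """Fit V(t) = V0 * exp(-RC * t) by least squares on log(volume)
    and return RC (per day), i.e. the negated slope of the log-linear fit."""
    n = len(days)
    logs = [math.log(v) for v in volumes]
    t_mean = sum(days) / n
    l_mean = sum(logs) / n
    slope = (sum((t - t_mean) * (l - l_mean) for t, l in zip(days, logs))
             / sum((t - t_mean) ** 2 for t in days))
    return -slope

# Synthetic tumor-volume series: exact 5%/day exponential shrinkage
days = [0, 7, 14, 21, 28]
vols = [100 * math.exp(-0.05 * d) for d in days]
rc = regression_coefficient(days, vols)  # recovers 0.05 per day
```

    With noisy clinical measurements the fit would be performed the same way, but the recovered RC would carry an uncertainty that this sketch does not model.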

  10. New target prediction and visualization tools incorporating open source molecular fingerprints for TB Mobile 2.0.

    Science.gov (United States)

    Clark, Alex M; Sarker, Malabika; Ekins, Sean

    2014-01-01

    We recently developed a freely available mobile app (TB Mobile) for both iOS and Android platforms that displays Mycobacterium tuberculosis (Mtb) active molecule structures and their targets with links to associated data. The app was developed to make target information available to as large an audience as possible. We now report a major update of the iOS version of the app. This includes enhancements that use an implementation of ECFP_6 fingerprints that we have made open source. Using these fingerprints, the user can propose compounds with possible anti-TB activity and view the compounds within a cluster landscape. Proposed compounds can also be compared to existing target data, using a naïve Bayesian scoring system to rank probable targets. We have curated an additional 60 new compounds and their targets for Mtb and added these to the original set of 745 compounds. We have also curated 20 further compounds (many without targets in TB Mobile) to evaluate this version of the app with 805 compounds and associated targets. TB Mobile can now manage a small collection of compounds that can be imported from external sources, or exported by various means such as email or app-to-app inter-process communication. This means that TB Mobile can be used as a node within a growing ecosystem of mobile apps for cheminformatics. It can also cluster compounds and use internal algorithms to help identify potential targets based on molecular similarity. TB Mobile represents a valuable dataset, data-visualization aid and target prediction tool.
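
TB Mobile's own scoring uses open-source ECFP_6 fingerprints and a naïve Bayesian model; as a rough sketch of similarity-driven target ranking, fingerprints can be treated as sets of "on" bits and targets ranked by best Tanimoto similarity to known actives. The bit sets and target names (InhA, KatG are real Mtb targets, but the fingerprints are invented):

```python
def tanimoto(a, b):
    """Tanimoto coefficient between two fingerprint bit sets."""
    union = len(a | b)
    return len(a & b) / union if union else 0.0

def rank_targets(query, library):
    """Rank targets by the best Tanimoto similarity of the query to any known active."""
    scores = {}
    for bits, target in library:
        scores[target] = max(scores.get(target, 0.0), tanimoto(query, bits))
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Toy fingerprints: sets of "on" bit indices standing in for ECFP_6 features
library = [
    ({1, 2, 3, 4}, "InhA"),
    ({1, 2, 5, 6}, "InhA"),
    ({7, 8, 9}, "KatG"),
]
query = {1, 2, 3, 7}
ranking = rank_targets(query, library)
```

Here the query's best match is an InhA active (Tanimoto 0.6), so InhA ranks first.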

  11. Percentage of Body Fat and Fat Mass Index as a Screening Tool for Metabolic Syndrome Prediction in Colombian University Students

    Directory of Open Access Journals (Sweden)

    Robinson Ramírez-Vélez

    2017-09-01

    High body fat is related to metabolic syndrome (MetS) in all ethnic groups. Based on the International Diabetes Federation (IDF) definition of MetS, the aim of this study was to explore thresholds of body fat percentage (BF%) and fat mass index (FMI) for the prediction of MetS among Colombian university students. A cross-sectional study was conducted on 1687 volunteers (63.4% women, mean age = 20.6 years). Weight, waist circumference, serum lipid indices, blood pressure, and fasting plasma glucose were measured. Body composition was measured by bioelectrical impedance analysis (BIA) and FMI was calculated. MetS was defined as three or more of the metabolic abnormalities according to the IDF definition. Receiver operating characteristic (ROC) analysis was used to determine optimal cut-off points for BF% and FMI in relation to the area under the curve (AUC), sensitivity, and specificity in both sexes. The overall prevalence of MetS was found to be 7.7%, higher in men than women (11.1% vs. 5.3%; p < 0.001). BF% and FMI were positively correlated with MetS components (p < 0.05). ROC analysis indicated that BF% and FMI can be used with moderate accuracy to identify MetS in university-aged students. BF% and FMI thresholds of 25.55% and 6.97 kg/m2 in men, and 38.95% and 11.86 kg/m2 in women, were found to be indicative of high MetS risk. Based on the IDF criteria, both indexes' thresholds seem to be good tools to identify university students with unfavorable metabolic profiles.
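
Optimal ROC cut-offs like those above are commonly chosen by maximising Youden's J (sensitivity + specificity - 1) over candidate thresholds. A stdlib sketch; the body-fat values and labels below are invented, not study data:

```python
def roc_best_cutoff(values, labels):
    """Return (cutoff, J) maximising Youden's J = sensitivity + specificity - 1,
    classifying value >= cutoff as positive (labels: 1 = MetS, 0 = no MetS)."""
    best_t, best_j = None, -1.0
    for t in sorted(set(values)):
        tp = sum(1 for v, y in zip(values, labels) if v >= t and y == 1)
        fn = sum(1 for v, y in zip(values, labels) if v < t and y == 1)
        tn = sum(1 for v, y in zip(values, labels) if v < t and y == 0)
        fp = sum(1 for v, y in zip(values, labels) if v >= t and y == 0)
        j = tp / (tp + fn) + tn / (tn + fp) - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

# Invented body-fat percentages with MetS labels
bf = [20.0, 22.0, 24.0, 26.0, 30.0, 35.0]
mets = [0, 0, 0, 1, 1, 1]
cutoff, j = roc_best_cutoff(bf, mets)
```

On this perfectly separable toy sample the selected cut-off is 26.0 with J = 1.0.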

  12. Development of a Clinical Tool to Predict Home Death of a Discharged Cancer Patient in Japan: a Case-Control Study.

    Science.gov (United States)

    Fukui, Sakiko; Morita, Tatsuya; Yoshiuchi, Kazuhiro

    2017-08-01

    The aim of this study was to investigate the predictive value of a clinical tool for whether discharged cancer patients die at home, comparing a case group who died at home with a control group who died in hospitals or other facilities. We conducted a nationwide case-control study to identify the determinants of home death for a discharged cancer patient. We randomly selected nurses in charge of 2000 home-visit nursing agencies from all 5813 agencies in Japan by referring to nationwide databases in January 2013. The nurses were asked to report their patients' place of death, patients' and caregivers' clinical statuses, and their preferences for home death. We used logistic regression analysis, developed a clinical tool to predict home death, and investigated its predictive value. We identified 466 case and 478 control patients. Five predictive variables of home death were obtained: patients' and caregivers' preferences for home death [OR (95% CI) 2.66 (1.99-3.55)], availability of visiting physicians [2.13 (1.67-2.70)], 24-h contact between physicians and nurses [1.68 (1.30-2.18)], caregivers' experience of deathwatch at home [1.41 (1.13-1.75)], and patients' insight into their own prognosis [1.23 (1.02-1.50)]. We calculated scores predicting home death for each variable (range 6-28). When using a cutoff point of 16, home death was predicted with a sensitivity of 0.72 and a specificity of 0.81, with a Harrell's c-statistic of 0.84. This simple clinical tool for healthcare professionals can help predict whether a discharged patient is likely to die at home.
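
Points-based tools of this kind are typically built by scaling and rounding the logistic coefficients (ln OR). The sketch below illustrates the idea with the abstract's five odds ratios; the scale factor, variable names and resulting integer weights are illustrative assumptions and do not reproduce the paper's 6-28 scoring range:

```python
import math

# Odds ratios for the five predictors reported in the abstract
ODDS_RATIOS = {
    "preference_home_death": 2.66,
    "visiting_physician": 2.13,
    "contact_24h": 1.68,
    "caregiver_deathwatch_experience": 1.41,
    "patient_prognosis_insight": 1.23,
}

def points(or_value, scale=4):
    """Turn an odds ratio into an integer score weight via scaled ln(OR)."""
    return round(math.log(or_value) * scale)

def total_score(present):
    """Sum the weights of the predictors present for a given patient."""
    return sum(points(ODDS_RATIOS[k]) for k in present)
```

With this (assumed) scale of 4, the five predictors carry weights 4, 3, 2, 1 and 1, so a patient with all five scores 11.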

  13. A computational tool to predict the evolutionarily conserved protein-protein interaction hot-spot residues from the structure of the unbound protein.

    Science.gov (United States)

    Agrawal, Neeraj J; Helk, Bernhard; Trout, Bernhardt L

    2014-01-21

    Identifying hot-spot residues - residues that are critical to protein-protein binding - can help to elucidate a protein's function and assist in designing therapeutic molecules to target those residues. We present a novel computational tool, termed spatial-interaction-map (SIM), to predict the hot-spot residues of an evolutionarily conserved protein-protein interaction from the structure of the unbound protein alone. SIM can predict the protein hot-spot residues with an accuracy of 36-57%. Thus, the SIM tool can be used to predict the yet unknown hot-spot residues for many proteins for which the structure of the protein-protein complex is not available, thereby providing a clue to their functions and an opportunity to design therapeutic molecules to target these proteins. Copyright © 2013 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.

  14. System factors influencing utilisation of Research4Life databases by ...

    African Journals Online (AJOL)

    This is a comprehensive investigation of the influence of system factors on utilisation of Research4Life databases. It is part of a doctoral dissertation. Research4Life databases are new innovative technologies being investigated in a new context – utilisation by NARIs scientists for research. The study adopted the descriptive ...

  15. Utilisation of Antenatal Services at the Provincial Hospital, Mongomo ...

    African Journals Online (AJOL)

    Utilisation of Antenatal Services at the Provincial Hospital, Mongomo, Guinea Equatoria. AAG Jimoh. Abstract. This prospective study was carried out to evaluate the utilisation of antenatal care at the Provincial Specialist Hospital, Mongomo, Guinea Equatoria, paying close attention to the confounding factors affecting ...

  16. Globalisation and Labour Utilisation in Nigeria: Evidence from the ...

    African Journals Online (AJOL)

    This study examines the influence of globalisation on labour utilisation in Nigeria using the construction industry as a case study. It reveals that the era of globalisation has given rise to profound changes in the way labour is utilised, specifically in terms of employment patterns as well as the related issues of earnings, job ...

  17. Using high-performance mathematical modelling tools to predict erosion and sediment fluxes in peri-urban catchments

    Science.gov (United States)

    Pereira, André; Conde, Daniel; Ferreira, Carla S. S.; Walsh, Rory; Ferreira, Rui M. L.

    2017-04-01

    Deforestation and urbanization generally lead to increased soil erosion through the indirect effect of increased overland flow and peak flood discharges. Mathematical modelling tools can be helpful for predicting the spatial distribution of erosion and the morphological changes on the channel network. This is especially useful for predicting the impacts of land-use changes in parts of the watershed, namely due to urbanization. However, given the size of the computational domain (normally the watershed itself), the need for high spatial resolution data to model sediment transport processes accurately, and the possible need to model transcritical flows, the computational cost is high and requires high-performance computing techniques. The aim of this work is to present the latest developments of the hydrodynamic and morphological model STAV2D and its applicability to predicting runoff and erosion at watershed scale. STAV2D was developed at CERIS - Instituto Superior Técnico, Universidade de Lisboa - as a tool particularly appropriate for modelling strong transient flows in complex and dynamic geometries. It is based on an explicit, first-order 2DH finite-volume discretization scheme for unstructured triangular meshes, in which a flux-splitting technique is paired with a revised Roe-Riemann solver, yielding a model applicable to discontinuous flows over time-evolving geometries. STAV2D features solid transport in both Eulerian and Lagrangian forms, with the aim of describing the transport of fine natural sediments as well as large individual debris. The model has been validated with theoretical solutions and laboratory experiments (Canelas et al., 2013 & Conde et al., 2015). STAV2D now supports fully distributed and heterogeneous simulations where multiple different hardware devices can be used to accelerate computation time within a unified Object-Oriented approach: the source code for CPU and GPU has the same compilation units and requires no device-specific branches, like

  18. Utilisation of Estonian energy wood resources

    Energy Technology Data Exchange (ETDEWEB)

    Muiste, P.; Tullus, H.; Uri, V. [Estonian Agricultural University, Tartu (Estonia)

    1996-12-31

    At the end of the Soviet period in the 1980s, a long-term energy programme for Estonia was worked out. The energy system was planned to be based on nuclear power, and the share of domestic alternative sources of energy was low. The situation has changed greatly since the re-establishment of Estonian independence, and wood and peat fuels now play an important role in the energy system. Energy consumption in Estonia decreased during the period 1970-1993, but this process has had less influence on the consumption of domestic renewable fuels - peat and wood - meaning that the share of these fuels has grown. Investment in the substitution of imported fossil fuels and in the conversion of boiler plants from fossil fuels to domestic fuels has reached USD 100 million. The prospects for wood energy depend mainly on two factors: the resources, and the price of wood energy compared with other fuels. The situation in the wood market influences both the possible quantities and the price. Typically, the quickly growing cost of labour in Estonia greatly affects the price of energy wood. Though the price level of fuel peat and wood chips is lower than the world market price today, the conditions for using biofuels could be more favourable if higher environmental fees were introduced. In conjunction with increasing utilisation of biofuels it is important to evaluate possible emissions or removals of greenhouse gases from Estonian forests.

  20. Using exposure prediction tools to link exposure and dosimetry for risk based decisions: a case study with phthalates

    Science.gov (United States)

    The Population Life-course Exposure to Health Effects Modeling (PLETHEM) platform being developed provides a tool that links results from emerging toxicity testing tools to exposure estimates for humans as defined by the USEPA. A reverse dosimetry case study using phthalates was ...

  1. Trends, determinants and inequities of 4+ ANC utilisation in Bangladesh.

    Science.gov (United States)

    Rahman, Aminur; Nisha, Monjura Khatun; Begum, Tahmina; Ahmed, Sayem; Alam, Nurul; Anwar, Iqbal

    2017-01-13

    The objectives of this study are to document the trend in utilisation of four or more (4+) antenatal care (ANC) visits over a 22-year period and to explore the determinants of and inequity in 4+ ANC utilisation as reported by the last two Bangladesh Demographic and Health Surveys (BDHS; 2011 and 2014). The data related to ANC have been extracted from the BDHS data set, which is available online as an open source. STATA 13 software was used for organising and analysing the data. The outcome variable considered for this study was utilisation of 4+ ANC. Trends in 4+ ANC were measured in percentages, and predictors of 4+ ANC were assessed through bivariate and multivariable analysis. The concentration index was estimated to assess inequity in 4+ ANC utilisation. Utilisation of 4+ ANC increased by about 26% between 1994 and 2014. A higher level of education, residing in an urban region and belonging to the richest wealth quintile were found to be significant predictors. The utilisation of 4+ ANC decreased with increasing parity and maternal age. The inequity indices showed consistent inequities in 4+ ANC utilisation, and these inequities increased between 2011 and 2014. In Bangladesh, the utilisation of any ANC rose steadily between 1994 and 2014, but progress in terms of 4+ ANC utilisation was much slower, given the national target of 50% 4+ ANC utilisation by 2016. Socio-economic inequities were observed in groups that failed to attend a 4+ ANC visit. Policymakers should pay special attention to increasing 4+ ANC coverage, and this study can help identify the target groups that need to be prioritised.
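
For grouped data ordered from poorest to richest, the concentration index used in such inequity analyses can be computed with the trapezoid formula C = 1 - sum_i (q_i + q_{i-1}) * p_i, where p_i is the population share of group i and q_i the cumulative share of 4+ ANC visits. A stdlib sketch with invented quintile coverage rates (a positive C indicates utilisation concentrated among the rich):

```python
def concentration_index(pop_shares, coverage):
    """Concentration index for groups ordered poorest to richest.
    pop_shares: population fraction per group; coverage: 4+ ANC rate per group."""
    totals = [p * c for p, c in zip(pop_shares, coverage)]
    grand = sum(totals)
    acc, q_prev = 0.0, 0.0
    for p, t in zip(pop_shares, totals):
        q = q_prev + t / grand       # cumulative share of visits
        acc += (q + q_prev) * p      # trapezoid under the concentration curve
        q_prev = q
    return 1.0 - acc

quintiles = [0.2] * 5
c_equal = concentration_index(quintiles, [0.5] * 5)              # equal coverage
c_pro_rich = concentration_index(quintiles, [0.1, 0.2, 0.3, 0.4, 0.5])
```

Equal coverage across quintiles gives C = 0; the pro-rich gradient gives C = 4/15, about 0.27.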

  2. Using image analysis as a tool for assessment of prognostic and predictive biomarkers for breast cancer: How reliable is it?

    Directory of Open Access Journals (Sweden)

    Mark C Lloyd

    2010-01-01

    Background: Estrogen receptor (ER), progesterone receptor (PR) and human epidermal growth factor receptor-2 (HER2) are important and well-established prognostic and predictive biomarkers for breast cancers and are routinely tested on patients' tumor samples by immunohistochemical (IHC) study. The accuracy of these test results has substantial impact on patient management. A critical factor that contributes to the result is the interpretation (scoring) of IHC. This study investigates how computerized image analysis can play a role in reliable scoring, and identifies potential pitfalls with common methods. Materials and Methods: Whole slide images of 33 invasive ductal carcinomas (IDC) (10 ER and 23 HER2) were scored by a pathologist under the light microscope and confirmed by another pathologist. The HER2 results were additionally confirmed by fluorescence in situ hybridization (FISH). The scoring criteria were adherent to the guidelines recommended by the American Society of Clinical Oncology/College of American Pathologists. Whole slide stains were then scored by commercially available image analysis algorithms from Definiens (Munich, Germany) and Aperio Technologies (Vista, CA, USA). Each algorithm was modified specifically for each marker and tissue. The results were compared with the semi-quantitative manual scoring, which was considered the gold standard in this study. Results: For the HER2 positive group, each algorithm scored 23/23 cases within the range established by the pathologist. For ER, both algorithms scored 10/10 cases within range. The performance of each algorithm varies somewhat from the percentage of staining as compared to the pathologist's reading. Conclusions: Commercially available computerized image analysis can be useful in the evaluation of ER and HER2 IHC results. In order to achieve accurate results either manual pathologist region selection is necessary, or an automated region selection tool must be employed. Specificity can
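
At its core, automated ER scoring reduces to a percent-positive computation over detected tumor nuclei; under the ASCO/CAP guideline, a case with at least 1% positively stained nuclei is called ER positive. A minimal sketch (the function name and counts are illustrative, not from any vendor's API):

```python
def er_status(positive_nuclei, total_nuclei):
    """Return (percent positive, ER-positive flag) using the ASCO/CAP 1% threshold."""
    pct = 100.0 * positive_nuclei / total_nuclei
    return pct, pct >= 1.0

# Hypothetical nucleus counts from a segmented whole-slide region
pct, is_positive = er_status(15, 1000)
```

Fifteen positive nuclei out of 1000 gives 1.5%, above the 1% threshold, so the case is called ER positive.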

  3. IRSS: a web-based tool for automatic layout and analysis of IRES secondary structure prediction and searching system in silico

    Directory of Open Access Journals (Sweden)

    Hong Jun-Jie

    2009-05-01

    Background: Internal ribosomal entry sites (IRESs) provide alternative, cap-independent translation initiation sites in eukaryotic cells. IRES elements are important factors in viral genomes and are also useful tools for bi-cistronic expression vectors. Most existing RNA structure prediction programs are unable to deal with IRES elements. Results: We designed an IRES search system, named IRSS, to obtain better results for IRES prediction. RNA secondary structure prediction and comparison software programs were implemented to construct our two-stage strategy for IRSS. Two software programs formed the backbone of IRSS: the RNAL fold program, used to predict local RNA secondary structures by the minimum free energy method; and the RNA Align program, used to compare predicted structures. After a complete viral genome database search, IRSS has a low error rate and up to 72.3% sensitivity with appropriate parameters. Conclusion: IRSS is freely available at http://140.135.61.9/ires/. In addition, all source codes, precompiled binaries, examples and documentation are downloadable for local execution. This new search approach for IRES elements will provide a useful research tool for IRES-related studies.
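
The second stage above compares predicted secondary structures; a toy version of such a comparison scores the fraction of reference base pairs recovered in a candidate dot-bracket structure. This is only a sketch of the general idea, not the RNA Align algorithm, and the structures below are invented:

```python
def pairs(dotbracket):
    """Extract the base-pair set from a dot-bracket secondary structure string."""
    stack, out = [], set()
    for i, c in enumerate(dotbracket):
        if c == "(":
            stack.append(i)
        elif c == ")":
            out.add((stack.pop(), i))
    return out

def pair_overlap(reference, candidate):
    """Fraction of reference base pairs also present in the candidate structure."""
    ref = pairs(reference)
    return len(ref & pairs(candidate)) / len(ref) if ref else 0.0
```

For the invented hairpin "((..))" compared with "(....)", only the outer pair is recovered, giving an overlap of 0.5.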

  4. Habitat Modeling and Preferences of Marine Mammals as Function of Oceanographic Characteristics: Development of Predictive Tools for Assessing the Risks and the Impacts Due to Sound Emissions

    Science.gov (United States)

    2011-09-30

    The work correlates marine mammal detections with available environmental predictors, creates a knowledge-based background about potential mitigation measures, and evaluates WEC projects from the perspective of environmental cost-benefit analysis (Proceedings of ISOPE 2011, Maui, Hawaii, USA, 19-24 June 2011).

  5. Application of the PredictAD Decision Support Tool to a Danish Cohort of Patients with Alzheimer's Disease and Other Dementias

    DEFF Research Database (Denmark)

    Simonsen, A H; Mattila, J; Hejl, A M

    2013-01-01

    Background: The diagnosis of Alzheimer's disease (AD) is based on an ever-increasing body of data and knowledge, making it a complex task. The PredictAD tool integrates heterogeneous patient data using an interactive user interface to provide decision support. The aim of this project was to invest ... forest. Results: The DSI performed best for this realistic dataset, with an accuracy of 76.6% compared to accuracies of 67.4% and 66.7% for the naïve Bayesian classifier and random forest, respectively. Furthermore, the DSI differentiated between the four diagnostic groups with a p value of <0.0001. Conclusion: In this dataset, the DSI method used by the PredictAD tool showed superior performance for the differentiation between patients with AD and those with other dementias. However, the methods need to be refined further in order to optimize the differential diagnosis between AD, FTD, VaD and DLB.

  6. Prediction of rainfall anomalies during the dry to wet transition season over the Southern Amazonia using machine learning tools

    Science.gov (United States)

    Shan, X.; Zhang, K.; Zhuang, Y.; Fu, R.; Hong, Y.

    2017-12-01

    Seasonal prediction of rainfall during the dry-to-wet transition season in austral spring (September-November) over southern Amazonia is central to improving crop planting and fire mitigation in that region. Previous studies have identified the key large-scale atmospheric dynamic and thermodynamic pre-conditions during the dry season (June-August) that influence rainfall anomalies during the dry-to-wet transition season over southern Amazonia. Based on these key dry-season pre-conditions, we have evaluated several statistical models and developed a Neural Network based statistical prediction system to predict rainfall during the dry-to-wet transition for southern Amazonia (5-15°S, 50-70°W). Multivariate Empirical Orthogonal Function (EOF) analysis is applied to the following four fields during JJA from the ECMWF Reanalysis (ERA-Interim) spanning 1979 to 2015: geopotential height at 200 hPa, surface relative humidity, convective inhibition (CIN) index and convective available potential energy (CAPE), to filter out noise and highlight the most coherent spatial and temporal variations. The first 10 EOF modes are retained as inputs to the statistical models, accounting for at least 70% of the total variance in the predictor fields. We have tested several linear and non-linear statistical methods. While regularized Ridge Regression and Lasso Regression can generally capture the spatial pattern and magnitude of rainfall anomalies, we found that the Neural Network performs best, with an accuracy greater than 80%, as expected from the non-linear dependence of the rainfall on the large-scale atmospheric thermodynamic conditions and circulation. Further tests with various prediction skill metrics and hindcasts also suggest this Neural Network approach can significantly improve seasonal prediction skill over dynamic predictions and regression-based statistical predictions. Thus, this statistical prediction system could have
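
Retaining "the first 10 EOF modes accounting for at least 70% of the total variance" amounts to truncating the eigenvalue spectrum at a cumulative-variance target. A minimal sketch with an invented spectrum:

```python
def modes_to_keep(eigenvalues, target=0.70):
    """Smallest number of leading EOF modes whose cumulative variance
    fraction reaches the target (eigenvalues need not be pre-sorted)."""
    total = sum(eigenvalues)
    acc = 0.0
    for k, ev in enumerate(sorted(eigenvalues, reverse=True), start=1):
        acc += ev
        if acc / total >= target:
            return k
    return len(eigenvalues)
```

For an invented spectrum [5, 3, 1, 0.5, 0.5], the first two modes already explain 80% of the variance, so two modes are kept at the 70% target.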

  7. Capacity Utilisation of Vehicles for Road Freight Transport

    DEFF Research Database (Denmark)

    Kveiborg, Ole; Abate, Megersa Abera

    to their analytical approach and origin of research. Findings: The first approach looks at utilisation based on economic theories, such as the firms' objective to maximise profitability, and considers how various firm and haul (market) characteristics influence utilisation. The second approach stems from the transport modelling literature and its main aim is analysing vehicle movement and usage in a transport demand modelling context. A strand of this second group of contributions is the modelling of trip-chains and their implications for the level of capacity utilisation. Research limitations: The review is not a comprehensive ... by combining different strands of this literature.

  8. Wood torrefaction. Pilot tests and utilisation prospects

    Energy Technology Data Exchange (ETDEWEB)

    Wilen, C.; Jukola, P.; Jarvinen, T.; Sipila, K. [VTT Technical Reseach Centre of Finland, Espoo (Finland); Verhoeff, F.; Kiel, J. [Energy research Centre of the Netherlands, LE Petten (Netherlands)

    2013-09-15

    operation. The market is expected to move forward but the available public information is very limited, especially concerning the technologies used and volumes produced. Woody feedstocks will be the main raw material source. The utilisation rate of forest industry residues and by-products is relatively high in the EU and wood supply in Central Europe remains more or less stable, hence the price of the raw material is at a fairly high level. The utilities' capability to pay for the product depends mainly on the national feed-in tariffs of green electricity. The energy price for the user is at least twice as high as that of coal. (orig.)

  9. Pilot study of nitrogen utilisation in maize

    International Nuclear Information System (INIS)

    Futo, I.; Palcsu, L.; Vodila, G.

    2012-01-01

    Complete text of publication follows. In cooperation between KITE Ltd., Nadudvar, Hungary and the Hertelendi Laboratory of Environmental Studies, the aim was to determine the ideal locations of fertilising and the ideal distance between rows for the ideal production yield. To track the nitrogen utilisation of maize (Zea mays), 15N-enriched NH4NO3 fertiliser was introduced among the usual fertilisers in the maize field of KITE Ltd., Nadudvar, Hungary on 30th March 2012, before sowing. Four maize samples were taken from areas of different fertiliser treatment (non-fertilised and non-labelled; fertilised and non-labelled; fertilised and labelled between the rows; and fertilised and labelled within the rows) and from different development stages of the plant on 22nd May, 8th June, 6th July and 7th September, being sampling periods 1-4, respectively. The plant samples were subsampled by organ: root, stem and leaf. Samples were dried to constant mass and pulverised. The 15N measurements were made by a Thermo Finnigan Delta PLUS XP isotope ratio mass spectrometer coupled with an elemental analyser. In the case of non-fertilised and non-labelled plants, all three organs became 15N-depleted with time, most intensively the stem and least intensively the root (Figure 1). For the leaves and stems of the fertilised and non-labelled plants, the tendency over time is very similar to that of the non-fertilised and non-labelled plants; however, the roots of the fertilised and non-labelled plants became significantly enriched in the September sample. In the case of the samples fertilised and labelled between the rows, labelling is slightly seen as the delta values are positive. These values are significantly lower than those for the plants fertilised and labelled within the rows. It is seen that the fertiliser reached the vegetation to the largest extent in this layout. Labelling showed its maximum intensity in the second sampling (8th June), showing that
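
The positive delta values mentioned above refer to the standard delta notation for 15N tracer work: isotope ratio mass spectrometry results are reported relative to atmospheric N2, whose 15N/14N reference ratio is 0.0036765. A minimal helper (the sample ratio below is invented):

```python
AIR_R15 = 0.0036765  # 15N/14N ratio of atmospheric N2, the delta-15N reference

def delta15N(sample_ratio):
    """delta-15N in per mil relative to atmospheric N2."""
    return (sample_ratio / AIR_R15 - 1.0) * 1000.0
```

A sample with exactly the air ratio reads 0 per mil; any 15N enrichment from the labelled fertiliser gives a positive delta.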

  10. Estimation of an optimal chemotherapy utilisation rate for cancer: setting an evidence-based benchmark for quality cancer care.

    Science.gov (United States)

    Jacob, S A; Ng, W L; Do, V

    2015-02-01

    There is wide variation in the proportion of newly diagnosed cancer patients who receive chemotherapy, indicating the need for a benchmark rate of chemotherapy utilisation. This study describes an evidence-based model that estimates the proportion of new cancer patients in whom chemotherapy is indicated at least once (defined as the optimal chemotherapy utilisation rate). The optimal chemotherapy utilisation rate can act as a benchmark for measuring and improving the quality of care. Models of optimal chemotherapy utilisation were constructed for each cancer site based on indications for chemotherapy identified from evidence-based treatment guidelines. Data on the proportion of patient- and tumour-related attributes for which chemotherapy was indicated were obtained, using population-based data where possible. Treatment indications and epidemiological data were merged to calculate the optimal chemotherapy utilisation rate. Monte Carlo simulations and sensitivity analyses were used to assess the effect of controversial chemotherapy indications and variations in epidemiological data on the model. Chemotherapy is indicated at least once in 49.1% (95% confidence interval 48.8-49.6%) of all new cancer patients in Australia. The optimal chemotherapy utilisation rates for individual tumour sites ranged from a low of 13% in thyroid cancers to a high of 94% in myeloma. The optimal chemotherapy utilisation rate can serve as a benchmark for planning chemotherapy services on a population basis. The model can be used to evaluate service delivery by comparing the benchmark rate with patterns-of-care data. The overall estimate for other countries can be obtained by substituting the relevant distribution of cancer types. It can also be used to predict future chemotherapy workload and can easily be modified to take into account future changes in cancer incidence, presentation stage or chemotherapy indications. Copyright © 2014 The Royal College of Radiologists.
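
The population-level arithmetic behind substituting another country's cancer mix is a weighted sum: each site contributes its share of incidence times the fraction of that site's patients with at least one chemotherapy indication. A sketch using the abstract's thyroid (13%) and myeloma (94%) figures; the third site's fraction and all incidence shares are invented:

```python
def optimal_utilisation_rate(cancer_mix, indicated_fraction):
    """Population optimal rate: sum over sites of incidence share x fraction indicated."""
    return sum(share * indicated_fraction[site] for site, share in cancer_mix.items())

# Hypothetical three-site population (incidence shares sum to 1)
cancer_mix = {"thyroid": 0.1, "myeloma": 0.2, "lung": 0.7}
indicated = {"thyroid": 0.13, "myeloma": 0.94, "lung": 0.5}
rate = optimal_utilisation_rate(cancer_mix, indicated)
```

For this invented mix the benchmark works out to 55.1%; swapping in a real national incidence distribution reproduces the model's intended use.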

  11. A drug utilisation study investigating prescribed daily doses of ...

    African Journals Online (AJOL)

    and drug groups. Design. Retrospective drug utilisation study using data .... drugs that were prescribed 20 or fewer times during the period under ... occurs in women and men at different ages and with different severity. group. On average, men ...

  12. Facilitating nurses' knowledge of the utilisation of reflexology in ...

    African Journals Online (AJOL)

    2012-05-18

    May 18, 2012 ... scientific evidence on the utilisation of reflexology as CAM modality to promote .... reflexology therapy, zone therapy and foot massage and ...... perceived quality of care and cultural beliefs', Family Pracfice 21(6), 654−660.

  13. Exploring the extent to which ELT students utilise smartphones for ...

    African Journals Online (AJOL)

    Zehra

    2015-11-09

    Nov 9, 2015 ... aimed to explore the extent to which English Language Teaching (ELT) students utilise ... Given the fact that almost all students have a personal smartphone, and use it ..... ears as a disadvantage for smartphones (Kétyi,.

  14. Prospective validation of American Diabetes Association risk tool for predicting pre-diabetes and diabetes in Taiwan-Taichung community health study.

    Directory of Open Access Journals (Sweden)

    Chia-Ing Li

    BACKGROUND: A simple diabetes risk tool that does not require laboratory tests would be beneficial in screening individuals at higher risk. Few studies have evaluated the ability of these tools to identify new cases of pre-diabetes. This study aimed to assess the ability of the American Diabetes Association Risk Tool (ADART) to predict the 3-year incidence of pre-diabetes and diabetes in a Taiwanese population. METHODS: This was a 3-year prospective study of 1021 residents with normoglycemia at baseline, gathered from a random sample of residents aged 40-88 years in a metropolitan city in Taiwan. The areas under the curve (AUCs) of three models were compared: ADART only, ADART plus lifestyle behaviors at baseline, and ADART plus lifestyle behaviors and biomarkers at baseline. The performance of ADART was compared with that of 16 tools reported in the literature. RESULTS: The AUCs and their 95% confidence intervals (CIs) were 0.60 (0.54-0.66) for men and 0.72 (0.66-0.77) for women in model 1; 0.62 (0.56-0.68) for men and 0.74 (0.68-0.80) for women in model 2; and 0.64 (0.58-0.71) for men and 0.75 (0.69-0.80) for women in model 3. The AUCs of these three models were all above 0.7 in women, but not in men. No significant difference between the AUCs of these three models was observed in either women or men (p = 0.268 and 0.156, respectively). Compared to the 16 tools published in the literature, ADART had the second largest AUC in both men and women. CONCLUSIONS: ADART is a good screening tool for predicting the three-year incidence of pre-diabetes and diabetes in females of a Taiwanese population. The performance of ADART in men was similar to the results with other tools published in the literature. Its performance was one of the best among the tools reported in the literature.
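
An AUC like those reported can be computed directly as the Mann-Whitney statistic: the probability that a randomly chosen incident case outranks a randomly chosen non-case, with ties counting half. A stdlib sketch on invented risk scores:

```python
def auc(scores, labels):
    """AUC as the probability a random positive outscores a random negative."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

For the invented scores [0.1, 0.4, 0.35, 0.8] with labels [0, 0, 1, 1], three of the four positive/negative pairs are correctly ordered, giving an AUC of 0.75.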

  15. Advances in In Vitro and In Silico Tools for Toxicokinetic Dose Modeling and Predictive Toxicology (WC10)

    Science.gov (United States)

Recent advances in in vitro assays, in silico tools, and systems biology approaches provide opportunities for a refined mechanistic understanding for chemical safety assessment that will ultimately lead to reduced reliance on animal-based methods. With the U.S. commercial chemical lan...

  16. The Assessment of Burden of COPD (ABC) tool : a shared decision-making instrument that is predictive of healthcare costs

    NARCIS (Netherlands)

    Rutten-vanMolken, Maureen P. H. M.; Goossens, Lucas M A; Boland, Melinde R. S.; Donkers, Bas; Jonker, Marcel F.; Slok, Annerika H. M.; Salome, Philippe L.; van Schayck, Constant; In 't Veen, Johannes C C M; Stolk, Elly A.

    2017-01-01

    Background: The Assessment of Burden of COPD (ABC) tool is an instrument that supports shared decision making between patients and physicians. It includes a coloured balloon diagram to visualize a patient’s scores on a questionnaire about the experienced burden of COPD and several objective severity

  17. Comparison of Predicted pKa Values for Some Amino-Acids, Dipeptides and Tripeptides, Using COSMO-RS, ChemAxon and ACD/Labs Methods Comparaison des valeurs de pKa de quelques acides aminés, dipeptides et tripeptides, prédites en utilisant les méthodes COSMO-RS, ChemAxon et ACD/Labs

    Directory of Open Access Journals (Sweden)

    Toure O.

    2013-05-01

as peptides. The final goal of this study is to use the pKa values in a predictive thermodynamic model for products of interest in the food industry. For this purpose, the effects of several factors (such as the treatment of the conformer set in COSMO-RS calculations, or the ionic strength) that can affect the comparison between observed and predicted pKa data are discussed. Acid dissociation constants (pKa) play a very important role, particularly in the food industry. The chemical properties of molecules depend significantly on their ionisation states. Most molecules are able to gain and/or lose a proton in aqueous solutions. This proton transfer most often occurs between water and an ionisable atom of the organic molecule. The response of the molecule to protonation or deprotonation depends significantly on the site involved in the proton transfer. The partial charge distribution within the molecule also varies with the active protonation sites of the acid/base couple, and can therefore be used to determine the pKa of a molecule. We first used the COSMO-RS method, a combination of the dielectric continuum solvation model (COSMO) with a statistical thermodynamics treatment for more realistic solvents (RS), to predict the dissociation constants of about 50 molecules (amino acids, dipeptides and tripeptides). The predicted pKa values were compared with experimental values, as well as with the pKa values predicted by two other methods: the ChemAxon method, which uses a program based on computing the partial charges of the atoms in a molecule, and the ACD/Labs method, which determines a pKa value for each dissociation centre while treating the rest of the molecule as neutral, using a database

  18. Price and utilisation differences for statins between four countries.

    Science.gov (United States)

    Thai, Loc Phuoc; Vitry, Agnes Isabelle; Moss, John Robert

    2018-02-01

Australia, England, France and New Zealand use different policies to regulate their medicines markets, which can affect utilisation and price. The aim was to compare the prices and utilisation of statins in Australia, England, France and New Zealand from 2011 to 2013. Utilisation of statins in the four countries was compared using Defined Daily Doses (DDD) per 1000 inhabitants per year. Pairwise Laspeyres and Paasche index comparisons of the price and utilisation of statins were conducted. The results showed that statin prices were lowest in New Zealand. Prices were highest in Australia in 2011 and 2012, but France was the most expensive in 2013. There were large differences between the Laspeyres and Paasche indices when comparing the price and utilisation of England with Australia and France. The policies regulating the New Zealand and England medicines markets were more effective in reducing the price of expensive statins. The relative utilisation of cheaper statins was greatest in England and had a large effect on the differences between the two index results. The pricing policies in Australia have been only partly effective in reducing the price of statins compared with the other countries.
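The Laspeyres and Paasche indices used in this study weight price changes by base-period and current-period quantities, respectively, which is why they diverge when utilisation shifts toward cheaper drugs. A minimal sketch with invented per-DDD prices and volumes (not the study's data):

```python
def laspeyres(p0, p1, q0):
    """Price index with base-period (q0) quantity weights."""
    items = p0.keys()
    return sum(p1[i] * q0[i] for i in items) / sum(p0[i] * q0[i] for i in items)

def paasche(p0, p1, q1):
    """Price index with current-period (q1) quantity weights."""
    items = p0.keys()
    return sum(p1[i] * q1[i] for i in items) / sum(p0[i] * q1[i] for i in items)

# Hypothetical prices per DDD and utilisation in DDDs, for illustration only:
p0 = {"simvastatin": 0.10, "atorvastatin": 0.50}  # base-year prices
p1 = {"simvastatin": 0.08, "atorvastatin": 0.30}  # current-year prices
q0 = {"simvastatin": 800, "atorvastatin": 200}    # base-year utilisation
q1 = {"simvastatin": 1000, "atorvastatin": 400}   # current-year utilisation

print(round(laspeyres(p0, p1, q0), 3))  # → 0.689
print(round(paasche(p0, p1, q1), 3))    # → 0.667
```

Here both indices show prices falling, but they disagree on the size of the fall because current-year utilisation has shifted, the same mechanism the abstract invokes for the England comparisons.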

  19. Landscape genetics as a tool for conservation planning: predicting the effects of landscape change on gene flow.

    Science.gov (United States)

    van Strien, Maarten J; Keller, Daniela; Holderegger, Rolf; Ghazoul, Jaboury; Kienast, Felix; Bolliger, Janine

    2014-03-01

For conservation managers, it is important to know whether landscape changes lead to increasing or decreasing gene flow. Although the discipline of landscape genetics assesses the influence of landscape elements on gene flow, no studies have yet used landscape-genetic models to predict gene flow resulting from landscape change. A species that has already been severely affected by landscape change is the large marsh grasshopper (Stethophyma grossum), which inhabits moist areas in fragmented agricultural landscapes in Switzerland. From transects drawn between all population pairs within the species' maximum dispersal distance, we quantified landscape composition as well as some measures of habitat configuration. Additionally, a complete sampling of all populations in our study area allowed us to incorporate measures of population topology. These measures, together with the landscape metrics, formed the predictor variables in linear models with gene flow as the response variable (F(ST) and mean pairwise assignment probability). With a modified leave-one-out cross-validation approach, we selected the model with the highest predictive accuracy. With this model, we predicted gene flow under several landscape-change scenarios, which simulated construction, rezoning or restoration projects, and the establishment of a new population. For some landscape-change scenarios, a significant increase or decrease in gene flow was predicted, while for others little change was forecast. Furthermore, we found that the measures of population topology strongly increase model fit in landscape-genetic analysis. This study demonstrates the use of predictive landscape-genetic models in conservation and landscape planning.
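The model-selection step above rests on leave-one-out cross-validation: each observation is held out in turn, the model is refit on the rest, and the held-out prediction error is accumulated. A generic stdlib-only sketch for a one-predictor linear model, with invented resistance/gene-flow values (not the study's model or data):

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def loocv_mse(xs, ys):
    """Mean squared prediction error, refitting with each point held out."""
    err = 0.0
    for i in range(len(xs)):
        a, b = fit_line(xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:])
        err += (ys[i] - (a + b * xs[i])) ** 2
    return err / len(xs)

# Invented landscape-resistance vs. gene-flow values, for illustration only:
resistance = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
gene_flow = [0.90, 0.75, 0.65, 0.50, 0.42, 0.30]
print(round(loocv_mse(resistance, gene_flow), 4))
```

Among several candidate models, the one with the lowest cross-validated error would be retained for prediction, which mirrors the selection criterion described in the abstract.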

  20. Jarosite characteristics and its utilisation potentials

    International Nuclear Information System (INIS)

    Pappu, Asokan; Saxena, Mohini; Asolekar, Shyam R.

    2006-01-01

During metallic zinc extraction from zinc sulphide or sulphide ore, huge quantities of jarosite are released worldwide as a solid residue. Jarosite mainly contains iron, sulphur, zinc, calcium, lead, cadmium and aluminium. Jarosite released from such industrial processes is complex, and its quality and quantity make safe disposal difficult. Apart from water contamination, the jarosite already accumulated and its increasing annual production are a major source of pollution for the surrounding environment, including soil, vegetation and aquatic life; its disposal is therefore a major concern under stringent environmental protection regulations. An attempt was made to evaluate the characteristics of Indian jarosite with the objective of understanding its potential for recycling and utilisation as a raw material for developing value-added products. Sand and Coal Combustion Residues (CCRs) were used as admixtures to attain good workability and to detoxify the toxic substances in the jarosite. Results revealed that jarosite is silty clay loam in texture, having 63.48% silt-sized and 32.35% clay-sized particles. The particle size of jarosite (D9 = 16.21 ± 0.20 μm) is finer than that of the CCRs (D9 = 19.72 ± 0.18 μm). Jarosite is nonuniform in structure and shape compared with the CCRs, which are spherical, hollow shaped, and in some cases cenospheric in nature. The major mineral phases of jarosite are potassium iron sulphate hydroxide {KFe3(SO4)2(OH)6} and iron sulphate hydrate {2Fe2O3·SO3·5H2O}. In the CCRs, the dominant phases are quartz {SiO2}, mullite {3Al2O3·2SiO2} and hematite {Fe2O3}. The high electrical conductivity of jarosite (13.26 ± 0.437 dS/m) indicates that cations and anions are predominant over those in the CCRs (0.498 ± 0.007 dS/m). The major portion of jarosite consists of iron (23.66 ± 0.18%), sulphur (12.23 ± 0.2%) and zinc (8.243 ± 0.075%), whereas the main constituents of the CCRs are silicon (27.41 ± 0.74%) and aluminium (15

  1. Using pharmacokinetic-pharmacodynamic modelling as a tool for prediction of therapeutic effective plasma levels of antipsychotics

    DEFF Research Database (Denmark)

    Olsen, Christina Kurre; Brennum, Lise Tøttrup; Kreilgaard, Mads

    2008-01-01

    response behaviour correlates well with the relationship between human dopamine D2 receptor occupancy and clinical effect. The aim of the present study was to evaluate how pharmacokinetic/pharmacodynamic (PK/PD) predictions of therapeutic effective steady-state plasma levels by means of conditioned...... the rat dopamine D2 receptor occupancy levels providing 50% response in the conditioned avoidance response test and the dopamine D2 receptor occupancy levels reported from responding schizophrenic patients treated with antipsychotics. Predictions of therapeutically effective steady-state levels...... for sertindole (+dehydrosertindole) and olanzapine were 3-4-fold too high whereas for haloperidol, clozapine and risperidone the predicted steady-state EC50 in conditioned avoidance responding rats correlated well with the therapeutically effective plasma levels observed in patients. Accordingly, the proposed PK...

  2. The combination of kinetic and flow cytometric semen parameters as a tool to predict fertility in cryopreserved bull semen.

    Science.gov (United States)

    Gliozzi, T M; Turri, F; Manes, S; Cassinelli, C; Pizzi, F

    2017-11-01

Within recent years, there has been growing interest in the prediction of bull fertility through in vitro assessment of semen quality. A model for fertility prediction based on early evaluation of semen quality parameters, to exclude sires with potentially low fertility from breeding programs, would therefore be useful. The aim of the present study was to identify the most suitable parameters that would provide reliable prediction of fertility. Frozen semen from 18 Italian Holstein-Friesian proven bulls was analyzed using computer-assisted semen analysis (CASA) (motility and kinetic parameters) and flow cytometry (FCM) (viability, acrosomal integrity, mitochondrial function, lipid peroxidation, plasma membrane stability and DNA integrity). Bulls were divided into two groups (low and high fertility) based on the estimated relative conception rate (ERCR). Significant differences were found between fertility groups for total motility, active cells, straightness, linearity, viability and percentage of DNA-fragmented sperm. Correlations were observed between ERCR and some kinetic parameters, membrane instability and some DNA integrity indicators. In order to define a model relating semen quality parameters to ERCR, backward stepwise multiple regression analysis was applied. We thus obtained a prediction model that explained almost half (R2 = 0.47, P < 0.05) of the variation in the conception rate and included nine variables: five kinetic parameters measured by CASA (total motility, active cells, beat cross frequency, curvilinear velocity and amplitude of lateral head displacement) and four parameters related to DNA integrity evaluated by FCM (degree of chromatin structure abnormality Alpha-T, extent of chromatin structure abnormality (Alpha-T standard deviation), percentage of DNA-fragmented sperm and percentage of sperm with high green fluorescence, representative of immature cells). A significant relationship (R2 = 0.84, P < 0.05) was observed between
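Backward stepwise regression, as used in this study, starts from the full predictor set and repeatedly drops the variable whose removal costs the least explanatory power. A stdlib-only sketch under simplifying assumptions (an R²-loss drop criterion and invented data; not the study's actual procedure or thresholds):

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def r_squared(rows, cols, y):
    """R² of an OLS fit of y on the named columns (plus an intercept)."""
    X = [[1.0] + [row[c] for c in cols] for row in rows]
    p = len(X[0])
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(p)] for i in range(p)]
    Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(p)]
    beta = solve(XtX, Xty)
    yhat = [sum(b * v for b, v in zip(beta, r)) for r in X]
    my = sum(y) / len(y)
    sse = sum((a - b) ** 2 for a, b in zip(y, yhat))
    return 1.0 - sse / sum((v - my) ** 2 for v in y)

def backward_select(rows, cols, y, max_r2_loss=0.01):
    """Drop predictors one at a time while the cheapest drop costs < max_r2_loss R²."""
    keep = list(cols)
    while len(keep) > 1:
        full = r_squared(rows, keep, y)
        loss = {c: full - r_squared(rows, [k for k in keep if k != c], y) for c in keep}
        cheapest = min(loss, key=loss.get)
        if loss[cheapest] < max_r2_loss:
            keep.remove(cheapest)
        else:
            break
    return keep

# Invented data: y depends on x1 only; x2 is noise and should be eliminated.
rows = [{"x1": 1, "x2": 3}, {"x1": 2, "x2": 1}, {"x1": 3, "x2": 4},
        {"x1": 4, "x2": 1}, {"x1": 5, "x2": 5}, {"x1": 6, "x2": 9}]
y = [2.0, 4.0, 6.0, 8.0, 10.0, 12.0]
print(backward_select(rows, ["x1", "x2"], y))  # → ['x1']
```

In practice, published implementations drop variables by p-value or information criterion rather than raw R² loss; the mechanics of iteratively refitting reduced models are the same.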

  3. Appraising the performance of genotyping tools in the prediction of coreceptor tropism in HIV-1 subtype C viruses

    Directory of Open Access Journals (Sweden)

    Crous Saleema

    2012-09-01

Full Text Available Abstract Background In human immunodeficiency virus type 1 (HIV-1) infection, transmitted viruses generally use the CCR5 chemokine receptor as a coreceptor for host cell entry. In more than 50% of subtype B infections, a switch in coreceptor tropism from CCR5- to CXCR4-use occurs during disease progression. Phenotypic or genotypic approaches can be used to test for the presence of CXCR4-using viral variants in an individual’s viral population that would result in resistance to treatment with CCR5 antagonists. While genotyping approaches for coreceptor-tropism prediction are well established and verified for subtype B, they are less so for subtype C. Methods Here, using a dataset comprising V3 loop sequences from 349 CCR5-using and 56 CXCR4-using HIV-1 subtype C viruses, we perform a comparative analysis of the predictive ability of 11 genotypic algorithms in their prediction of coreceptor tropism in subtype C. We calculate the sensitivity and specificity of each of the approaches as well as determining their overall accuracy. By separating the CXCR4-using viruses into CXCR4-exclusive (25 sequences) and dual-tropic (31 sequences), we evaluate the effect of the possibly conflicting signal from dual-tropic viruses on the ability of each of the approaches to correctly predict coreceptor phenotype. Results We determined that geno2pheno with a false positive rate of 5% is the best approach for predicting CXCR4 usage in subtype C sequences, with an accuracy of 94% (89% sensitivity and 99% specificity). Contrary to what has been reported for subtype B, the approaches that are optimal for predicting CXCR4 usage in sequences from viruses that use CXCR4 exclusively also perform best at predicting CXCR4 use in dual-tropic viral variants. Conclusions The accuracy of genotyping approaches at correctly predicting the coreceptor usage of V3 sequences from subtype C viruses is very high. We suggest that genotyping approaches can be used to test for coreceptor tropism in HIV-1
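The sensitivity, specificity and accuracy figures quoted above all derive from a 2×2 confusion table over the labelled sequences. A generic sketch, with counts invented to roughly echo the dataset's class sizes (56 CXCR4-using positives, 349 CCR5-using negatives), not the paper's actual tallies:

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity and accuracy from confusion-matrix counts,
    treating CXCR4 use as the 'positive' class."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy

# Invented counts shaped like the dataset, for illustration only:
sens, spec, acc = diagnostic_metrics(tp=50, fn=6, tn=346, fp=3)
print(round(sens, 2), round(spec, 2), round(acc, 2))  # → 0.89 0.99 0.98
```

Note that with such an imbalanced dataset, raw accuracy is dominated by the majority (CCR5) class, which is why sensitivity and specificity are reported separately.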

  4. The REDUCE metagram: a comprehensive prediction tool for determining the utility of dutasteride chemoprevention in men at risk for prostate cancer

    Directory of Open Access Journals (Sweden)

    Carvell eNguyen

    2012-10-01

Full Text Available Introduction: 5-alpha reductase inhibitors can reduce the risk of prostate cancer but can be associated with significant side effects. A library of nomograms which predict the risk of clinical endpoints relevant to dutasteride treatment may help determine if chemoprevention is suited to the individual patient. Methods: Data from the REDUCE trial was used to identify predictive factors for nine endpoints relevant to dutasteride treatment. Using the treatment and placebo groups from the biopsy cohort, Cox proportional hazards and competing risks regression models were used to build 18 nomograms, whose predictive ability was measured by concordance index and calibration plots. Results: A total of 18 nomograms assessing the risks of cancer, high-grade cancer, high-grade prostatic intraepithelial neoplasia (HGPIN), atypical small acinar proliferation (ASAP), erectile dysfunction (ED), acute urinary retention (AUR), gynecomastia, urinary tract infection (UTI) and BPH-related surgery, either on or off dutasteride, were created. The nomograms for cancer, high-grade cancer, ED, AUR, and BPH-related surgery demonstrated good discrimination and calibration, while those for gynecomastia, UTI, HGPIN, and ASAP predicted no better than random chance. Conclusions: To aid patients in determining whether the benefits of dutasteride use outweigh the risks, we have developed a comprehensive metagram that can generate individualized risks of nine outcomes relevant to men considering chemoprevention. Better models based on more predictive markers are needed for some of the endpoints, but the current metagram demonstrates potential as a tool for patient counseling and decision making that is accessible, intuitive, and clinically relevant.

  5. Buried Volume Analysis for Propene Polymerization Catalysis Promoted by Group 4 Metals: a Tool for Molecular Mass Prediction

    KAUST Repository

    Falivene, Laura; Cavallo, Luigi; Talarico, Giovanni

    2015-01-01

A comparison of the steric properties of homogeneous single-site catalysts for propene polymerization using the percentage of buried volume (%VBur) as a molecular descriptor is reported. The %VBur calculated on the neutral precursors of the active species appears to be a reliable tool for explaining several experimental data related to propene insertion and to monomer chain transfer. Interestingly, a linear correlation between the buried volume calculated for a large set of neutral precursors and the energetic difference between propagation and termination steps calculated by DFT methods is found for Group 4 metal catalysts. The “master curves” derived for Ti, Zr and Hf confirm not only that the %VBur is an appropriate molecular descriptor for the systems considered but also that it could be used as a tool for a large computational screening of new ligands.

  6. Buried Volume Analysis for Propene Polymerization Catalysis Promoted by Group 4 Metals: a Tool for Molecular Mass Prediction

    KAUST Repository

    Falivene, Laura

    2015-10-02

A comparison of the steric properties of homogeneous single-site catalysts for propene polymerization using the percentage of buried volume (%VBur) as a molecular descriptor is reported. The %VBur calculated on the neutral precursors of the active species appears to be a reliable tool for explaining several experimental data related to propene insertion and to monomer chain transfer. Interestingly, a linear correlation between the buried volume calculated for a large set of neutral precursors and the energetic difference between propagation and termination steps calculated by DFT methods is found for Group 4 metal catalysts. The “master curves” derived for Ti, Zr and Hf confirm not only that the %VBur is an appropriate molecular descriptor for the systems considered but also that it could be used as a tool for a large computational screening of new ligands.
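The linear correlation between %VBur and the DFT propagation/termination energy gap reported above is exactly the kind of relationship a Pearson coefficient quantifies. A small stdlib-only sketch with invented values (not the paper's data; the points here are constructed to lie on a line):

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented %VBur values and energy differences (kcal/mol), perfectly linear here:
vbur = [60.0, 62.0, 65.0, 70.0]
delta_e = [2.0, 2.4, 3.0, 4.0]
print(round(pearson_r(vbur, delta_e), 3))  # → 1.0
```

An |r| near 1 supports using the cheap descriptor (%VBur) as a stand-in for the expensive DFT quantity in large ligand screens.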

  7. Final Report, DOE Early Career Award: Predictive modeling of complex physical systems: new tools for statistical inference, uncertainty quantification, and experimental design

    Energy Technology Data Exchange (ETDEWEB)

    Marzouk, Youssef [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)

    2016-08-31

Predictive simulation of complex physical systems increasingly rests on the interplay of experimental observations with computational models. Key inputs, parameters, or structural aspects of models may be incomplete or unknown, and must be developed from indirect and limited observations. At the same time, quantified uncertainties are needed to qualify computational predictions in the support of design and decision-making. In this context, Bayesian statistics provides a foundation for inference from noisy and limited data, but at prohibitive computational expense. This project intends to make rigorous predictive modeling *feasible* in complex physical systems, via accelerated and scalable tools for uncertainty quantification, Bayesian inference, and experimental design. Specific objectives are as follows: 1. Develop adaptive posterior approximations and dimensionality reduction approaches for Bayesian inference in high-dimensional nonlinear systems. 2. Extend accelerated Bayesian methodologies to large-scale *sequential* data assimilation, fully treating nonlinear models and non-Gaussian state and parameter distributions. 3. Devise efficient surrogate-based methods for Bayesian model selection and the learning of model structure. 4. Develop scalable simulation/optimization approaches to nonlinear Bayesian experimental design, for both parameter inference and model selection. 5. Demonstrate these inferential tools on chemical kinetic models in reacting flow, constructing and refining thermochemical and electrochemical models from limited data. Demonstrate Bayesian filtering on canonical stochastic PDEs and in the dynamic estimation of inhomogeneous subsurface properties and flow fields.
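The Bayesian inference at the heart of these objectives can be illustrated, at toy scale, by the random-walk Metropolis algorithm: perturb the parameter, then accept or reject based on the posterior ratio. Everything below (the Gaussian model, flat prior, data, and step size) is an invented example, not the project's methodology:

```python
import math
import random

def log_post(theta, data, sigma=1.0):
    """Log-posterior: Gaussian likelihood with known sigma, flat prior on theta."""
    return -sum((x - theta) ** 2 for x in data) / (2.0 * sigma ** 2)

def metropolis(data, steps=20000, step_size=0.5, seed=1):
    """Random-walk Metropolis sampler for the location parameter theta."""
    random.seed(seed)
    theta, samples = 0.0, []
    for _ in range(steps):
        proposal = theta + random.gauss(0.0, step_size)
        # Accept with probability min(1, posterior ratio), computed in log space:
        if math.log(random.random()) < log_post(proposal, data) - log_post(theta, data):
            theta = proposal
        samples.append(theta)
    return samples[steps // 2:]  # discard the first half as burn-in

data = [2.1, 1.9, 2.3, 2.2, 1.8, 2.0]  # invented noisy observations
posterior = metropolis(data)
print(sum(posterior) / len(posterior))  # close to the sample mean, 2.05
```

For realistic physical models, each posterior evaluation is an expensive forward simulation, which is precisely why the project pursues surrogate models and accelerated samplers rather than plain Metropolis.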

  8. Feasibility of an Assessment Tool for Children's Competence to Consent to Predictive Genetic Testing: a Pilot Study

    NARCIS (Netherlands)

    Hein, Irma M.; Troost, Pieter W.; Lindeboom, Robert; Christiaans, Imke; Grisso, Thomas; van Goudoever, Johannes B.; Lindauer, Ramón J. L.

    2015-01-01

    Knowledge on children's capacities to consent to medical treatment is limited. Also, age limits for asking children's consent vary considerably between countries. Decision-making on predictive genetic testing (PGT) is especially complicated, considering the ongoing ethical debate. In order to

  9. Exploring a new ultrasound score as a clinical predictive tool in patients with rheumatoid arthritis starting abatacept

    DEFF Research Database (Denmark)

    D'Agostino, Maria-Antonietta; Boers, Maarten; Wakefield, Richard J

    2016-01-01

Objectives: To explore whether changes in a composite (power Doppler/greyscale ultrasound (PDUS)) synovitis score, developed by the OMERACT-EULAR-Ultrasound Task Force, predict disease activity outcomes in rheumatoid arthritis (RA). Methods: Patients with RA who were methotrexate inadequate...

  10. Prealbumin/CRP Based Prognostic Score, a New Tool for Predicting Metastasis in Patients with Inoperable Gastric Cancer

    Directory of Open Access Journals (Sweden)

    Ali Esfahani

    2016-01-01

Full Text Available Background. There is considerable dissimilarity in the survival duration of patients with gastric cancer. We aimed to assess the systemic inflammatory response (SIR) and nutritional status of these patients before the commencement of chemotherapy, to identify appropriate prognostic factors and to define a new score for predicting metastasis. Methods. SIR was assessed using the Glasgow Prognostic Score (GPS). A score was then defined as the prealbumin/CRP-based prognostic score (PCPS) and compared with the GPS for predicting metastasis and nutritional status. Results. 71 patients with gastric cancer were recruited to the study; 87% of patients had malnutrition. There was a statistically significant difference between those with metastatic (n=43) and those with nonmetastatic (n=28) gastric cancer in levels of prealbumin and CRP; however, they did not differ with regard to the patient-generated subjective global assessment (PG-SGA) or GPS. The best cut-off value for prealbumin was determined at 0.20 mg/dL, and the PCPS could predict metastasis with 76.5% sensitivity, 63.6% specificity, and 71.4% accuracy. Metastatic and nonmetastatic gastric cancer patients differed in terms of PCPS (P=0.005). Conclusion. The PCPS is suggested for predicting metastasis in patients with gastric cancer. Future studies with larger sample sizes are warranted.

  11. Dietary balanced protein in broiler chickens. 1. A flexible and practical tool to predict dose-response curves

    NARCIS (Netherlands)

    Eits, R.M.; Kwakkel, R.P.; Verstegen, M.W.A.; Hartog, den L.A.

    2005-01-01

1. An empirical model of exponential form was developed, different versions of which can be used to predict growth rate, feed conversion and carcase and breast meat yield of broiler chickens as a function of dietary balanced protein (DBP) content. The model was developed to support decision-making

  12. Hydroregime prediction models for ephemeral groundwater-driven sinkhole wetlands: a planning tool for climate change and amphibian conservation

    Science.gov (United States)

    C. H. Greenberg; S. Goodrick; J. D. Austin; B. R. Parresol

    2015-01-01

Hydroregimes of ephemeral wetlands affect reproductive success of many amphibian species and are sensitive to altered weather patterns associated with climate change. We used 17 years of weekly temperature, precipitation, and water-depth measurements for eight small, ephemeral, groundwater-driven sinkhole wetlands in Florida sandhills to develop a hydroregime predictive...

  13. PiSCES: Pi(scine) stream community estimation software: A tool for nationwide fish assemblage predictions

    Science.gov (United States)

    Background/Question/Methods What species of fish might someone find in a local stream? How might that community change as a result of changes to characteristics of the stream and its watershed? PiSCES is a browser-based toolkit developed to predict a fish community for any NHD-Pl...

  14. Prediction of toxicity and comparison of alternatives using WebTEST (Web-services Toxicity Estimation Software Tool)

    Science.gov (United States)

    A Java-based web service is being developed within the US EPA’s Chemistry Dashboard to provide real time estimates of toxicity values and physical properties. WebTEST can generate toxicity predictions directly from a simple URL which includes the endpoint, QSAR method, and ...

  15. Modeling and validation of a mechanistic tool (MEFISTO) for the prediction of critical power in BWR fuel assemblies

    International Nuclear Information System (INIS)

    Adamsson, Carl; Le Corre, Jean-Marie

    2011-01-01

Highlights: → The MEFISTO code efficiently and accurately predicts the dryout event in a BWR fuel bundle using a mechanistic model. → A hybrid approach between a fast and robust sub-channel analysis and a three-field two-phase analysis is adopted. → The MEFISTO modeling approach, calibration, CPU usage, sensitivity, trend analysis and performance evaluation are presented. → The calibration parameters and process were carefully selected to preserve the mechanistic nature of the code. → The code's dryout prediction performance is near the level of fuel-specific empirical dryout correlations. - Abstract: Westinghouse is currently developing the MEFISTO code with the main goal of achieving fast, robust, practical and reliable prediction of steady-state dryout critical power in Boiling Water Reactor (BWR) fuel bundles based on a mechanistic approach. A computationally efficient simulation scheme was used to achieve this goal, in which the code resolves all relevant field (drop, steam and multi-film) mass balance equations within the annular flow region at the sub-channel level, while relying on a fast and robust two-phase (liquid/steam) sub-channel solution to provide the cross-flow information. The MEFISTO code can hence provide a highly detailed solution of the multi-film flow in a BWR fuel bundle while enhancing flexibility and reducing the computer time by an order of magnitude compared with a standard three-field sub-channel analysis approach. Models for the numerical computation of the one-dimensional field flowrate distributions in an open channel (e.g. a sub-channel), including the numerical treatment of field cross-flows, part-length rods, spacer grids and post-dryout conditions, are presented in this paper. The MEFISTO code is then applied to dryout prediction in a BWR fuel bundle using VIPRE-W as a fast and robust two-phase sub-channel driver code. The dryout power is numerically predicted by iterating on the bundle power so that the minimum film flowrate in the

  16. A strategic framework to utilise venture capital funding to develop manufacturing SMES in South Africa

    Directory of Open Access Journals (Sweden)

    Snyman, Hendrik Andries

    2014-08-01

    Full Text Available SMEs contribute considerably to the national GDP and to private sector employment, but they struggle to gain access to the funding needed to support business sustainability and growth. Venture capital provides the necessary funding, but SMEs lack understanding of the business value curve utilised by financiers to gauge the risk-reward characteristics of an investment. Strategies need to convey how the business model will evolve in order to deliver on the strategic intent. A framework is proposed through which SMEs can develop a strategy aligned with investor requirements. As a case study, the framework is applied to the local tooling sector.

  17. Predictive validity of the identification of seniors at risk screening tool in a German emergency department setting.

    Science.gov (United States)

    Singler, Katrin; Heppner, Hans Jürgen; Skutetzky, Andreas; Sieber, Cornel; Christ, Michael; Thiem, Ulrich

    2014-01-01

The identification of patients at high risk for adverse outcomes [death, unplanned readmission to the emergency department (ED)/hospital, functional decline] plays an important role in emergency medicine. The Identification of Seniors at Risk (ISAR) instrument is one of the most commonly used and best-validated screening tools. To the authors' knowledge, there are so far no data on any screening tool for the identification of older patients at risk of a negative outcome in Germany. The aim was to evaluate the validity of the ISAR screening tool in a German ED. This was a prospective single-center observational cohort study in the ED of an urban university-affiliated hospital. Participants were 520 patients aged ≥75 years consecutively admitted to the ED. The German version of the ISAR screening tool was administered directly after triage of the patients. Follow-up telephone interviews to assess outcome variables were conducted 28 and 180 days after the index visit to the ED. The primary end point was death from any cause, hospitalization, recurrent ED visit, or change of residency to a long-term care facility by day 28 after the index ED visit. The mean age ± SD was 82.8 ± 5.0 years. According to ISAR, 425 patients (81.7%) scored ≥2 points, and 315 patients (60.5%) scored ≥3 points. The combined primary end point was observed in 250 of 520 patients (48.1%) by day 28 and in 260 patients (50.0%) by day 180. Using the continuous ISAR score, the area under the curve was 0.621 (95% confidence interval, CI, 0.573-0.669) on day 28 and 0.661 (95% CI 0.615-0.708) on day 180. The German version of the ISAR screening tool acceptably identified elderly patients in the ED with an increased risk of a negative outcome. Using the cutoff ≥3 points instead of ≥2 points yielded better overall results.

  18. Evaluation of a web based informatics system with data mining tools for predicting outcomes with quantitative imaging features in stroke rehabilitation clinical trials

    Science.gov (United States)

    Wang, Ximing; Kim, Bokkyu; Park, Ji Hoon; Wang, Erik; Forsyth, Sydney; Lim, Cody; Ravi, Ragini; Karibyan, Sarkis; Sanchez, Alexander; Liu, Brent

    2017-03-01

Quantitative imaging biomarkers are widely used in clinical trials for tracking and evaluation of medical interventions. Previously, we have presented a web-based informatics system utilizing quantitative imaging features for predicting outcomes in stroke rehabilitation clinical trials. The system integrates imaging feature extraction tools and a web-based statistical analysis tool. The tools include a generalized linear mixed model (GLMM) that can investigate potential significance and correlation based on features extracted from clinical data and quantitative biomarkers. The imaging feature extraction tools allow the user to collect imaging features, and the GLMM module allows the user to select clinical data and imaging features, such as stroke lesion characteristics, from the database as regressors and regressands. This paper discusses the application scenario and evaluation results of the system in a stroke rehabilitation clinical trial. The system was utilized to manage clinical data and extract imaging biomarkers including stroke lesion volume, location and ventricle/brain ratio. The GLMM module was validated and the efficiency of data analysis was also evaluated.

  19. A predictive tool to estimate the risk of axillary metastases in breast cancer patients with negative axillary ultrasound

    DEFF Research Database (Denmark)

    Meretoja, T J; Heikkilä, P S; Mansfield, A S

    2014-01-01

    BACKGROUND: Sentinel node biopsy (SNB) is the "gold standard" in axillary staging in clinically node-negative breast cancer patients. However, axillary treatment is undergoing a paradigm shift and studies are being conducted on whether SNB may be omitted in low-risk patients. The purpose of this study was to evaluate the risk factors for axillary metastases in breast cancer patients with negative preoperative axillary ultrasound. METHODS: A total of 1,395 consecutive patients with invasive breast cancer and SNB formed the original patient series. A univariate analysis was conducted to assess risk factors for axillary metastases. Binary logistic regression analysis was conducted to form a predictive model based on the risk factors. The predictive model was first validated internally in a patient series of 566 further patients and then externally in a patient series of 2,463 patients from...
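
    A predictive model of the kind described (binary logistic regression on risk factors) can be sketched from scratch with batch gradient descent. The two risk factors and all data below are hypothetical illustrations, not the study's variables or coefficients:

```python
import math

def fit_logistic(X, y, lr=0.5, steps=3000):
    """Binary logistic regression fitted by batch gradient descent.
    X: feature rows (no intercept column), y: 0/1 labels."""
    w = [0.0] * (len(X[0]) + 1)            # w[0] is the intercept
    for _ in range(steps):
        grad = [0.0] * len(w)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))
            err = p - yi                   # gradient of log-loss wrt z
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * g / len(y) for wj, g in zip(w, grad)]
    return w

def predict(w, xi):
    z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical risk factors: [tumour size in cm, lymphovascular invasion 0/1]
X = [[0.5, 0], [1.0, 0], [1.5, 0], [2.0, 1], [3.0, 1], [4.0, 1]]
y = [0, 0, 0, 1, 1, 1]
w = fit_logistic(X, y)
print(predict(w, [4.0, 1]) > predict(w, [0.5, 0]))  # → True
```

    The fitted probabilities play the role of the predicted risk of axillary metastases; internal and external validation then test the model on patient series not used in the fit.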

  20. Basic features of the predictive tools of early warning systems for water-related natural hazards: examples for shallow landslides

    Directory of Open Access Journals (Sweden)

    R. Greco

    2017-12-01

    To manage natural risks, increasing effort is being put into the development of early warning systems (EWS): approaches that face catastrophic phenomena by timely forecasting and by spreading alarms throughout the exposed population. Research efforts aimed at the development and implementation of effective EWS should especially concern the definition and calibration of the interpretative model. This paper analyses the main features characterizing predictive models working in EWS by discussing their aims and their features in terms of model accuracy, the evolutionary stage of the phenomenon at which the prediction is carried out, and model architecture. Original classification criteria based on these features are developed throughout the paper and shown in their practical implementation through examples of flow-like landslides and earth flows, both of which are characterized by rapid evolution and are quite representative of many applications of EWS.

  1. Basic features of the predictive tools of early warning systems for water-related natural hazards: examples for shallow landslides

    Science.gov (United States)

    Greco, Roberto; Pagano, Luca

    2017-12-01

    To manage natural risks, increasing effort is being put into the development of early warning systems (EWS): approaches that face catastrophic phenomena by timely forecasting and by spreading alarms throughout the exposed population. Research efforts aimed at the development and implementation of effective EWS should especially concern the definition and calibration of the interpretative model. This paper analyses the main features characterizing predictive models working in EWS by discussing their aims and their features in terms of model accuracy, the evolutionary stage of the phenomenon at which the prediction is carried out, and model architecture. Original classification criteria based on these features are developed throughout the paper and shown in their practical implementation through examples of flow-like landslides and earth flows, both of which are characterized by rapid evolution and are quite representative of many applications of EWS.

  2. East London Modified-Broset as Decision-Making Tool to Predict Seclusion in Psychiatric Intensive Care Units

    OpenAIRE

    Loi, Felice; Marlowe, Karl

    2017-01-01

    Seclusion is a last-resort intervention for management of aggressive behavior in psychiatric settings. There is currently no objective and practical decision-making instrument for seclusion use on psychiatric wards. Our aim was to test the predictive and discriminatory characteristics of the East London Modified-Broset (ELMB): first, to delineate its decision-making profile for seclusion of adult psychiatric patients, and second, to benchmark it against the psychometric properties of the Broset Violenc...

  3. Improvement of predictive tools for vapor-liquid equilibrium based on group contribution methods applied to lipid technology

    DEFF Research Database (Denmark)

    Damaceno, Daniela S.; Perederic, Olivia A.; Ceriani, Roberta

    2017-01-01

    structures that the first-order functional groups are unable to handle. In the particular case of fatty systems these models are not able to adequately predict the non-ideality in the liquid phase. Consequently, a new set of functional groups is proposed to represent the lipid compounds, requiring thereby....... There are rather small differences between the models and no single model is the best in all cases....

  4. Solution small-angle x-ray scattering as a screening and predictive tool in the fabrication of asymmetric block copolymer membranes

    KAUST Repository

    Dorin, Rachel Mika; Marques, Debora S.; Sai, Hiroaki; Vainio, Ulla; Phillip, William A.; Peinemann, Klaus; Nunes, Suzana Pereira; Wiesner, Ulrich B.

    2012-01-01

    Small-angle X-ray scattering (SAXS) analysis of the diblock copolymer poly(styrene-b-(4-vinyl)pyridine) in a ternary solvent system of 1,4-dioxane, tetrahydrofuran, and N,N-dimethylformamide, and the triblock terpolymer poly(isoprene-b-styrene-b-(4-vinyl)-pyridine) in a binary solvent system of 1,4-dioxane and tetrahydrofuran, reveals a concentration-dependent onset of ordered structure formation. Asymmetric membranes fabricated from casting solutions with polymer concentrations at or slightly below this ordering concentration possess selective layers with the desired nanostructure. In addition to rapidly screening possible polymer solution concentrations, solution SAXS analysis also predicts hexagonal and square pore lattices of the final membrane surface structure. These results suggest solution SAXS as a powerful tool for screening casting solution concentrations and predicting surface structure in the fabrication of asymmetric ultrafiltration membranes from self-assembled block copolymers. © 2012 American Chemical Society.

  5. Solution small-angle x-ray scattering as a screening and predictive tool in the fabrication of asymmetric block copolymer membranes

    KAUST Repository

    Dorin, Rachel Mika

    2012-05-15

    Small-angle X-ray scattering (SAXS) analysis of the diblock copolymer poly(styrene-b-(4-vinyl)pyridine) in a ternary solvent system of 1,4-dioxane, tetrahydrofuran, and N,N-dimethylformamide, and the triblock terpolymer poly(isoprene-b-styrene-b-(4-vinyl)-pyridine) in a binary solvent system of 1,4-dioxane and tetrahydrofuran, reveals a concentration-dependent onset of ordered structure formation. Asymmetric membranes fabricated from casting solutions with polymer concentrations at or slightly below this ordering concentration possess selective layers with the desired nanostructure. In addition to rapidly screening possible polymer solution concentrations, solution SAXS analysis also predicts hexagonal and square pore lattices of the final membrane surface structure. These results suggest solution SAXS as a powerful tool for screening casting solution concentrations and predicting surface structure in the fabrication of asymmetric ultrafiltration membranes from self-assembled block copolymers. © 2012 American Chemical Society.

  6. The predictive ability of the STarT Back Tool was limited in people with chronic low back pain: a prospective cohort study

    Directory of Open Access Journals (Sweden)

    Michelle Kendell

    2018-04-01

    Questions: In people with chronic non-specific low back pain (LBP), what is the predictive and discriminative validity of the STarT Back Tool (SBT) for pain intensity, self-reported LBP-related disability, and global self-perceived change at 1-year follow-up? What is the profile of the SBT risk subgroups with respect to demographic variables, pain intensity, self-reported LBP-related disability, and psychological measures? Design: Prospective cohort study. Participants: A total of 290 adults with dominant axial LBP of ≥ 3 months' duration recruited from the general community, and private physiotherapy, psychology, and pain-management clinics in Western Australia. Outcome measures: The 1-year follow-up measures were pain intensity, LBP-related disability, and global self-perceived change. Results: Outcomes were collected on 264 participants. The SBT categorised 82 participants (28%) as low risk, 116 (40%) as medium risk, and 92 (32%) as high risk. The risk subgroups differed significantly (p < 0.05) on baseline pain, disability, and psychological scores. The SBT's predictive ability was strongest for disability: RR was 2.30 (95% CI 1.28 to 4.10) in the medium-risk group and 2.86 (95% CI 1.60 to 5.11) in the high-risk group. The SBT's predictive ability was weaker for pain: RR was 1.25 (95% CI 1.04 to 1.51) in the medium-risk group and 1.26 (95% CI 1.03 to 1.52) in the high-risk group. For the SBT total score, the AUC was 0.71 (95% CI 0.64 to 0.77) for disability and 0.63 (95% CI 0.55 to 0.71) for pain. Conclusion: This was the first large study to investigate the SBT in a population exclusively with chronic LBP. The SBT provided an acceptable indication of 1-year disability, had poor predictive and discriminative ability for future pain, and was unable to predict or discriminate global perceived change. In this cohort with chronic non-specific LBP, the SBT's predictive and discriminative abilities were restricted to disability at 1 year.
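
    Risk ratios with 95% confidence intervals, such as the RR 2.30 (95% CI 1.28 to 4.10) quoted above, follow from the 2x2 event counts via the standard error of the log risk ratio. A sketch with hypothetical counts (not the study's data):

```python
import math

def risk_ratio(a, n1, b, n2):
    """RR of group 1 vs group 2 with a 95% CI (log method).
    a/n1: events/total in the exposed group; b/n2: events/total in the
    reference group."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)  # SE of ln(RR)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Hypothetical counts: 40/100 disabled in high-risk vs 16/80 in low-risk
rr, lo, hi = risk_ratio(40, 100, 16, 80)
print(round(rr, 2), round(lo, 2), round(hi, 2))  # → 2.0 1.21 3.3
```

    A CI whose lower bound stays above 1, as for disability here, is what distinguishes the SBT's useful predictions from its weak ones for pain.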

  7. The cardiovascular event reduction tool (CERT)--a simplified cardiac risk prediction model developed from the West of Scotland Coronary Prevention Study (WOSCOPS).

    Science.gov (United States)

    L'Italien, G; Ford, I; Norrie, J; LaPuerta, P; Ehreth, J; Jackson, J; Shepherd, J

    2000-03-15

    The clinical decision to treat hypercholesterolemia is premised on an awareness of patient risk, and cardiac risk prediction models offer a practical means of determining such risk. However, these models are based on observational cohorts where estimates of the treatment benefit are largely inferred. The West of Scotland Coronary Prevention Study (WOSCOPS) provides an opportunity to develop a risk-benefit prediction model from the actual observed primary event reduction seen in the trial. Five-year Cox model risk estimates were derived from all WOSCOPS subjects (n = 6,595 men, aged 45 to 64 years old at baseline) using factors previously shown to be predictive of definite fatal coronary heart disease or nonfatal myocardial infarction. Model risk factors included age, diastolic blood pressure, total cholesterol/high-density lipoprotein ratio (TC/HDL), current smoking, diabetes, family history of fatal coronary heart disease, nitrate use or angina, and treatment (placebo/40-mg pravastatin). All risk factors were expressed as categorical variables to facilitate risk assessment. Risk estimates were incorporated into a simple, hand-held slide rule or risk tool. Risk estimates were identified for 5-year age bands (45 to 65 years), 4 categories of TC/HDL ratio (the highest being ≥ 7.5), 2 levels of diastolic blood pressure (< 90 or ≥ 90 mm Hg), from 0 to 3 additional risk factors (current smoking, diabetes, family history of premature fatal coronary heart disease, nitrate use or angina), and pravastatin treatment. Five-year risk estimates ranged from 2% in very low-risk subjects to 61% in very high-risk subjects. Risk reduction due to pravastatin treatment averaged 31%. Thus, the Cardiovascular Event Reduction Tool (CERT) is a risk prediction model derived from the WOSCOPS trial. Its use will help physicians identify patients who will benefit from cholesterol reduction.
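
    A slide-rule model like CERT maps categorical risk factors to a 5-year event probability through the Cox relation risk = 1 − S0^exp(Σβ). The sketch below illustrates only the mechanics: the coefficients and baseline survival are invented for illustration, not the WOSCOPS estimates:

```python
import math

# Hypothetical log-hazard coefficients -- illustrative only,
# NOT the coefficients estimated from WOSCOPS.
BETA = {
    "age_55_plus": 0.50,
    "tc_hdl_high": 0.60,
    "dbp_90_plus": 0.30,
    "smoker": 0.55,
    "diabetes": 0.70,
    "pravastatin": -0.37,    # roughly a 31% relative risk reduction
}
BASELINE_5Y_SURVIVAL = 0.96  # hypothetical S0 at 5 years

def five_year_risk(factors):
    """Cox-type risk: 1 - S0 ** exp(sum of active coefficients)."""
    lp = sum(BETA[f] for f in factors)
    return 1.0 - BASELINE_5Y_SURVIVAL ** math.exp(lp)

low = five_year_risk([])
high = five_year_risk(["age_55_plus", "tc_hdl_high", "smoker", "diabetes"])
treated = five_year_risk(["age_55_plus", "tc_hdl_high", "smoker",
                          "diabetes", "pravastatin"])
print(low < treated < high)  # → True
```

    Because every factor is categorical, the whole model reduces to a finite lookup table, which is exactly what makes a hand-held slide rule feasible.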

  8. GLASS MELTING PHENOMENA, THEIR ORDERING AND MELTING SPACE UTILISATION

    Directory of Open Access Journals (Sweden)

    Němec L.

    2013-12-01

    Four aspects of effective glass melting have been defined – namely the fast kinetics of partial melting phenomena, a consideration of the ordering of melting phenomena, high utilisation of the melting space, and effective utilisation of the supplied energy. Relations were defined for the specific melting performance and specific energy consumption of the glass melting process which involve the four mentioned aspects of the process and indicate the potentials of effective melting. The quantity "space utilisation" has been treated in more detail, as an aspect not previously considered in practice. Space utilisation was quantitatively defined and its values were determined for an industrial melting facility by mathematical modelling. The definitions of the specific melting performance and specific energy consumption were used to assess the potential impact of a controlled melt flow and high space utilisation on the melting process efficiency on the industrial scale. The results show that even partial control of the melt flow, leading to a partial increase of the space utilisation, may considerably increase the melting performance, while the decrease in specific energy consumption was determined to be between 10 and 15%.

  9. Computational tools and resources for metabolism-related property predictions. 1. Overview of publicly available (free and commercial) databases and software.

    Science.gov (United States)

    Peach, Megan L; Zakharov, Alexey V; Liu, Ruifeng; Pugliese, Angelo; Tawa, Gregory; Wallqvist, Anders; Nicklaus, Marc C

    2012-10-01

    Metabolism has been identified as a defining factor in drug development success or failure because of its impact on many aspects of drug pharmacology, including bioavailability, half-life and toxicity. In this article, we provide an outline and descriptions of the resources for metabolism-related property predictions that are currently either freely or commercially available to the public. These resources include databases with data on, and software for prediction of, several end points: metabolite formation, sites of metabolic transformation, binding to metabolizing enzymes and metabolic stability. We attempt to place each tool in historical context and describe, wherever possible, the data it was based on. For predictions of interactions with metabolizing enzymes, we show a typical set of results for a small test set of compounds. Our aim is to give a clear overview of the areas and aspects of metabolism prediction in which the currently available resources are useful and accurate, and the areas in which they are inadequate or missing entirely.

  10. Analysis and prediction of agricultural pest dynamics with Tiko'n, a generic tool to develop agroecological food web models

    Science.gov (United States)

    Malard, J. J.; Rojas, M.; Adamowski, J. F.; Anandaraja, N.; Tuy, H.; Melgar-Quiñonez, H.

    2016-12-01

    While several well-validated crop growth models are currently widely used, very few crop pest models of the same caliber have been developed or applied, and pest models that take trophic interactions into account are even rarer. This may be due to several factors, including 1) the difficulty of representing complex agroecological food webs in a quantifiable model, and 2) the general belief that pesticides effectively remove insect pests from immediate concern. However, pests currently claim a substantial amount of harvests every year (and account for additional control costs), and the impact of insects and of their trophic interactions on agricultural crops cannot be ignored, especially in the context of changing climates and increasing pressures on crops across the globe. Unfortunately, most integrated pest management frameworks rely on very simple models (if at all), and most examples of successful agroecological management remain more anecdotal than scientifically replicable. In light of this, there is a need for validated and robust agroecological food web models that allow users to predict the response of these webs to changes in management, crops or climate, both in order to predict future pest problems under a changing climate as well as to develop effective integrated management plans. Here we present Tiko'n, a Python-based software whose API allows users to rapidly build and validate trophic web agroecological models that predict pest dynamics in the field. The programme uses a Bayesian inference approach to calibrate the models according to field data, allowing for the reuse of literature data from various sources and reducing the need for extensive field data collection. We apply the model to the coconut black-headed caterpillar (Opisina arenosella) and associated parasitoid data from Sri Lanka, showing how the modeling framework can be used to rapidly develop, calibrate and validate models that elucidate how the internal structures of food webs
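
    The Bayesian calibration step described for Tiko'n can be illustrated on a much smaller scale: a Metropolis sampler fitting the growth rate of a one-species logistic model to noisy synthetic observations. The model, data, and priors here are all hypothetical stand-ins for a full trophic web, not Tiko'n's actual machinery:

```python
import math
import random

random.seed(42)

def simulate(r, n0=10.0, K=100.0, steps=20):
    """Discrete logistic growth; returns the population trajectory."""
    traj, n = [], n0
    for _ in range(steps):
        n = n + r * n * (1 - n / K)
        traj.append(n)
    return traj

# Synthetic 'field data' generated from a true r of 0.3 plus noise
TRUE_R = 0.3
data = [x + random.gauss(0, 2) for x in simulate(TRUE_R)]

def log_post(r):
    """Log posterior: flat prior on (0, 1) plus Gaussian likelihood."""
    if not 0.0 < r < 1.0:
        return -math.inf
    model = simulate(r)
    return -sum((m - d) ** 2 for m, d in zip(model, data)) / (2 * 2 ** 2)

# Metropolis random walk over the growth rate r
r, lp, samples = 0.5, log_post(0.5), []
for _ in range(4000):
    prop = r + random.gauss(0, 0.05)
    lp_prop = log_post(prop)
    if math.log(random.random()) < lp_prop - lp:
        r, lp = prop, lp_prop
    samples.append(r)

posterior_mean = sum(samples[1000:]) / len(samples[1000:])  # drop burn-in
print(round(posterior_mean, 2))
```

    The posterior mean lands near the true growth rate, which is the property that lets Tiko'n-style calibration reuse heterogeneous literature data instead of demanding exhaustive field campaigns.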

  11. WaLIDD score, a new tool to diagnose dysmenorrhea and predict medical leave in university students

    Science.gov (United States)

    Teherán, Aníbal A; Piñeros, Luis Gabriel; Pulido, Fabián; Mejía Guatibonza, María Camila

    2018-01-01

    Background: Dysmenorrhea is a frequent and misdiagnosed symptom affecting the quality of life in young women. A working ability, location, intensity, days of pain, dysmenorrhea (WaLIDD) score was designed to diagnose dysmenorrhea and to predict medical leave. Methods: This cross-sectional design included young medical students, who completed a self-administered questionnaire that contained the verbal rating score (VRS; pain and drug subscales) and WaLIDD scales. The correlation between scales was established through the Spearman test. The area under the receiver operating characteristic (ROC) curve, sensitivity, specificity, and likelihood ratios (LR+/−) were evaluated to diagnose students availing medical leave due to dysmenorrhea; moreover, to predict medical leave in students with dysmenorrhea, a binary logistic regression was performed. Results: In all, 585 students, with a mean age of 21 years and menarche at 12 years, participated. Most of them had regular cycles, 5 days of menstrual blood flow, and 1–2 days of lower abdominal pain. The WaLIDD scale presented an adequate internal consistency and strong correlation with the VRS subscales. With a cutoff of >6 for WaLIDD and >2 for the VRS subscales (drug subscale and pain subscale) to identify students with dysmenorrhea, these scales presented an area under the curve (AUC) ROC of 0.82, 0.62, and 0.67, respectively. To identify students taking medical leave due to dysmenorrhea, WaLIDD (cutoff >9) and VRS subscales (cutoff >2) presented an AUC ROC of 0.97, 0.68, and 0.81; moreover, the WaLIDD scale showed a good LR+ of 14.2 (95% CI, 13.5–14.9), LR− of 0.00 (95% CI, undefined), and predictive risk (OR 5.38; 95% CI, 1.78–16.2). Conclusion: This research allowed a comparison of the capabilities of two multidimensional scales, one previously validated and one new, to discriminate among the general population of medical students those with dysmenorrhea or those availing medical leave secondary to dysmenorrhea
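
    Sensitivity, specificity, and likelihood ratios at a given cutoff, as reported for the WaLIDD scale, come straight from the 2x2 classification table. A sketch with hypothetical counts (not the study's table):

```python
def screen_metrics(tp, fn, fp, tn):
    """Sensitivity, specificity and likelihood ratios from a 2x2 table."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    lr_pos = sens / (1 - spec) if spec < 1 else float("inf")
    lr_neg = (1 - sens) / spec
    return sens, spec, lr_pos, lr_neg

# Hypothetical 2x2 table for a WaLIDD-style cutoff (score > 9)
sens, spec, lr_pos, lr_neg = screen_metrics(tp=28, fn=2, fp=40, tn=515)
print(round(sens, 2), round(spec, 2), round(lr_pos, 2), round(lr_neg, 2))
```

    A large LR+ and a near-zero LR−, as in the abstract, mean a positive score strongly raises and a negative score strongly lowers the post-test odds of needing medical leave.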

  12. HLArestrictor-a tool for patient-specific predictions of HLA restriction elements and optimal epitopes within peptides

    DEFF Research Database (Denmark)

    Larsen, Malene Erup; Kloverpris, H.; Stryhn, A.

    2011-01-01

    (http://www.cbs.dtu.dk/services/HLArestrictor), which is based on the highly versatile and accurate NetMHCpan predictor, which here has been optimized for the identification of both the MHC restriction element and the corresponding minimal epitope of a T cell response in a given individual. As input, it requires high-resolution (i.e., 4-digit) HLA...... HLA restrictions and minimal epitopes for about 90% of the positive peptide/patient pairs while rejecting more than 95% of the negative peptide-HLA pairs. Furthermore, for 18 peptide/HLA tetramer validated responses, HLArestrictor in all cases predicted both the HLA restriction element and minimal...

  13. Introduction of an Evaluation Tool to Predict the Probability of Success of Companies: The Innovativeness, Capabilities and Potential Model (ICP)

    Directory of Open Access Journals (Sweden)

    Michael Lewrick

    2009-05-01

    Successful innovation requires management, and this paper presents a model to help manage the innovation process. The model can be used to audit the management capability to innovate and to monitor how sales growth relates to innovativeness. It was developed from a study of companies in the high-technology cluster around Munich and validated using statistical procedures. The model was found to be effective at predicting the success or otherwise of the innovation strategy pursued by the company. The use of this model and how it can be used to identify areas for improvement are documented in this paper.

  14. Predicting European Union recessions in the euro era: The yield curve as a forecasting tool of economic activity

    OpenAIRE

    Gogas, Periklis; Chionis, Dionisios; Pragkidis, Ioannis

    2009-01-01

    Several studies have established the predictive power of the yield curve, i.e., the difference between long- and short-term bond rates, in terms of real economic activity, for the U.S. and various European countries. In this paper we use data from the European Union (EU15), ranging from 1994:Q1 to 2008:Q3. The seasonally adjusted real GDP is used to extract the long-run trend and the cyclical component of European output, while the European Central Bank's euro area government benchmark bonds...

  15. Is monocyte HLA-DR expression monitoring a useful tool to predict the risk of secondary infection?

    Science.gov (United States)

    Lukaszewicz, A-C; Faivre, V; Payen, D

    2010-09-01

    Downregulation of the immune response is common among Intensive Care Unit (ICU) patients after an acute inflammatory injury, whether it was septic or not. Such a modification could be seen as an adaptation to attenuate the effects of the inflammatory storm on tissues, but it exposes the subject to the risk of nosocomial infection and impairs recovery processes. The intensity of immunity downregulation is difficult to characterize, since clinical presentation is silent and non-specific, which urges the use of tools for immune monitoring. This review focuses on the use of monocyte HLA-DR expression to detect immune hyporesponsiveness and to select the appropriate immunomodulating therapy, as well as the efficiency of this technique in controlling secondary infections.

  16. Autoregressive models as a tool to discriminate chaos from randomness in geoelectrical time series: an application to earthquake prediction

    Directory of Open Access Journals (Sweden)

    C. Serio

    1997-06-01

    The time dynamics of geoelectrical precursory time series has been investigated and a method to discriminate chaotic behaviour in geoelectrical precursory time series is proposed. It allows us to detect low-dimensional chaos when the only information about the time series comes from the time series themselves. The short-term predictability of these time series is evaluated using two possible forecasting approaches: global autoregressive approximation and local autoregressive approximation. The first views the data as a realization of a linear stochastic process, whereas the second considers the data points as a realization of a deterministic process, supposedly non-linear. The comparison of the predictive skill of the two techniques is a test to discriminate between low-dimensional chaos and random dynamics. The analyzed time series are geoelectrical measurements recorded by an automatic station located in Tito (Southern Italy), in one of the most seismic areas of the Mediterranean region. Our findings are that the global (linear) approach is superior to the local one and that the physical system governing the phenomena of electrical nature is characterized by a large number of degrees of freedom. Power spectra of the filtered time series follow a P(f) ∝ f^−a scaling law: they exhibit the typical behaviour of a broad class of fractal stochastic processes and they are a signature of self-organized systems.
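
    The paper's comparison of a global linear autoregressive predictor with a local one can be sketched on synthetic data: a least-squares AR(1) fit versus a nearest-neighbour analog forecaster, both scored by out-of-sample error. The series below is simulated, not the geoelectrical data:

```python
import random

random.seed(0)

# Synthetic AR(1) series: x[t] = 0.8 * x[t-1] + Gaussian noise
x, xt = [], 0.0
for _ in range(500):
    xt = 0.8 * xt + random.gauss(0, 1)
    x.append(xt)
train, test = x[:400], x[400:]

# Global linear model: least-squares AR(1) coefficient
num = sum(a * b for a, b in zip(train[:-1], train[1:]))
den = sum(a * a for a in train[:-1])
phi = num / den

def local_predict(history, current, k=5):
    """Local approximation: average the successors of the k past
    values nearest to the current one."""
    nearest = sorted(range(len(history) - 1),
                     key=lambda i: abs(history[i] - current))[:k]
    return sum(history[i + 1] for i in nearest) / k

def rmse(errors):
    return (sum(e * e for e in errors) / len(errors)) ** 0.5

glob = rmse([test[t + 1] - phi * test[t] for t in range(len(test) - 1)])
loc = rmse([test[t + 1] - local_predict(train, test[t])
            for t in range(len(test) - 1)])
print(round(phi, 2), round(glob, 2), round(loc, 2))
```

    On genuinely linear stochastic dynamics the global fit typically matches or beats the local analogs, which is the signature the authors use to argue against low-dimensional chaos in the geoelectrical records.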

  17. Generalized additive models used to predict species abundance in the Gulf of Mexico: an ecosystem modeling tool.

    Directory of Open Access Journals (Sweden)

    Michael Drexler

    Spatially explicit ecosystem models of all types require an initial allocation of biomass, often in areas where fisheries-independent abundance estimates do not exist. A generalized additive modelling (GAM) approach is used to describe the abundance of 40 species groups (i.e., functional groups) across the Gulf of Mexico (GoM) using a large fisheries-independent data set (SEAMAP) and climate-scale oceanographic conditions. Predictor variables included in the model are chlorophyll a, sediment type, dissolved oxygen, temperature, and depth. Despite the presence of a large number of zeros in the data, a single GAM using a negative binomial distribution was suitable to make predictions of abundance for multiple functional groups. We present an example case study using pink shrimp (Farfantepenaeus duorarum) and compare the results to known distributions. The model successfully predicts the known areas of high abundance in the GoM, including those areas where no data were input into the model fitting. Overall, the model reliably captures areas of high and low abundance for the large majority of functional groups observed in SEAMAP. The result of this method allows for the objective setting of spatial distributions for numerous functional groups across a modeling domain, even where abundance data may not exist.

  18. The predictive ability of the STarT Back Tool was limited in people with chronic low back pain: a prospective cohort study.

    Science.gov (United States)

    Kendell, Michelle; Beales, Darren; O'Sullivan, Peter; Rabey, Martin; Hill, Jonathan; Smith, Anne

    2018-04-01

    In people with chronic non-specific low back pain (LBP), what is the predictive and discriminative validity of the STarT Back Tool (SBT) for pain intensity, self-reported LBP-related disability, and global self-perceived change at 1-year follow-up? What is the profile of the SBT risk subgroups with respect to demographic variables, pain intensity, self-reported LBP-related disability, and psychological measures? Prospective cohort study. A total of 290 adults with dominant axial LBP of ≥ 3 months' duration recruited from the general community, and private physiotherapy, psychology, and pain-management clinics in Western Australia. The 1-year follow-up measures were pain intensity, LBP-related disability, and global self-perceived change. Outcomes were collected on 264 participants. The SBT categorised 82 participants (28%) as low risk, 116 (40%) as medium risk, and 92 (32%) as high risk. The risk subgroups differed significantly (p < 0.05) on baseline pain, disability, and psychological scores. The SBT's predictive ability was strongest for disability: RR was 2.30 (95% CI 1.28 to 4.10) in the medium-risk group and 2.86 (95% CI 1.60 to 5.11) in the high-risk group. The SBT's predictive ability was weaker for pain: RR was 1.25 (95% CI 1.04 to 1.51) in the medium-risk group and 1.26 (95% CI 1.03 to 1.52) in the high-risk group. For the SBT total score, the AUC was 0.71 (95% CI 0.64 to 0.77) for disability and 0.63 (95% CI 0.55 to 0.71) for pain. This was the first large study to investigate the SBT in a population exclusively with chronic LBP. The SBT provided an acceptable indication of 1-year disability, had poor predictive and discriminative ability for future pain, and was unable to predict or discriminate global perceived change. In this cohort with chronic non-specific LBP, the SBT's predictive and discriminative abilities were restricted to disability at 1 year.

  19. Utilisation of phosphate by jute from jute growing soils

    International Nuclear Information System (INIS)

    Ray, P.K.; Suiha, A.K.

    1974-01-01

    The uptake and utilisation of phosphate from different jute-growing soils of West Bengal, viz., Humaipur (24-Parganas), Haripal (Hooghly), Panagarh (Burdwan) and the Jute Agricultural Research Institute, Barrackpore (24-Parganas), were studied in pots under fertilizer combinations of NP, PK and NPK. The soils from the 24-Parganas district behaved in a similar manner with respect to dry matter yield, fertilizer P uptake and per cent utilisation of added P. The more P-deficient of the two, the Humaipur soil, showed comparatively higher P utilisation. The other two soils, Haripal and Panagarh, though of different origin, behaved similarly. The highest contribution of soil P to the crop came from the P-rich soil (J.A.R.I.), even though it showed minimum P fixation. (author)

  20. Utilisation of antibiotic therapy in community practice.

    LENUS (Irish Health Repository)

    McGowan, B

    2008-10-01

    The aim of the study was to identify outpatient antibiotic consumption between Jan 2000 and Dec 2005 through analysis of the HSE-Primary Care Reimbursement Services (PCRS) database as part of the Surveillance of Antimicrobial Resistance in Ireland (SARI) project. Total antibiotic consumption on the PCRS scheme between January 2000 and December 2005, expressed in Defined Daily Doses per 1000 PCRS inhabitants per day (DID), increased by 26%. The penicillin group represents the highest consumption, accounting for approximately 50% of total outpatient antibiotic use; total DIDs for this group increased by 25% between 2000 and 2005. Co-amoxiclav and amoxicillin account for 80% of the total consumption of this group of anti-infectives. With the exception of aminoglycosides and sulfonamides, which showed decreases in DID consumption of 47% and 8% respectively, all other groups of anti-infectives had an increase in DID consumption of greater than 25% during the study period. Antibiotic prescribing data are a valuable tool for assessing public health strategies aiming to optimise antibiotic prescribing.
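
    Consumption expressed in Defined Daily Doses per 1000 inhabitants per day (DID) is a simple normalisation of total dispensed DDDs by population and time. A sketch with hypothetical figures (not the PCRS data):

```python
def did(total_ddd, population, days):
    """Defined Daily Doses per 1000 inhabitants per day."""
    return total_ddd * 1000.0 / (population * days)

# Hypothetical: 2.19 million DDDs dispensed to a covered
# population of 300,000 over one year
print(round(did(2_190_000, 300_000, 365), 2))  # → 20.0
```

    Normalising to DID is what makes consumption comparable across years and regions with different covered populations, which underpins the 26% increase reported above.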