WorldWideScience

Sample records for de novo computational prediction

  1. MRUniNovo: an efficient tool for de novo peptide sequencing utilizing the hadoop distributed computing framework.

    Science.gov (United States)

    Li, Chuang; Chen, Tao; He, Qiang; Zhu, Yunping; Li, Kenli

    2017-03-15

    Tandem mass spectrometry-based de novo peptide sequencing is a complex and time-consuming process. The current algorithms for de novo peptide sequencing cannot rapidly and thoroughly process large mass spectrometry datasets. In this paper, we propose MRUniNovo, a novel tool for parallel de novo peptide sequencing. MRUniNovo parallelizes UniNovo based on the Hadoop compute platform. Our experimental results demonstrate that MRUniNovo significantly reduces the computation time of de novo peptide sequencing without sacrificing the correctness and accuracy of the results, and thus can process very large datasets that UniNovo cannot. MRUniNovo is an open source software tool implemented in Java. The source code and the parameter settings are available at http://bioinfo.hupo.org.cn/MRUniNovo/index.php. Contact: s131020002@hnu.edu.cn; taochen1019@163.com. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
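
    The abstract does not spell out MRUniNovo's Hadoop job layout, so the following is only a minimal sketch of the general map/reduce idea behind parallel de novo sequencing: partition the spectra across workers, sequence each spectrum independently, and merge the per-spectrum results. Python's multiprocessing stands in for Hadoop here, and sequence_spectrum is a hypothetical placeholder for a real UniNovo invocation.

```python
# Illustrative sketch only: spectra are partitioned across workers ("map"),
# each worker runs de novo sequencing independently, and the results are
# merged ("reduce"). MRUniNovo itself distributes UniNovo tasks on Hadoop.
from multiprocessing import Pool

def sequence_spectrum(spectrum):
    """Hypothetical stand-in for a per-spectrum de novo sequencing call."""
    peaks = sorted(spectrum["peaks"])
    # ... a real tool would build a spectrum graph and score candidate peptides ...
    return {"id": spectrum["id"], "peptide": "PEPTIDE", "score": len(peaks)}

def run_parallel(spectra, n_workers=4):
    with Pool(n_workers) as pool:                    # map phase
        results = pool.map(sequence_spectrum, spectra)
    return {r["id"]: r for r in results}             # merge/reduce phase

if __name__ == "__main__":
    demo = [{"id": i, "peaks": [100.1 + i, 200.2, 300.3]} for i in range(8)]
    print(run_parallel(demo))
```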

  2. de novo computational enzyme design.

    Science.gov (United States)

    Zanghellini, Alexandre

    2014-10-01

    Recent advances in systems and synthetic biology as well as metabolic engineering are poised to transform industrial biotechnology by allowing us to design cell factories for the sustainable production of valuable fuels and chemicals. To deliver on their promises, such cell factories, as much as their brick-and-mortar counterparts, will require appropriate catalysts, especially for classes of reactions that are not known to be catalyzed by enzymes in natural organisms. A recently developed methodology, de novo computational enzyme design, can be used to create enzymes catalyzing novel reactions. Here we review the different classes of chemical reactions for which active protein catalysts have been designed as well as the results of detailed biochemical and structural characterization studies. We also discuss how combining de novo computational enzyme design with more traditional protein engineering techniques can alleviate the shortcomings of state-of-the-art computational design techniques and create novel enzymes with catalytic proficiencies on par with natural enzymes. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Sequential search leads to faster, more efficient fragment-based de novo protein structure prediction.

    Science.gov (United States)

    de Oliveira, Saulo H P; Law, Eleanor C; Shi, Jiye; Deane, Charlotte M

    2018-04-01

    Most current de novo structure prediction methods randomly sample protein conformations and thus require large amounts of computational resource. Here, we consider a sequential sampling strategy, building on ideas from recent experimental work which shows that many proteins fold cotranslationally. We have investigated whether a pseudo-greedy search approach, which begins sequentially from one of the termini, can improve the performance and accuracy of de novo protein structure prediction. We observed that our sequential approach converges when fewer than 20 000 decoys have been produced, fewer than commonly expected. Using our software, SAINT2, we also compared the run time and quality of models produced in a sequential fashion against a standard, non-sequential approach. Sequential prediction produces an individual decoy 1.5-2.5 times faster than non-sequential prediction. When considering the quality of the best model, sequential prediction led to a better model being produced for 31 out of 41 soluble protein validation cases and for 18 out of 24 transmembrane protein cases. Correct models (TM-Score > 0.5) were produced for 29 of these cases by the sequential mode and for only 22 by the non-sequential mode. Our comparison reveals that a sequential search strategy can be used to drastically reduce computational time of de novo protein structure prediction and improve accuracy. Data are available for download from: http://opig.stats.ox.ac.uk/resources. SAINT2 is available for download from: https://github.com/sauloho/SAINT2. saulo.deoliveira@dtc.ox.ac.uk. Supplementary data are available at Bioinformatics online.
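
    The abstract describes the search strategy only at a high level; the toy below sketches what a pseudo-greedy, sequential fragment-assembly loop can look like: grow the model one position at a time from one terminus, keeping the lowest-scoring extension among a handful of sampled candidates. The energy function and fragment representation are hypothetical, not SAINT2's.

```python
import random

def energy(partial_model):
    """Hypothetical stand-in for a coarse-grained scoring function (lower is better)."""
    return sum(partial_model)

def sequential_assemble(fragment_pools, candidates_per_step=10):
    """Pseudo-greedy sequential growth: extend from the first position onwards,
    keeping the best extension among a limited number of sampled candidates."""
    model = []
    for pool in fragment_pools:            # one pool of candidate fragments per position
        best_frag, best_e = None, float("inf")
        for _ in range(candidates_per_step):
            frag = random.choice(pool)
            e = energy(model + [frag])
            if e < best_e:
                best_frag, best_e = frag, e
        model.append(best_frag)
    return model, energy(model)

# toy pools: each "fragment" is a number standing in for a short backbone segment
pools = [[random.random() for _ in range(50)] for _ in range(20)]
print(sequential_assemble(pools)[1])
```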

  4. Building a better fragment library for de novo protein structure prediction.

    Directory of Open Access Journals (Sweden)

    Saulo H P de Oliveira

    Full Text Available Fragment-based approaches are the current standard for de novo protein structure prediction. These approaches rely on accurate and reliable fragment libraries to generate good structural models. In this work, we describe a novel method for structure fragment library generation and its application in fragment-based de novo protein structure prediction. The importance of correct testing procedures in assessing the quality of fragment libraries is demonstrated; in particular, homologs of the target must be excluded from the libraries to correctly simulate a de novo protein structure prediction scenario, something which, surprisingly, is not always done. We demonstrate that fragments presenting different predominant predicted secondary structures should be treated differently during the fragment library generation step and that exhaustive and random search strategies should both be used. This information was used to develop a novel method, Flib. On a validation set of 41 structurally diverse proteins, Flib libraries present both higher precision and higher coverage than two state-of-the-art methods, NNMake and HHFrag. Flib also achieves better precision and coverage on the set of 275 protein domains used in the two previous experiments of the Critical Assessment of Structure Prediction (CASP9 and CASP10). We compared Flib libraries against NNMake libraries in a structure prediction context. Of the 13 cases in which a correct answer was generated, Flib models were more accurate than NNMake models for 10. Flib is available for download at: http://www.stats.ox.ac.uk/research/proteins/resources.

  5. Building a Better Fragment Library for De Novo Protein Structure Prediction

    Science.gov (United States)

    de Oliveira, Saulo H. P.; Shi, Jiye; Deane, Charlotte M.

    2015-01-01

    Fragment-based approaches are the current standard for de novo protein structure prediction. These approaches rely on accurate and reliable fragment libraries to generate good structural models. In this work, we describe a novel method for structure fragment library generation and its application in fragment-based de novo protein structure prediction. The importance of correct testing procedures in assessing the quality of fragment libraries is demonstrated; in particular, homologs of the target must be excluded from the libraries to correctly simulate a de novo protein structure prediction scenario, something which, surprisingly, is not always done. We demonstrate that fragments presenting different predominant predicted secondary structures should be treated differently during the fragment library generation step and that exhaustive and random search strategies should both be used. This information was used to develop a novel method, Flib. On a validation set of 41 structurally diverse proteins, Flib libraries present both higher precision and higher coverage than two state-of-the-art methods, NNMake and HHFrag. Flib also achieves better precision and coverage on the set of 275 protein domains used in the two previous experiments of the Critical Assessment of Structure Prediction (CASP9 and CASP10). We compared Flib libraries against NNMake libraries in a structure prediction context. Of the 13 cases in which a correct answer was generated, Flib models were more accurate than NNMake models for 10. Flib is available for download at: http://www.stats.ox.ac.uk/research/proteins/resources. PMID:25901595
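
    Precision and coverage are the two library-quality measures quoted above. A common way to compute them (assumed here; the abstract does not give the exact cutoffs) is: precision is the fraction of all fragments whose RMSD to the native structure falls below a threshold, and coverage is the fraction of target positions with at least one such "good" fragment.

```python
def library_precision_and_coverage(library, rmsd_to_native, cutoff=1.5):
    """library: {target position: [fragment ids]};
    rmsd_to_native: {(position, fragment id): RMSD in Angstroms};
    a fragment counts as 'good' when its RMSD to the native structure < cutoff."""
    n_frags = n_good = covered = 0
    for pos, frags in library.items():
        good_here = [f for f in frags if rmsd_to_native[(pos, f)] < cutoff]
        n_frags += len(frags)
        n_good += len(good_here)
        covered += bool(good_here)
    precision = n_good / n_frags if n_frags else 0.0
    coverage = covered / len(library) if library else 0.0
    return precision, coverage

# toy example: two positions, three candidate fragments each
lib = {1: ["a", "b", "c"], 2: ["d", "e", "f"]}
rmsds = {(1, "a"): 0.8, (1, "b"): 2.3, (1, "c"): 3.0,
         (2, "d"): 2.1, (2, "e"): 2.4, (2, "f"): 4.0}
print(library_precision_and_coverage(lib, rmsds))  # -> (0.1666..., 0.5)
```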

  6. Predicting survival of de novo metastatic breast cancer in Asian women: systematic review and validation study.

    Science.gov (United States)

    Miao, Hui; Hartman, Mikael; Bhoo-Pathy, Nirmala; Lee, Soo-Chin; Taib, Nur Aishah; Tan, Ern-Yu; Chan, Patrick; Moons, Karel G M; Wong, Hoong-Seam; Goh, Jeremy; Rahim, Siti Mastura; Yip, Cheng-Har; Verkooijen, Helena M

    2014-01-01

    In Asia, up to 25% of breast cancer patients present with distant metastases at diagnosis. Given the heterogeneous survival probabilities of de novo metastatic breast cancer, individual outcome prediction is challenging. The aim of the study is to identify existing prognostic models for patients with de novo metastatic breast cancer and validate them in Asia. We performed a systematic review to identify prediction models for metastatic breast cancer. Models were validated in 642 women with de novo metastatic breast cancer registered between 2000 and 2010 in the Singapore Malaysia Hospital Based Breast Cancer Registry. Survival curves for low, intermediate and high-risk groups according to each prognostic score were compared by log-rank test and discrimination of the models was assessed by concordance statistic (C-statistic). We identified 16 prediction models, seven of which were for patients with brain metastases only. Performance status, estrogen receptor status, metastatic site(s) and disease-free interval were the most common predictors. We were able to validate nine prediction models. The capacity of the models to discriminate between poor and good survivors varied from poor to fair with C-statistics ranging from 0.50 (95% CI, 0.48-0.53) to 0.63 (95% CI, 0.60-0.66). The discriminatory performance of existing prediction models for de novo metastatic breast cancer in Asia is modest. Development of an Asian-specific prediction model is needed to improve prognostication and guide decision making.
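
    The C-statistic quoted above measures discrimination: across comparable patient pairs, how often the patient given the higher risk score actually dies earlier. A minimal sketch with simplified handling of censoring (a pair is usable only when the earlier follow-up time is an observed death) is shown below; standard implementations such as Harrell's C treat ties and censoring more carefully.

```python
from itertools import combinations

def c_statistic(times, events, risk_scores):
    """times: survival/censoring times; events: 1 = death observed, 0 = censored;
    risk_scores: higher score = predicted worse prognosis.
    A pair is usable when the earlier follow-up time corresponds to an observed event."""
    concordant = tied = usable = 0
    for i, j in combinations(range(len(times)), 2):
        if times[j] < times[i]:          # order the pair so i has the shorter time
            i, j = j, i
        if times[i] == times[j] or events[i] == 0:
            continue                     # not usable under this simplified rule
        usable += 1
        if risk_scores[i] > risk_scores[j]:
            concordant += 1
        elif risk_scores[i] == risk_scores[j]:
            tied += 1
    return (concordant + 0.5 * tied) / usable if usable else float("nan")

print(c_statistic([5, 8, 12, 20], [1, 1, 0, 1], [0.9, 0.7, 0.4, 0.2]))  # -> 1.0
```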

  7. Predicting survival of de novo metastatic breast cancer in Asian women: systematic review and validation study.

    Directory of Open Access Journals (Sweden)

    Hui Miao

    Full Text Available BACKGROUND: In Asia, up to 25% of breast cancer patients present with distant metastases at diagnosis. Given the heterogeneous survival probabilities of de novo metastatic breast cancer, individual outcome prediction is challenging. The aim of the study is to identify existing prognostic models for patients with de novo metastatic breast cancer and validate them in Asia. MATERIALS AND METHODS: We performed a systematic review to identify prediction models for metastatic breast cancer. Models were validated in 642 women with de novo metastatic breast cancer registered between 2000 and 2010 in the Singapore Malaysia Hospital Based Breast Cancer Registry. Survival curves for low, intermediate and high-risk groups according to each prognostic score were compared by log-rank test, and discrimination of the models was assessed by the concordance statistic (C-statistic). RESULTS: We identified 16 prediction models, seven of which were for patients with brain metastases only. Performance status, estrogen receptor status, metastatic site(s) and disease-free interval were the most common predictors. We were able to validate nine prediction models. The capacity of the models to discriminate between poor and good survivors varied from poor to fair, with C-statistics ranging from 0.50 (95% CI, 0.48-0.53) to 0.63 (95% CI, 0.60-0.66). CONCLUSION: The discriminatory performance of existing prediction models for de novo metastatic breast cancer in Asia is modest. Development of an Asian-specific prediction model is needed to improve prognostication and guide decision making.

  8. Generative Recurrent Networks for De Novo Drug Design.

    Science.gov (United States)

    Gupta, Anvita; Müller, Alex T; Huisman, Berend J H; Fuchs, Jens A; Schneider, Petra; Schneider, Gisbert

    2018-01-01

    Generative artificial intelligence models present a fresh approach to chemogenomics and de novo drug design, as they provide researchers with the ability to narrow down their search of the chemical space and focus on regions of interest. We present a method for molecular de novo design that utilizes generative recurrent neural networks (RNN) containing long short-term memory (LSTM) cells. This computational model captured the syntax of molecular representation in terms of SMILES strings with close to perfect accuracy. The learned pattern probabilities can be used for de novo SMILES generation. This molecular design concept eliminates the need for virtual compound library enumeration. By employing transfer learning, we fine-tuned the RNN's predictions for specific molecular targets. This approach enables virtual compound design without requiring secondary or external activity prediction, which could introduce error or unwanted bias. The results obtained advocate this generative RNN-LSTM system for high-impact use cases, such as low-data drug discovery, fragment based molecular design, and hit-to-lead optimization for diverse drug targets. © 2017 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.
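
    The architecture described above (a character-level recurrent network with LSTM cells over SMILES strings) can be conveyed with a short, untrained PyTorch sketch: an embedding, an LSTM, and a softmax over the next character, sampled autoregressively. The vocabulary, layer sizes, start/end tokens and the lack of training are simplifications; the paper's tokenization and transfer-learning procedure are not reproduced here.

```python
import torch
import torch.nn as nn

# toy character vocabulary; "^" marks start-of-string, "$" marks end-of-string
CHARS = sorted(set("CcNnOoSsFBrIl()=#123456789[]+-@/\\^$"))
IDX = {ch: i for i, ch in enumerate(CHARS)}

class SmilesLSTM(nn.Module):
    def __init__(self, vocab, emb=64, hidden=256, layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab, emb)
        self.lstm = nn.LSTM(emb, hidden, layers, batch_first=True)
        self.out = nn.Linear(hidden, vocab)

    def forward(self, x, state=None):
        h, state = self.lstm(self.embed(x), state)
        return self.out(h), state

def sample(model, max_len=80):
    """Sample one string character by character (untrained, so output is random)."""
    model.eval()
    token = torch.tensor([[IDX["^"]]])
    state, out = None, []
    with torch.no_grad():
        for _ in range(max_len):
            logits, state = model(token, state)
            probs = torch.softmax(logits[0, -1], dim=-1)
            nxt = torch.multinomial(probs, 1).item()
            if CHARS[nxt] == "$":
                break
            out.append(CHARS[nxt])
            token = torch.tensor([[nxt]])
    return "".join(out)

model = SmilesLSTM(len(CHARS))
print(sample(model))
```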

  9. Extreme-Scale De Novo Genome Assembly

    Energy Technology Data Exchange (ETDEWEB)

    Georganas, Evangelos [Intel Corporation, Santa Clara, CA (United States); Hofmeyr, Steven [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Joint Genome Inst.; Egan, Rob [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Computational Research Division; Buluc, Aydin [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Joint Genome Inst.; Oliker, Leonid [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Joint Genome Inst.; Rokhsar, Daniel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Computational Research Division; Yelick, Katherine [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Joint Genome Inst.

    2017-09-26

    De novo whole genome assembly reconstructs genomic sequence from short, overlapping, and potentially erroneous DNA segments and is one of the most important computations in modern genomics. This work presents HipMer, a high-quality end-to-end de novo assembler designed for extreme-scale analysis, via efficient parallelization of the Meraculous code. Genome assembly software has many stages, each of which stresses different components of a computer system. This chapter explains the computational challenges involved in each step of the HipMer pipeline, the key distributed data structures, and communication costs in detail. We present performance results of assembling the human genome and the large hexaploid wheat genome on large supercomputers up to tens of thousands of cores.
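
    HipMer's distributed hash tables and one-sided communication are beyond a short sketch, but the k-mer analysis stage that dominates the first phase of Meraculous-style assembly is easy to show on a single node: count the k-mers of all reads, then keep the "solid" ones seen often enough to be trusted.

```python
from collections import Counter

def count_kmers(reads, k=21):
    """Count k-mers across all reads; in HipMer this table is a distributed
    hash table partitioned across many nodes (single-node toy version here)."""
    counts = Counter()
    for read in reads:
        for i in range(len(read) - k + 1):
            counts[read[i:i + k]] += 1
    return counts

def solid_kmers(counts, min_count=2):
    """Keep k-mers seen at least min_count times to filter out sequencing errors."""
    return {kmer for kmer, c in counts.items() if c >= min_count}

reads = ["ACGTACGTACGTACGTACGTACGTA", "CGTACGTACGTACGTACGTACGTAC"]
counts = count_kmers(reads, k=21)
print(len(solid_kmers(counts, min_count=1)))
```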

  10. De novo prediction of human chromosome structures: Epigenetic marking patterns encode genome architecture.

    Science.gov (United States)

    Di Pierro, Michele; Cheng, Ryan R; Lieberman Aiden, Erez; Wolynes, Peter G; Onuchic, José N

    2017-11-14

    Inside the cell nucleus, genomes fold into organized structures that are characteristic of cell type. Here, we show that this chromatin architecture can be predicted de novo using epigenetic data derived from chromatin immunoprecipitation-sequencing (ChIP-Seq). We exploit the idea that chromosomes encode a 1D sequence of chromatin structural types. Interactions between these chromatin types determine the 3D structural ensemble of chromosomes through a process similar to phase separation. First, a neural network is used to infer the relation between the epigenetic marks present at a locus, as assayed by ChIP-Seq, and the genomic compartment in which those loci reside, as measured by DNA-DNA proximity ligation (Hi-C). Next, types inferred from this neural network are used as an input to an energy landscape model for chromatin organization [Minimal Chromatin Model (MiChroM)] to generate an ensemble of 3D chromosome conformations at a resolution of 50 kilobases (kb). After training the model, dubbed Maximum Entropy Genomic Annotation from Biomarkers Associated to Structural Ensembles (MEGABASE), on odd-numbered chromosomes, we predict the sequences of chromatin types and the subsequent 3D conformational ensembles for the even chromosomes. We validate these structural ensembles by using ChIP-Seq tracks alone to predict Hi-C maps, as well as distances measured using 3D fluorescence in situ hybridization (FISH) experiments. Both sets of experiments support the hypothesis of phase separation being the driving process behind compartmentalization. These findings strongly suggest that epigenetic marking patterns encode sufficient information to determine the global architecture of chromosomes and that de novo structure prediction for whole genomes may be increasingly possible. Copyright © 2017 the Author(s). Published by PNAS.
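
    The first step described above, learning a mapping from ChIP-Seq signals at a locus to a discrete chromatin structural type, can be sketched with an off-the-shelf classifier. The scikit-learn network below runs on synthetic data standing in for real tracks; the actual MEGABASE network, its training protocol, and the downstream MiChroM simulations are not reproduced.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in: each row = ChIP-Seq signal of several marks at one 50-kb locus,
# each label = the chromatin structural type assigned to that locus.
rng = np.random.default_rng(0)
n_loci, n_marks, n_types = 2000, 11, 6
X = rng.lognormal(size=(n_loci, n_marks))
y = rng.integers(0, n_types, size=n_loci)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
clf.fit(X_train, y_train)

# The predicted chromatin-type sequence would then feed the MiChroM energy landscape model.
print("held-out accuracy (random labels, so about 1/6):", clf.score(X_test, y_test))
```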

  11. Use of transient elastography to predict de novo recurrence after radiofrequency ablation for hepatocellular carcinoma.

    Science.gov (United States)

    Lee, Sang Hoon; Kim, Seung Up; Jang, Jeong Won; Bae, Si Hyun; Lee, Sanghun; Kim, Beom Kyung; Park, Jun Yong; Kim, Do Young; Ahn, Sang Hoon; Han, Kwang-Hyub

    2015-01-01

    Liver stiffness (LS) measurement using transient elastography can accurately assess the degree of liver fibrosis, which is associated with the risk of the development of hepatocellular carcinoma (HCC). We investigated whether LS values could predict HCC de novo recurrence after radiofrequency ablation (RFA). This retrospective, multicenter study analyzed 111 patients with HCC who underwent RFA and LS measurement using transient elastography between May 2005 and April 2011. All patients were followed until March 2013 to monitor for HCC recurrence. This study included 76 men and 35 women with a mean age of 62.4 years, and the mean LS value was 21.2 kPa. During the follow-up period (median 22.4 months), 47 (42.3%) patients experienced HCC de novo recurrence, and 18 (16.2%) died. Patients with recurrence had significantly more frequent liver cirrhosis, more frequent history of previous treatment for HCC, higher total bilirubin, larger spleen size, larger total tumor size, higher tumor number, higher LS values, and lower platelet counts than those without recurrence (all P<0.05). On multivariate analysis, together with previous anti-HCC treatment history, patients with LS values >13.0 kPa were at significantly greater risk for recurrence after RFA, with a hazard ratio (HR) of 3.115 (95% confidence interval [CI], 1.238-7.842, P<0.05). Moreover, LS values independently predicted mortality after RFA (HR 9.834; 95% CI, 1.148-84.211; P<0.05), together with total bilirubin. LS measurement is a useful predictor of HCC de novo recurrence and overall survival after RFA.

  12. De novo structural modeling and computational sequence analysis ...

    African Journals Online (AJOL)

    Different bioinformatics tools and machine learning techniques were used for protein structural classification. De novo protein modeling was performed by using the I-TASSER server. The final model obtained was assessed by PROCHECK and DFIRE2, which confirmed that the final model is reliable. Until complete biochemical ...

  13. Pushing the size limit of de novo structure ensemble prediction guided by sparse SDSL-EPR restraints to 200 residues: The monomeric and homodimeric forms of BAX

    Science.gov (United States)

    Fischer, Axel W.; Bordignon, Enrica; Bleicken, Stephanie; García-Sáez, Ana J.; Jeschke, Gunnar; Meiler, Jens

    2016-01-01

    Structure determination remains a challenge for many biologically important proteins. In particular, proteins that adopt multiple conformations often evade crystallization in all biologically relevant states. Although computational de novo protein folding approaches often sample biologically relevant conformations, the selection of the most accurate model for different functional states remains a formidable challenge, in particular, for proteins with more than about 150 residues. Electron paramagnetic resonance (EPR) spectroscopy can obtain limited structural information for proteins in well-defined biological states and thereby assist in selecting biologically relevant conformations. The present study demonstrates that de novo folding methods are able to accurately sample the folds of 192-residue long soluble monomeric Bcl-2-associated X protein (BAX). The tertiary structures of the monomeric and homodimeric forms of BAX were predicted using the primary structure as well as 25 and 11 EPR distance restraints, respectively. The predicted models were subsequently compared to respective NMR/X-ray structures of BAX. EPR restraints improve the protein-size normalized root-mean-square-deviation (RMSD100) of the most accurate models with respect to the NMR/crystal structure from 5.9 Å to 3.9 Å and from 5.7 Å to 3.3 Å, respectively. Additionally, the model discrimination is improved, which is demonstrated by an improvement of the enrichment from 5% to 15% and from 13% to 21%, respectively. PMID:27129417
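
    RMSD100 is the protein-size-normalized RMSD quoted in the abstract. One widely used normalization, due to Carugo and Pongor, scales the observed RMSD to the value expected for a 100-residue protein; the sketch below assumes this is the normalization meant here.

```python
import math

def rmsd100(rmsd, n_residues):
    """Protein-size-normalized RMSD (Carugo & Pongor): scales an observed RMSD
    to the value expected for a 100-residue protein, so models of different
    lengths can be compared on one scale."""
    return rmsd / (1.0 + math.log(math.sqrt(n_residues / 100.0)))

# illustrative numbers only: a raw RMSD of 5.2 A over 192 residues
# normalizes to roughly 3.9 A RMSD100
print(round(rmsd100(5.2, 192), 2))
```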

  14. UniNovo: a universal tool for de novo peptide sequencing.

    Science.gov (United States)

    Jeong, Kyowon; Kim, Sangtae; Pevzner, Pavel A

    2013-08-15

    Mass spectrometry (MS) instruments and experimental protocols are rapidly advancing, but de novo peptide sequencing algorithms to analyze tandem mass (MS/MS) spectra are lagging behind. Although existing de novo sequencing tools perform well on certain types of spectra [e.g. Collision Induced Dissociation (CID) spectra of tryptic peptides], their performance often deteriorates on other types of spectra, such as Electron Transfer Dissociation (ETD), Higher-energy Collisional Dissociation (HCD) spectra or spectra of non-tryptic digests. Thus, rather than developing a new algorithm for each type of spectra, we develop a universal de novo sequencing algorithm called UniNovo that works well for all types of spectra or even for spectral pairs (e.g. CID/ETD spectral pairs). UniNovo uses an improved scoring function that captures the dependencies between different ion types, where such dependencies are learned automatically using a modified offset frequency function. The performance of UniNovo is compared with PepNovo+, PEAKS and pNovo using various types of spectra. The results show that the performance of UniNovo is superior to other tools for ETD spectra and superior or comparable with others for CID and HCD spectra. UniNovo also estimates the probability that each reported reconstruction is correct, using simple statistics that are readily obtained from a small training dataset. We demonstrate that the estimation is accurate for all tested types of spectra (including CID, HCD, ETD, CID/ETD and HCD/ETD spectra of trypsin, LysC or AspN digested peptides). UniNovo is implemented in Java and tested on Windows, Ubuntu and OS X machines. UniNovo is available at http://proteomics.ucsd.edu/Software/UniNovo.html along with the manual.

  15. Modeling fructose-load-induced hepatic de-novo lipogenesis by model simplification

    Directory of Open Access Journals (Sweden)

    Richard J Allen

    2017-03-01

    Full Text Available Hepatic de-novo lipogenesis is a metabolic process implicated in the pathogenesis of type 2 diabetes. Clinically, the rate of this process can be ascertained by use of labeled acetate and stimulation by fructose administration. A systems pharmacology model of this process is desirable because it facilitates the description, analysis, and prediction of this experiment. Due to the multiple enzymes involved in de-novo lipogenesis, and the limited data, it is desirable to use single functional expressions to encapsulate the flux between multiple enzymes. To accomplish this we developed a novel simplification technique which uses the available information about the properties of the individual enzymes to bound the parameters of a single governing ‘transfer function’. This method should be applicable to any model with linear chains of enzymes that are well stimulated. We validated this approach with computational simulations and analytical justification in a limiting case. Using this technique we generated a simple model of hepatic de-novo lipogenesis in these experimental conditions that matched prior data. This model can be used to assess pharmacological intervention at specific points on this pathway. We have demonstrated this with prospective simulation of acetyl-CoA carboxylase inhibition. This simplification technique suggests how the constituent properties of an enzymatic chain of reactions give rise to the sensitivity (to substrate) of the pathway as a whole.

  16. Modeling fructose-load-induced hepatic de-novo lipogenesis by model simplification.

    Science.gov (United States)

    Allen, Richard J; Musante, Cynthia J

    2017-01-01

    Hepatic de-novo lipogenesis is a metabolic process implicated in the pathogenesis of type 2 diabetes. Clinically, the rate of this process can be ascertained by use of labeled acetate and stimulation by fructose administration. A systems pharmacology model of this process is desirable because it facilitates the description, analysis, and prediction of this experiment. Due to the multiple enzymes involved in de-novo lipogenesis, and the limited data, it is desirable to use single functional expressions to encapsulate the flux between multiple enzymes. To accomplish this we developed a novel simplification technique which uses the available information about the properties of the individual enzymes to bound the parameters of a single governing 'transfer function'. This method should be applicable to any model with linear chains of enzymes that are well stimulated. We validated this approach with computational simulations and analytical justification in a limiting case. Using this technique we generated a simple model of hepatic de-novo lipogenesis in these experimental conditions that matched prior data. This model can be used to assess pharmacological intervention at specific points on this pathway. We have demonstrated this with prospective simulation of acetyl-CoA carboxylase inhibition. This simplification technique suggests how the constituent properties of an enzymatic chain of reactions give rise to the sensitivity (to substrate) of the pathway as a whole.
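
    The "single governing transfer function" idea can be illustrated with a toy ODE model (not the authors' model): a linear chain of Michaelis-Menten steps whose first, rate-limiting enzyme ends up setting the effective Vmax and Km of the whole pathway, so the chain's steady-state flux is reproduced by one saturating function. All parameter values below are invented for illustration.

```python
from scipy.integrate import solve_ivp

# Toy chain: clamped substrate S0 -> S1 -> S2 -> product, each step Michaelis-Menten.
# The first enzyme is rate-limiting, so it bounds the effective pathway parameters.
VMAX = [6.0, 8.0, 12.0]
KM = [2.0, 1.0, 3.0]

def chain(t, s, s0):
    v0 = VMAX[0] * s0 / (KM[0] + s0)
    v1 = VMAX[1] * s[0] / (KM[1] + s[0])
    v2 = VMAX[2] * s[1] / (KM[2] + s[1])
    return [v0 - v1, v1 - v2]              # d[S1]/dt, d[S2]/dt

def pathway_flux(s0):
    """Near-steady-state flux out of the chain for a clamped substrate load s0."""
    sol = solve_ivp(lambda t, s: chain(t, s, s0), (0, 200), [0.0, 0.0], t_eval=[200])
    s2 = sol.y[1, -1]
    return VMAX[2] * s2 / (KM[2] + s2)

def transfer_function(s0, vmax_eff=6.0, km_eff=2.0):
    """Single saturating 'transfer function' standing in for the whole chain."""
    return vmax_eff * s0 / (km_eff + s0)

for load in (0.5, 2.0, 10.0):
    print(load, round(pathway_flux(load), 3), round(transfer_function(load), 3))
```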

  17. Use of transient elastography to predict de novo recurrence after radiofrequency ablation for hepatocellular carcinoma

    Directory of Open Access Journals (Sweden)

    Lee SH

    2015-02-01

    Full Text Available Sang Hoon Lee,1 Seung Up Kim,1–3 Jeong Won Jang,4 Si Hyun Bae,4 Sanghun Lee,1,3 Beom Kyung Kim,1–3 Jun Yong Park,1–3 Do Young Kim,1–3 Sang Hoon Ahn,1–3 Kwang-Hyub Han1–3 (1Department of Internal Medicine, 2Institute of Gastroenterology, Yonsei University College of Medicine, 3Liver Cirrhosis Clinical Research Center, 4Department of Internal Medicine, College of Medicine, Catholic University of Korea, Seoul, Korea). Background/purpose: Liver stiffness (LS) measurement using transient elastography can accurately assess the degree of liver fibrosis, which is associated with the risk of the development of hepatocellular carcinoma (HCC). We investigated whether LS values could predict HCC de novo recurrence after radiofrequency ablation (RFA). Methods: This retrospective, multicenter study analyzed 111 patients with HCC who underwent RFA and LS measurement using transient elastography between May 2005 and April 2011. All patients were followed until March 2013 to monitor for HCC recurrence. Results: This study included 76 men and 35 women with a mean age of 62.4 years, and the mean LS value was 21.2 kPa. During the follow-up period (median 22.4 months), 47 (42.3%) patients experienced HCC de novo recurrence, and 18 (16.2%) died. Patients with recurrence had significantly more frequent liver cirrhosis, more frequent history of previous treatment for HCC, higher total bilirubin, larger spleen size, larger total tumor size, higher tumor number, higher LS values, and lower platelet counts than those without recurrence (all P<0.05). On multivariate analysis, together with previous anti-HCC treatment history, patients with LS values >13.0 kPa were at significantly greater risk for recurrence after RFA, with a hazard ratio (HR) of 3.115 (95% confidence interval [CI], 1.238–7.842, P<0.05). Moreover, LS values independently predicted the mortality after RFA, with a HR of 9.834 (95% CI, 1.148–84.211, P<0.05), together with total bilirubin. Conclusions: Our

  18. A computational model predicting disruption of blood vessel development.

    Directory of Open Access Journals (Sweden)

    Nicole Kleinstreuer

    2013-04-01

    Full Text Available Vascular development is a complex process regulated by dynamic biological networks that vary in topology and state across different tissues and developmental stages. Signals regulating de novo blood vessel formation (vasculogenesis) and remodeling (angiogenesis) come from a variety of biological pathways linked to endothelial cell (EC) behavior, extracellular matrix (ECM) remodeling and the local generation of chemokines and growth factors. Simulating these interactions at a systems level requires sufficient biological detail about the relevant molecular pathways and associated cellular behaviors, and tractable computational models that offset mathematical and biological complexity. Here, we describe a novel multicellular agent-based model of vasculogenesis using the CompuCell3D (http://www.compucell3d.org/) modeling environment supplemented with semi-automatic knowledgebase creation. The model incorporates vascular endothelial growth factor signals, pro- and anti-angiogenic inflammatory chemokine signals, and the plasminogen activating system of enzymes and proteases linked to ECM interactions, to simulate nascent EC organization, growth and remodeling. The model was shown to recapitulate stereotypical capillary plexus formation and structural emergence of non-coded cellular behaviors, such as a heterologous bridging phenomenon linking endothelial tip cells together during formation of polygonal endothelial cords. Molecular targets in the computational model were mapped to signatures of vascular disruption derived from in vitro chemical profiling using the EPA's ToxCast high-throughput screening (HTS) dataset. Simulating the HTS data with the cell-agent based model of vascular development predicted adverse effects of a reference anti-angiogenic thalidomide analog, 5HPP-33, on in vitro angiogenesis with respect to both concentration-response and morphological consequences. These findings support the utility of cell agent-based models for simulating a

  19. Precise detection of de novo single nucleotide variants in human genomes.

    Science.gov (United States)

    Gómez-Romero, Laura; Palacios-Flores, Kim; Reyes, José; García, Delfino; Boege, Margareta; Dávila, Guillermo; Flores, Margarita; Schatz, Michael C; Palacios, Rafael

    2018-05-07

    The precise determination of de novo genetic variants has enormous implications across different fields of biology and medicine, particularly personalized medicine. Currently, de novo variations are identified by mapping sample reads from a parent-offspring trio to a reference genome, allowing for a certain degree of differences. While widely used, this approach often introduces false-positive (FP) results due to misaligned reads and mischaracterized sequencing errors. In a previous study, we developed an alternative approach to accurately identify single nucleotide variants (SNVs) using only perfect matches. However, this approach could be applied only to haploid regions of the genome and was computationally intensive. In this study, we present a unique approach, coverage-based single nucleotide variant identification (COBASI), which allows the exploration of the entire genome using second-generation short sequence reads without extensive computing requirements. COBASI identifies SNVs using changes in coverage of exactly matching unique substrings, and is particularly suited for pinpointing de novo SNVs. Unlike other approaches that require population frequencies across hundreds of samples to filter out any methodological biases, COBASI can be applied to detect de novo SNVs within isolated families. We demonstrate this capability through extensive simulation studies and by studying a parent-offspring trio we sequenced using short reads. Experimental validation of all 58 candidate de novo SNVs and a selection of non-de novo SNVs found in the trio confirmed zero FP calls. COBASI is available as open source at https://github.com/Laura-Gomez/COBASI for any researcher to use. Copyright © 2018 the Author(s). Published by PNAS.
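
    The core COBASI signal, a collapse in the coverage of exactly matching, reference-unique k-mers around a variant position, can be sketched as follows. Uniqueness checking, the trio logic and the variation-landscape bookkeeping of the real tool are omitted, and the helper names are hypothetical.

```python
from collections import Counter

def kmer_coverage_track(reference, reads, k=15):
    """For each reference position, count how many read k-mers exactly match the
    reference k-mer starting there (COBASI restricts this to k-mers that are
    unique in the reference; uniqueness checking is omitted in this toy)."""
    read_kmers = Counter()
    for read in reads:
        for i in range(len(read) - k + 1):
            read_kmers[read[i:i + k]] += 1
    return [read_kmers[reference[i:i + k]] for i in range(len(reference) - k + 1)]

def candidate_snv_sites(coverage, min_cov=1):
    """Positions where exact-match coverage collapses are candidate variant sites."""
    return [i for i, c in enumerate(coverage) if c < min_cov]

ref = "ACGTTGCAAGCTTGGCATGCAAGGTCCATTGCA"
alt = ref[:16] + "C" + ref[17:]          # toy haplotype carrying one substitution
print(candidate_snv_sites(kmer_coverage_track(ref, [alt, alt], k=15)))
# -> reference k-mers overlapping the variant (positions 2..16) find no exact match
```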

  20. De Novo Discovery of Structured ncRNA Motifs in Genomic Sequences

    DEFF Research Database (Denmark)

    Ruzzo, Walter L; Gorodkin, Jan

    2014-01-01

    De novo discovery of "motifs" capturing the commonalities among related structured noncoding RNAs (ncRNAs) is among the most difficult problems in computational biology. This chapter outlines the challenges presented by this problem, together with some approaches towards solving them, with an emphasis on an approach based on the CMfinder program as a case study. Applications to genomic screens for novel de novo structured ncRNAs, including structured RNA elements in untranslated portions of protein-coding genes, are presented.

  1. Genome-wide prediction models that incorporate de novo GWAS are a powerful new tool for tropical rice improvement

    Science.gov (United States)

    Spindel, J E; Begum, H; Akdemir, D; Collard, B; Redoña, E; Jannink, J-L; McCouch, S

    2016-01-01

    To address the multiple challenges to food security posed by global climate change, population growth and rising incomes, plant breeders are developing new crop varieties that can enhance both agricultural productivity and environmental sustainability. Current breeding practices, however, are unable to keep pace with demand. Genomic selection (GS) is a new technique that helps accelerate the rate of genetic gain in breeding by using whole-genome data to predict the breeding value of offspring. Here, we describe a new GS model that combines RR-BLUP with markers fit as fixed effects selected from the results of a genome-wide-association study (GWAS) on the RR-BLUP training data. We term this model GS + de novo GWAS. In a breeding population of tropical rice, GS + de novo GWAS outperformed six other models for a variety of traits and in multiple environments. On the basis of these results, we propose an extended, two-part breeding design that can be used to efficiently integrate novel variation into elite breeding populations, thus expanding genetic diversity and enhancing the potential for sustainable productivity gains. PMID:26860200
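
    The GS + de novo GWAS model fitting in the paper uses standard RR-BLUP software; the numpy sketch below only shows the structure of the combined model: GWAS-selected markers enter as fixed (unshrunken) effects while all remaining markers receive ridge-shrunken random effects, with the shrinkage parameter treated as known for simplicity. Function and variable names are illustrative, not the authors'.

```python
import numpy as np

def gs_plus_gwas_predict(Z_train, y_train, Z_test, fixed_idx, lam=10.0):
    """RR-BLUP-style prediction with selected markers as fixed effects.
    Z_*: marker matrices (individuals x markers); fixed_idx: columns chosen by a
    de novo GWAS on the training data; lam: ridge penalty on the remaining markers."""
    X_tr, X_te = Z_train[:, fixed_idx], Z_test[:, fixed_idx]        # fixed effects
    rand_idx = [j for j in range(Z_train.shape[1]) if j not in set(fixed_idx)]
    W_tr, W_te = Z_train[:, rand_idx], Z_test[:, rand_idx]          # random effects

    # Joint penalized least squares: only the random-marker block is shrunk.
    A = np.hstack([X_tr, W_tr])
    penalty = np.diag([0.0] * X_tr.shape[1] + [lam] * W_tr.shape[1])
    beta = np.linalg.solve(A.T @ A + penalty, A.T @ y_train)
    return np.hstack([X_te, W_te]) @ beta

rng = np.random.default_rng(1)
Z = rng.integers(0, 3, size=(200, 500)).astype(float)      # toy genotypes coded 0/1/2
effects = np.zeros(500)
effects[[10, 42]] = 1.5                                     # two large-effect loci
y = Z @ effects + rng.normal(scale=1.0, size=200)
pred = gs_plus_gwas_predict(Z[:150], y[:150], Z[150:], fixed_idx=[10, 42])
print(np.corrcoef(pred, y[150:])[0, 1])
```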

  2. iScreen: world's first cloud-computing web server for virtual screening and de novo drug design based on TCM database@Taiwan.

    Science.gov (United States)

    Tsai, Tsung-Ying; Chang, Kai-Wei; Chen, Calvin Yu-Chian

    2011-06-01

    Rapidly advancing research on traditional Chinese medicine (TCM) has greatly intrigued pharmaceutical industries worldwide. To take the initiative in the next generation of drug development, we constructed a cloud-computing system for TCM intelligent screening (iScreen) based on TCM Database@Taiwan. iScreen is a compact web server for TCM docking followed by customized de novo drug design. We further implemented a protein preparation tool that both extracts the protein of interest from a raw input file and estimates the size of the ligand binding site. In addition, iScreen is designed with a user-friendly graphical interface for users who have less experience with command-line systems. For customized docking, multiple docking services, including standard, in-water, pH environment, and flexible docking modes, are implemented. Users can download the first 200 TCM compounds of the best docking results. For TCM de novo drug design, iScreen provides multiple molecular descriptors for a user's interest. iScreen is the world's first web server that employs the world's largest TCM database for virtual screening and de novo drug design. We believe our web server can lead TCM research to a new era of drug development. The TCM docking and screening server is available at http://iScreen.cmu.edu.tw/.

  4. De-novo discovery of differentially abundant transcription factor binding sites including their positional preference.

    Science.gov (United States)

    Keilwagen, Jens; Grau, Jan; Paponov, Ivan A; Posch, Stefan; Strickert, Marc; Grosse, Ivo

    2011-02-10

    Transcription factors are a main component of gene regulation as they activate or repress gene expression by binding to specific binding sites in promoters. The de-novo discovery of transcription factor binding sites in target regions obtained by wet-lab experiments is a challenging problem in computational biology, which has not been fully solved yet. Here, we present a de-novo motif discovery tool called Dispom for finding differentially abundant transcription factor binding sites that models existing positional preferences of binding sites and adjusts the length of the motif in the learning process. Evaluating Dispom, we find that its prediction performance is superior to existing tools for de-novo motif discovery for 18 benchmark data sets with planted binding sites, and for a metazoan compendium based on experimental data from microarray, ChIP-chip, ChIP-DSL, and DamID as well as Gene Ontology data. Finally, we apply Dispom to find binding sites differentially abundant in promoters of auxin-responsive genes extracted from Arabidopsis thaliana microarray data, and we find a motif that can be interpreted as a refined auxin-responsive element predominantly positioned in the 250-bp region upstream of the transcription start site. Using an independent data set of auxin-responsive genes, we find in genome-wide predictions that the refined motif is more specific for auxin-responsive genes than the canonical auxin-responsive element. In general, Dispom can be used to find differentially abundant motifs in sequences of any origin. However, the positional distribution learned by Dispom is especially beneficial if all sequences are aligned to some anchor point like the transcription start site in case of promoter sequences. We demonstrate that the combination of searching for differentially abundant motifs and inferring a position distribution from the data is beneficial for de-novo motif discovery. Hence, we make the tool freely available as a component of the open

  5. Application of Generative Autoencoder in De Novo Molecular Design.

    Science.gov (United States)

    Blaschke, Thomas; Olivecrona, Marcus; Engkvist, Ola; Bajorath, Jürgen; Chen, Hongming

    2018-01-01

    A major challenge in computational chemistry is the generation of novel molecular structures with desirable pharmacological and physiochemical properties. In this work, we investigate the potential use of autoencoders, a deep learning methodology, for de novo molecular design. Various generative autoencoders were used to map molecule structures into a continuous latent space and vice versa, and their performance as structure generators was assessed. Our results show that the latent space preserves the chemical similarity principle and thus can be used for the generation of analogue structures. Furthermore, the latent space created by the autoencoders was searched systematically to generate novel compounds with predicted activity against dopamine receptor type 2, and compounds similar to known active compounds not included in the training set were identified. © 2018 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  6. DNApi: A De Novo Adapter Prediction Algorithm for Small RNA Sequencing Data.

    Science.gov (United States)

    Tsuji, Junko; Weng, Zhiping

    2016-01-01

    With the rapid accumulation of publicly available small RNA sequencing datasets, third-party meta-analysis across many datasets is becoming increasingly powerful. Although removing the 3' adapter is an essential step for small RNA sequencing analysis, the adapter sequence information is not always available in the metadata. The information can also be erroneous even when it is available. In this study, we developed DNApi, a lightweight Python software package that predicts the 3' adapter sequence de novo and provides the user with cleansed small RNA sequences ready for downstream analysis. Tested on 539 publicly available small RNA libraries accompanied with 3' adapter sequences in their metadata, DNApi shows near-perfect accuracy (98.5%) with fast runtime (~2.85 seconds per library) and efficient memory usage (~43 MB on average). In addition to 3' adapter prediction, it is also important to classify whether the input small RNA libraries were already processed, i.e. the 3' adapters were removed. Given another batch of datasets, DNApi correctly judged that all 192 publicly available processed libraries were "ready-to-map" small RNA sequences. DNApi is compatible with Python 2 and 3, and is available at https://github.com/jnktsj/DNApi. The 731 small RNA libraries used for DNApi evaluation were from human tissues and were carefully and manually collected. This study also provides readers with the curated datasets that can be integrated into their studies.
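
    DNApi's actual algorithm iterates over several k-mer lengths and filtering ratios; the sketch below only conveys the underlying intuition (in an untrimmed small RNA library every read ends in the same adapter, so adapter k-mers dominate the 3' portions of the reads) plus a crude "ready-to-map" length heuristic. Both helpers are hypothetical simplifications, not DNApi's code.

```python
from collections import Counter

def predict_adapter_seed(reads, k=12, tail_start=27):
    """Guess a k-mer from the 3' adapter in an untrimmed small RNA library.
    Because every untrimmed read ends in the same adapter, adapter k-mers
    dominate the read tails; the most frequent tail k-mer is returned as a seed.
    (DNApi refines this idea over several k values and filtering thresholds.)"""
    counts = Counter()
    for read in reads:
        tail = read[tail_start:]
        for i in range(len(tail) - k + 1):
            counts[tail[i:i + k]] += 1
    if not counts:
        return None
    seed, _ = counts.most_common(1)[0]
    return seed

def looks_already_trimmed(reads, max_len=30, frac=0.9):
    """Heuristic 'ready-to-map' check: adapter-trimmed small RNA libraries
    consist mostly of short (under ~30 nt) reads."""
    return sum(len(r) <= max_len for r in reads) / len(reads) >= frac
```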

  7. Identification of a novel Plasmopara halstedii elicitor protein combining de novo peptide sequencing algorithms and RACE-PCR

    Directory of Open Access Journals (Sweden)

    Madlung Johannes

    2010-05-01

    Full Text Available Abstract. Background: Often, high-quality MS/MS spectra of tryptic peptides do not match any database entry because of only partially sequenced genomes, and therefore protein identification requires de novo peptide sequencing. To achieve protein identification of the economically important but still unsequenced plant pathogenic oomycete Plasmopara halstedii, we first evaluated the performance of three different de novo peptide sequencing algorithms applied to protein digests of standard proteins using a quadrupole TOF (QStar Pulsar i). Results: The performance order of the algorithms was PEAKS online > PepNovo > CompNovo. In summary, PEAKS online correctly predicted 45% of measured peptides for a protein test data set. All three de novo peptide sequencing algorithms were used to identify MS/MS spectra of tryptic peptides of an unknown 57 kDa protein of P. halstedii. We found ten de novo sequenced peptides that showed homology to a Phytophthora infestans protein, a closely related organism of P. halstedii. Employing a second complementary approach, verification of peptide prediction and protein identification was performed by creation of degenerate primers for RACE-PCR and led to an ORF of 1,589 bp for a hypothetical phosphoenolpyruvate carboxykinase. Conclusions: Our study demonstrated that identification of proteins within minute amounts of sample material improved significantly by combining sensitive LC-MS methods with different de novo peptide sequencing algorithms. In addition, this is the first study that verified protein prediction from MS data by also employing a second complementary approach, in which RACE-PCR led to identification of a novel elicitor protein in P. halstedii.

  8. Automated de novo phasing and model building of coiled-coil proteins.

    Science.gov (United States)

    Rämisch, Sebastian; Lizatović, Robert; André, Ingemar

    2015-03-01

    Models generated by de novo structure prediction can be very useful starting points for molecular replacement for systems where suitable structural homologues cannot be readily identified. Protein-protein complexes and de novo-designed proteins are examples of systems that can be challenging to phase. In this study, the potential of de novo models of protein complexes for use as starting points for molecular replacement is investigated. The approach is demonstrated using homomeric coiled-coil proteins, which are excellent model systems for oligomeric systems. Despite the stereotypical fold of coiled coils, initial phase estimation can be difficult and many structures have to be solved with experimental phasing. A method was developed for automatic structure determination of homomeric coiled coils from X-ray diffraction data. In a benchmark set of 24 coiled coils, ranging from dimers to pentamers with resolutions down to 2.5 Å, 22 systems were automatically solved, 11 of which had previously been solved by experimental phasing. The generated models contained 71-103% of the residues present in the deposited structures, had the correct sequence and had free R values that deviated on average by 0.01 from those of the respective reference structures. The electron-density maps were of sufficient quality that only minor manual editing was necessary to produce final structures. The method, named CCsolve, combines methods for de novo structure prediction, initial phase estimation and automated model building into one pipeline. CCsolve is robust against errors in the initial models and can readily be modified to make use of alternative crystallographic software. The results demonstrate the feasibility of de novo phasing of protein-protein complexes, an approach that could also be employed for other small systems beyond coiled coils.

  9. DeNovoGUI: an open source graphical user interface for de novo sequencing of tandem mass spectra.

    Science.gov (United States)

    Muth, Thilo; Weilnböck, Lisa; Rapp, Erdmann; Huber, Christian G; Martens, Lennart; Vaudel, Marc; Barsnes, Harald

    2014-02-07

    De novo sequencing is a popular technique in proteomics for identifying peptides from tandem mass spectra without having to rely on a protein sequence database. Despite the strong potential of de novo sequencing algorithms, their adoption threshold remains quite high. We here present a user-friendly and lightweight graphical user interface called DeNovoGUI for running parallelized versions of the freely available de novo sequencing software PepNovo+, greatly simplifying the use of de novo sequencing in proteomics. Our platform-independent software is freely available under the permissible Apache2 open source license. Source code, binaries, and additional documentation are available at http://denovogui.googlecode.com .

  10. Computational predictions of zinc oxide hollow structures

    Science.gov (United States)

    Tuoc, Vu Ngoc; Huan, Tran Doan; Thao, Nguyen Thi

    2018-03-01

    Nanoporous materials are emerging as potential candidates for a wide range of technological applications in environmental, electronic, and optoelectronic fields, to name just a few. Within this active research area, experimental works are predominant, while theoretical/computational prediction and study of these materials face some intrinsic challenges, one of which is how to predict porous structures. We propose a computationally and technically feasible approach for predicting zinc oxide structures with hollows at the nanoscale. The designed zinc oxide hollow structures are studied with computations using the density functional tight binding and conventional density functional theory methods, revealing a variety of promising mechanical and electronic properties, which can potentially find future realistic applications.

  11. LTRsift: a graphical user interface for semi-automatic classification and postprocessing of de novo detected LTR retrotransposons.

    Science.gov (United States)

    Steinbiss, Sascha; Kastens, Sascha; Kurtz, Stefan

    2012-11-07

    Long terminal repeat (LTR) retrotransposons are a class of eukaryotic mobile elements characterized by a distinctive sequence similarity-based structure. Hence they are well suited for computational identification. Current software allows for a comprehensive genome-wide de novo detection of such elements. The obvious next step is the classification of newly detected candidates resulting in (super-)families. Such a de novo classification approach based on sequence-based clustering of transposon features has been proposed before, resulting in a preliminary assignment of candidates to families as a basis for subsequent manual refinement. However, such a classification workflow is typically split across a heterogeneous set of glue scripts and generic software (for example, spreadsheets), making it tedious for a human expert to inspect, curate and export the putative families produced by the workflow. We have developed LTRsift, an interactive graphical software tool for semi-automatic postprocessing of de novo predicted LTR retrotransposon annotations. Its user-friendly interface offers customizable filtering and classification functionality, displaying the putative candidate groups, their members and their internal structure in a hierarchical fashion. To ease manual work, it also supports graphical user interface-driven reassignment, splitting and further annotation of candidates. Export of grouped candidate sets in standard formats is possible. In two case studies, we demonstrate how LTRsift can be employed in the context of a genome-wide LTR retrotransposon survey effort. LTRsift is a useful and convenient tool for semi-automated classification of newly detected LTR retrotransposons based on their internal features. Its efficient implementation allows for convenient and seamless filtering and classification in an integrated environment. Developed for life scientists, it is helpful in postprocessing and refining the output of software for predicting LTR

  12. Seismic activity prediction using computational intelligence techniques in northern Pakistan

    Science.gov (United States)

    Asim, Khawaja M.; Awais, Muhammad; Martínez-Álvarez, F.; Iqbal, Talat

    2017-10-01

    An earthquake prediction study is carried out for the region of northern Pakistan. The prediction methodology combines seismology with computational intelligence. Eight seismic parameters are computed based upon past earthquakes. The predictive ability of these eight seismic parameters is evaluated in terms of information gain, which leads to the selection of six parameters to be used in prediction. Multiple computationally intelligent models have been developed for earthquake prediction using the selected seismic parameters. These models include a feed-forward neural network, recurrent neural network, random forest, multilayer perceptron, radial basis neural network, and support vector machine. The performance of every prediction model is evaluated, and McNemar's statistical test is applied to assess the statistical significance of the computational methodologies. The feed-forward neural network shows statistically significant predictions along with an accuracy of 75% and a positive predictive value of 78% in the context of northern Pakistan.
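
    Information gain, used above to cut the eight seismic parameters down to six, is the reduction in label entropy obtained by splitting on a feature. The sketch below computes it for one numeric feature against a binary earthquake/no-earthquake label, using a median split as the (assumed) discretization rule.

```python
import math
from collections import Counter
from statistics import median

def entropy(labels):
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in Counter(labels).values())

def information_gain(feature_values, labels):
    """Information gain of a numeric feature for a binary label, using a simple
    median split to discretize the feature (the split rule is an assumption here)."""
    cut = median(feature_values)
    low = [l for x, l in zip(feature_values, labels) if x <= cut]
    high = [l for x, l in zip(feature_values, labels) if x > cut]
    weighted = (len(low) * entropy(low) + len(high) * entropy(high)) / len(labels)
    return entropy(labels) - weighted

# toy example: a seismic indicator that separates the two classes cleanly
values = [0.1, 0.2, 0.3, 0.4, 0.8, 0.9, 1.0, 1.1]
labels = [0, 0, 0, 0, 1, 1, 1, 1]
print(round(information_gain(values, labels), 3))  # -> 1.0 for a perfect split
```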

  13. Predictive modeling of liquid-sodium thermal–hydraulics experiments and computations

    International Nuclear Information System (INIS)

    Arslan, Erkan; Cacuci, Dan G.

    2014-01-01

    Highlights: • We applied the predictive modeling method of Cacuci and Ionescu-Bujor (2010). • We assimilated data from sodium flow experiments. • We used computational fluid dynamics simulations of sodium experiments. • The predictive modeling method greatly reduced uncertainties in predicted results. - Abstract: This work applies the predictive modeling procedure formulated by Cacuci and Ionescu-Bujor (2010) to assimilate data from liquid-sodium thermal–hydraulics experiments in order to reduce systematically the uncertainties in the predictions of computational fluid dynamics (CFD) simulations. The predicted CFD-results for the best-estimate model parameters and results describing sodium-flow velocities and temperature distributions are shown to be significantly more precise than the original computations and experiments, in that the predicted uncertainties for the best-estimate results and model parameters are significantly smaller than both the originally computed and the experimental uncertainties

  14. Pesquisa de novos elementos [Research on new elements]

    Directory of Open Access Journals (Sweden)

    Gil Mário de Macedo Grassi

    1978-11-01

    Full Text Available The present study deals with the discovery of new elements synthesized by man. The introduction discusses in general the theories about nuclear transmutation, which is the method employed in these syntheses. The study shows the importance of the Periodic Table, since it is through this table that one can arrive at a prediction of new elements and their properties. The discoveries of the transuranic elements, together with the data of their first preparations, are also tabulated. The stability of these elements is also discussed, and future speculations are presented.

  15. De novo pathway-based biomarker identification

    DEFF Research Database (Denmark)

    Alcaraz, Nicolas; List, Markus; Batra, Richa

    2017-01-01

    Gene expression profiles have been extensively discussed as an aid to guide the therapy by predicting disease outcome for the patients suffering from complex diseases, such as cancer. However, prediction models built upon single-gene (SG) features show poor stability and performance on independent ... on their molecular subtypes can provide a detailed view of the disease and lead to more personalized therapies. We propose and discuss a novel MG approach based on de novo pathways, which for the first time have been used as features in a multi-class setting to predict cancer subtypes. Comprehensive evaluation in a large cohort of breast cancer samples from The Cancer Genome Atlas (TCGA) revealed that MGs are considerably more stable than SG models, while also providing valuable insight into the cancer hallmarks that drive them. In addition, when tested on an independent benchmark non-TCGA dataset, MG features ...

  16. Time-Predictable Computer Architecture

    Directory of Open Access Journals (Sweden)

    Schoeberl Martin

    2009-01-01

    Full Text Available Today's general-purpose processors are optimized for maximum throughput. Real-time systems need a processor with both a reasonable and a known worst-case execution time (WCET). Features such as pipelines with instruction dependencies, caches, branch prediction, and out-of-order execution complicate WCET analysis and lead to very conservative estimates. In this paper, we evaluate the issues of current architectures with respect to WCET analysis. Then, we propose solutions for a time-predictable computer architecture. The proposed architecture is evaluated with implementation of some features in a Java processor. The resulting processor is a good target for WCET analysis and still performs well in the average case.

  17. Soft Computing Methods for Disulfide Connectivity Prediction.

    Science.gov (United States)

    Márquez-Chamorro, Alfonso E; Aguilar-Ruiz, Jesús S

    2015-01-01

    The problem of protein structure prediction (PSP) is one of the main challenges in structural bioinformatics. To tackle this problem, PSP can be divided into several subproblems. One of these subproblems is the prediction of disulfide bonds. The disulfide connectivity prediction problem consists in identifying which nonadjacent cysteines would be cross-linked from all possible candidates. Determining the disulfide bond connectivity between the cysteines of a protein is desirable as a previous step of the 3D PSP, as the protein conformational search space is highly reduced. The most representative soft computing approaches for the disulfide bonds connectivity prediction problem of the last decade are summarized in this paper. Certain aspects, such as the different methodologies based on soft computing approaches (artificial neural network or support vector machine) or features of the algorithms, are used for the classification of these methods.

  18. Brain systems for probabilistic and dynamic prediction: computational specificity and integration.

    Directory of Open Access Journals (Sweden)

    Jill X O'Reilly

    2013-09-01

    Full Text Available A computational approach to functional specialization suggests that brain systems can be characterized in terms of the types of computations they perform, rather than their sensory or behavioral domains. We contrasted the neural systems associated with two computationally distinct forms of predictive model: a reinforcement-learning model of the environment obtained through experience with discrete events, and continuous dynamic forward modeling. By manipulating the precision with which each type of prediction could be used, we caused participants to shift computational strategies within a single spatial prediction task. Hence (using fMRI) we showed that activity in two brain systems (typically associated with reward learning and motor control) could be dissociated in terms of the forms of computations that were performed there, even when both systems were used to make parallel predictions of the same event. A region in parietal cortex, which was sensitive to the divergence between the predictions of the models and anatomically connected to both computational networks, is proposed to mediate integration of the two predictive modes to produce a single behavioral output.

  19. RNA secondary structure prediction using soft computing.

    Science.gov (United States)

    Ray, Shubhra Sankar; Pal, Sankar K

    2013-01-01

    Prediction of RNA structure is invaluable in creating new drugs and understanding genetic diseases. Several deterministic algorithms and soft computing-based techniques have been developed for more than a decade to determine the structure from a known RNA sequence. Soft computing gained importance with the need to get approximate solutions for RNA sequences by considering the issues related with kinetic effects, cotranscriptional folding, and estimation of certain energy parameters. A brief description of some of the soft computing-based techniques, developed for RNA secondary structure prediction, is presented along with their relevance. The basic concepts of RNA and its different structural elements like helix, bulge, hairpin loop, internal loop, and multiloop are described. These are followed by different methodologies, employing genetic algorithms, artificial neural networks, and fuzzy logic. The role of various metaheuristics, like simulated annealing, particle swarm optimization, ant colony optimization, and tabu search is also discussed. A relative comparison among different techniques, in predicting 12 known RNA secondary structures, is presented, as an example. Future challenging issues are then mentioned.

  20. Ensemble Architecture for Prediction of Enzyme-ligand Binding Residues Using Evolutionary Information.

    Science.gov (United States)

    Pai, Priyadarshini P; Dattatreya, Rohit Kadam; Mondal, Sukanta

    2017-11-01

    Enzyme interactions with ligands are crucial for various biochemical reactions governing life. Over many years attempts to identify these residues for biotechnological manipulations have been made using experimental and computational techniques. The computational approaches have gathered impetus with the accruing availability of sequence and structure information, broadly classified into template-based and de novo methods. One of the predominant de novo methods using sequence information involves application of biological properties for supervised machine learning. Here, we propose a support vector machines-based ensemble for prediction of protein-ligand interacting residues using one of the most important discriminative contributing properties in the interacting residue neighbourhood, i.e., evolutionary information in the form of a position-specific scoring matrix (PSSM). The study has been performed on a non-redundant dataset comprising 9269 interacting and 91773 non-interacting residues for prediction model generation and further evaluation. Of the various PSSM-based models explored, the proposed method named ROBBY (pRediction Of Biologically relevant small molecule Binding residues on enzYmes) shows an accuracy of 84.0 %, Matthews Correlation Coefficient of 0.343 and F-measure of 39.0 % on 78 test enzymes. Further, the scope of adding domain knowledge such as pocket information has also been investigated; results showed significant enhancement in method precision. It is hoped that these findings will boost the reliability of small-molecule ligand interaction prediction for enzyme applications and drug design. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
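    As a rough illustration of this kind of PSSM-window classification (not the actual ROBBY implementation), the sketch below builds sliding-window PSSM features and trains a support vector machine with scikit-learn; the window size, the class weighting, and the synthetic stand-in data are assumptions.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics import matthews_corrcoef
from sklearn.model_selection import train_test_split

def window_features(pssm, idx, w=7):
    """Flatten the PSSM rows in a window of 2*w+1 residues centred on idx,
    zero-padding beyond the sequence ends."""
    L, d = pssm.shape
    rows = [pssm[j] if 0 <= j < L else np.zeros(d) for j in range(idx - w, idx + w + 1)]
    return np.concatenate(rows)

# Synthetic stand-in for a real PSSM (L x 20) and per-residue labels (1 = interacting).
rng = np.random.default_rng(0)
pssm = rng.normal(size=(300, 20))
labels = (rng.random(300) < 0.1).astype(int)   # interacting residues are rare

X = np.array([window_features(pssm, i) for i in range(len(labels))])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=0)

clf = SVC(kernel="rbf", class_weight="balanced")  # compensate for class imbalance
clf.fit(X_tr, y_tr)
print("MCC:", matthews_corrcoef(y_te, clf.predict(X_te)))
```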

  1. Robust de novo pathway enrichment with KeyPathwayMiner 5

    DEFF Research Database (Denmark)

    Alcaraz, Nicolas; List, Markus; Dissing-Hansen, Martin

    2016-01-01

    Identifying functional modules or novel active pathways, recently termed de novo pathway enrichment, is a computational systems biology challenge that has gained much attention during the last decade. Given a large biological interaction network, KeyPathwayMiner extracts connected subnetworks tha...

  2. ''de novo'' aneurysms following endovascular procedures

    International Nuclear Information System (INIS)

    Briganti, F.; Cirillo, S.; Caranci, F.; Esposito, F.; Maiuri, F.

    2002-01-01

    Two personal cases of ''de novo'' aneurysms of the anterior communicating artery (ACoA) occurring 9 and 4 years, respectively, after endovascular carotid occlusion are described. A review of the 30 reported cases (including our own two) of ''de novo'' aneurysms after occlusion of the major cerebral vessels has shown some features, including a rather long time interval after the endovascular procedure of up to 20-25 years (average 9.6 years), a preferential ACoA (36.3%) and internal carotid artery-posterior communicating artery (ICA-PCoA) (33.3%) location of the ''de novo'' aneurysms, and a 10% rate of multiple aneurysms. These data are compared with those of the group of reported spontaneous ''de novo'' aneurysms after SAH or previous aneurysm clipping. We agree that the frequency of ''de novo'' aneurysms after major-vessel occlusion (two among ten procedures in our series, or 20%) is higher than commonly reported (0 to 11%). For this reason, we suggest that patients who have been submitted to endovascular major-vessel occlusion be followed up for up to 20-25 years after the procedure, using non-invasive imaging studies such as MR angiography and high-resolution CT angiography. On the other hand, periodic digital angiography has a questionable risk-benefit ratio; it may be used when a ''de novo'' aneurysm is detected or suspected on non-invasive studies. The progressive enlargement of the ACoA after carotid occlusion, as described in our case 1, must be considered a radiological finding of risk for ''de novo'' aneurysm formation. (orig.)

  3. A comparative analysis of soft computing techniques for gene prediction.

    Science.gov (United States)

    Goel, Neelam; Singh, Shailendra; Aseri, Trilok Chand

    2013-07-01

    The rapid growth of genomic sequence data for both human and nonhuman species has made analyzing these sequences, especially predicting genes in them, very important and is currently the focus of many research efforts. Beside its scientific interest in the molecular biology and genomics community, gene prediction is of considerable importance in human health and medicine. A variety of gene prediction techniques have been developed for eukaryotes over the past few years. This article reviews and analyzes the application of certain soft computing techniques in gene prediction. First, the problem of gene prediction and its challenges are described. These are followed by different soft computing techniques along with their application to gene prediction. In addition, a comparative analysis of different soft computing techniques for gene prediction is given. Finally some limitations of the current research activities and future research directions are provided. Copyright © 2013 Elsevier Inc. All rights reserved.

  4. The limits of de novo DNA motif discovery.

    Directory of Open Access Journals (Sweden)

    David Simcha

    Full Text Available A major challenge in molecular biology is reverse-engineering the cis-regulatory logic that plays a major role in the control of gene expression. This program includes searching through DNA sequences to identify "motifs" that serve as the binding sites for transcription factors or, more generally, are predictive of gene expression across cellular conditions. Several approaches have been proposed for de novo motif discovery-searching sequences without prior knowledge of binding sites or nucleotide patterns. However, unbiased validation is not straightforward. We consider two approaches to unbiased validation of discovered motifs: testing the statistical significance of a motif using a DNA "background" sequence model to represent the null hypothesis and measuring performance in predicting membership in gene clusters. We demonstrate that the background models typically used are "too null," resulting in overly optimistic assessments of significance, and argue that performance in predicting TF binding or expression patterns from DNA motifs should be assessed by held-out data, as in predictive learning. Applying this criterion to common motif discovery methods resulted in universally poor performance, although there is a marked improvement when motifs are statistically significant against real background sequences. Moreover, on synthetic data where "ground truth" is known, discriminative performance of all algorithms is far below the theoretical upper bound, with pronounced "over-fitting" in training. A key conclusion from this work is that the failure of de novo discovery approaches to accurately identify motifs is basically due to statistical intractability resulting from the fixed size of co-regulated gene clusters, and thus such failures do not necessarily provide evidence that unfound motifs are not active biologically. Consequently, the use of prior knowledge to enhance motif discovery is not just advantageous but necessary. An implementation of

  5. Real-time Tsunami Inundation Prediction Using High Performance Computers

    Science.gov (United States)

    Oishi, Y.; Imamura, F.; Sugawara, D.

    2014-12-01

    Recently off-shore tsunami observation stations based on cabled ocean bottom pressure gauges are actively being deployed, especially in Japan. These cabled systems are designed to provide real-time tsunami data before tsunamis reach coastlines for disaster mitigation purposes. To receive real benefits of these observations, real-time analysis techniques to make an effective use of these data are necessary. A representative study was made by Tsushima et al. (2009), who proposed a method to provide instant tsunami source prediction based on observed tsunami waveform data. As time passes, the prediction is improved by using updated waveform data. After a tsunami source is predicted, tsunami waveforms are synthesized from pre-computed tsunami Green functions of linear long wave equations. Tsushima et al. (2014) updated the method by combining the tsunami waveform inversion with an instant inversion of coseismic crustal deformation and improved the prediction accuracy and speed in the early stages. For disaster mitigation purposes, real-time predictions of tsunami inundation are also important. In this study, we discuss the possibility of real-time tsunami inundation predictions, which require faster-than-real-time tsunami inundation simulation in addition to instant tsunami source analysis. Although the computational amount is large to solve non-linear shallow water equations for inundation predictions, it has become executable through the recent developments of high performance computing technologies. We conducted parallel computations of tsunami inundation and achieved 6.0 TFLOPS by using 19,000 CPU cores. We employed a leap-frog finite difference method with nested staggered grids whose resolution ranges from 405 m to 5 m. The resolution ratio of each nested domain was 1/3. The total number of grid points was 13 million, and the time step was 0.1 seconds. Tsunami sources of the 2011 Tohoku-oki earthquake were tested. The inundation prediction up to 2 hours after the
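    The leap-frog time stepping mentioned in the abstract can be illustrated in one dimension for the linear long-wave equations on a staggered grid. This toy version (uniform depth, a single grid, no nesting or non-linear terms) is only meant to show the structure of the scheme, not the actual simulation code.

```python
import numpy as np

g, h0 = 9.81, 4000.0        # gravity [m/s^2], uniform ocean depth [m]
dx, dt = 405.0, 0.1         # coarsest grid spacing [m] and time step [s]
nx = 2000

eta = np.zeros(nx)          # free-surface elevation at cell centres
M = np.zeros(nx + 1)        # depth-integrated flux at cell edges
eta[nx // 2 - 5: nx // 2 + 5] = 1.0   # a small hump as a toy tsunami source

for step in range(5000):
    # Continuity: update the elevation from the divergence of the flux.
    eta -= dt / dx * (M[1:] - M[:-1])
    # Momentum (linear long wave): update interior fluxes from the surface slope.
    M[1:-1] -= g * h0 * dt / dx * (eta[1:] - eta[:-1])

print("max elevation after stepping:", eta.max())
```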

  6. LTRsift: a graphical user interface for semi-automatic classification and postprocessing of de novo detected LTR retrotransposons

    Directory of Open Access Journals (Sweden)

    Steinbiss Sascha

    2012-11-01

    Full Text Available Abstract Background Long terminal repeat (LTR) retrotransposons are a class of eukaryotic mobile elements characterized by a distinctive sequence similarity-based structure. Hence they are well suited for computational identification. Current software allows for a comprehensive genome-wide de novo detection of such elements. The obvious next step is the classification of newly detected candidates resulting in (super-)families. Such a de novo classification approach based on sequence-based clustering of transposon features has been proposed before, resulting in a preliminary assignment of candidates to families as a basis for subsequent manual refinement. However, such a classification workflow is typically split across a heterogeneous set of glue scripts and generic software (for example, spreadsheets), making it tedious for a human expert to inspect, curate and export the putative families produced by the workflow. Results We have developed LTRsift, an interactive graphical software tool for semi-automatic postprocessing of de novo predicted LTR retrotransposon annotations. Its user-friendly interface offers customizable filtering and classification functionality, displaying the putative candidate groups, their members and their internal structure in a hierarchical fashion. To ease manual work, it also supports graphical user interface-driven reassignment, splitting and further annotation of candidates. Export of grouped candidate sets in standard formats is possible. In two case studies, we demonstrate how LTRsift can be employed in the context of a genome-wide LTR retrotransposon survey effort. Conclusions LTRsift is a useful and convenient tool for semi-automated classification of newly detected LTR retrotransposons based on their internal features. Its efficient implementation allows for convenient and seamless filtering and classification in an integrated environment. Developed for life scientists, it is helpful in postprocessing and refining

  7. De novo-based transcriptome profiling of male-sterile and fertile watermelon lines.

    Science.gov (United States)

    Rhee, Sun-Ju; Kwon, Taehyung; Seo, Minseok; Jang, Yoon Jeong; Sim, Tae Yong; Cho, Seoae; Han, Sang-Wook; Lee, Gung Pyo

    2017-01-01

    The whole-genome sequence of watermelon (Citrullus lanatus (Thunb.) Matsum. & Nakai), a valuable horticultural crop worldwide, was released in 2013. Here, we compared a de novo-based approach (DBA) to a reference-based approach (RBA) using RNA-seq data, to aid in efforts to improve the annotation of the watermelon reference genome and to obtain biological insight into male-sterility in watermelon. We applied these techniques to available data from two watermelon lines: the male-sterile line DAH3615-MS and the male-fertile line DAH3615. Using DBA, we newly annotated 855 watermelon transcripts, and found gene functional clusters predicted to be related to stimulus responses, nucleic acid binding, transmembrane transport, homeostasis, and Golgi/vesicles. Among the DBA-annotated transcripts, 138 de novo-exclusive differentially-expressed genes (DEDEGs) related to male sterility were detected. Out of 33 randomly selected newly annotated transcripts and DEDEGs, 32 were validated by RT-qPCR. This study demonstrates the usefulness and reliability of the de novo transcriptome assembly in watermelon, and provides new insights for researchers exploring transcriptional blueprints with regard to the male sterility.

  8. De Novo Glutamine Synthesis

    Science.gov (United States)

    He, Qiao; Shi, Xinchong; Zhang, Linqi; Yi, Chang; Zhang, Xuezhen

    2016-01-01

    Purpose: The aim of this study was to investigate the role of de novo glutamine (Gln) synthesis in the proliferation of C6 glioma cells and its detection with 13N-ammonia. Methods: Chronic Gln-deprived C6 glioma (0.06C6) cells were established. The proliferation rates of C6 and 0.06C6 cells were measured under the conditions of Gln deprivation along with or without the addition of ammonia or glutamine synthetase (GS) inhibitor. 13N-ammonia uptake was assessed in C6 cells by gamma counting and in rats with C6 and 0.06C6 xenografts by micro–positron emission tomography (PET) scanning. The expression of GS in C6 cells and xenografts was assessed by Western blotting and immunohistochemistry, respectively. Results: The Gln-deprived C6 cells showed decreased proliferation ability but had a significant increase in GS expression. Furthermore, we found that a low concentration of ammonia was sufficient to maintain the proliferation of Gln-deprived C6 cells, and 13N-ammonia uptake in C6 cells showed a Gln-dependent decrease, whereas inhibition of GS markedly reduced the proliferation of C6 cells as well as the uptake of 13N-ammonia. Additionally, microPET/computed tomography showed that subcutaneous 0.06C6 xenografts had higher 13N-ammonia uptake and GS expression in contrast to C6 xenografts. Conclusion: De novo Gln synthesis through the ammonia–glutamate reaction plays an important role in the proliferation of C6 cells. 13N-ammonia can be a potential metabolic PET tracer for Gln-dependent tumors. PMID:27118759

  9. De novo malignancy after pancreas transplantation in Japan.

    Science.gov (United States)

    Tomimaru, Y; Ito, T; Marubashi, S; Kawamoto, K; Tomokuni, A; Asaoka, T; Wada, H; Eguchi, H; Mori, M; Doki, Y; Nagano, H

    2015-04-01

    Long-term immunosuppression is associated with an increased risk of cancer. In particular, immunosuppression in pancreas transplantation is more intensive than in other organ transplantation because of the graft's strong immunogenicity. This suggests that the risk of post-transplant de novo malignancy might be increased in pancreas transplantation. However, there have been few studies of de novo malignancy after pancreas transplantation. The aim of this study was to analyze the incidence of de novo malignancy after pancreas transplantation in Japan. Post-transplant patients with de novo malignancy were surveyed and characterized in Japan. Among 107 cases receiving pancreas transplantation in Japan between 2001 and 2010, de novo malignancy developed in 9 cases (8.4%): post-transplant lymphoproliferative disorders in 6 cases, colon cancer in 1 case, renal cancer in 1 case, and brain tumor in 1 case. We clarified the incidence of de novo malignancy after pancreas transplantation in Japan. Copyright © 2015 Elsevier Inc. All rights reserved.

  10. Crius: A Novel Fragment-Based Algorithm of De Novo Substrate Prediction for Enzymes.

    Science.gov (United States)

    Yao, Zhiqiang; Jiang, Shuiqin; Zhang, Lujia; Gao, Bei; He, Xiao; Zhang, John Z H; Wei, Dongzhi

    2018-05-03

    The study of enzyme substrate specificity is vital for developing potential applications of enzymes. However, the routine experimental procedures require a lot of resources in the discovery of novel substrates. This article reports an in silico structure-based algorithm called Crius, which predicts substrates for an enzyme. The results of this fragment-based algorithm show good agreement between the simulated and experimental substrate specificities, using a lipase from Candida antarctica (CALB), a nitrilase from the cyanobacterium Synechocystis sp. PCC6803 (Nit6803), and an aldo-keto reductase from Gluconobacter oxydans (Gox0644). This opens new prospects of developing computer algorithms that can effectively predict substrates for an enzyme. This article is protected by copyright. All rights reserved. © 2018 The Protein Society.

  11. Computational prediction of chemical reactions: current status and outlook.

    Science.gov (United States)

    Engkvist, Ola; Norrby, Per-Ola; Selmi, Nidhal; Lam, Yu-Hong; Peng, Zhengwei; Sherer, Edward C; Amberg, Willi; Erhard, Thomas; Smyth, Lynette A

    2018-06-01

    Over the past few decades, various computational methods have become increasingly important for discovering and developing novel drugs. Computational prediction of chemical reactions is a key part of an efficient drug discovery process. In this review, we discuss important parts of this field, with a focus on utilizing reaction data to build predictive models, the existing programs for synthesis prediction, and usage of quantum mechanics and molecular mechanics (QM/MM) to explore chemical reactions. We also outline potential future developments with an emphasis on pre-competitive collaboration opportunities. Copyright © 2018 Elsevier Ltd. All rights reserved.

  12. Scaling predictive modeling in drug development with cloud computing.

    Science.gov (United States)

    Moghadam, Behrooz Torabi; Alvarsson, Jonathan; Holm, Marcus; Eklund, Martin; Carlsson, Lars; Spjuth, Ola

    2015-01-26

    Growing data sets with increased time for analysis is hampering predictive modeling in drug discovery. Model building can be carried out on high-performance computer clusters, but these can be expensive to purchase and maintain. We have evaluated ligand-based modeling on cloud computing resources where computations are parallelized and run on the Amazon Elastic Cloud. We trained models on open data sets of varying sizes for the end points logP and Ames mutagenicity and compared with model building parallelized on a traditional high-performance computing cluster. We show that while high-performance computing results in faster model building, the use of cloud computing resources is feasible for large data sets and scales well within cloud instances. An additional advantage of cloud computing is that the costs of predictive models can be easily quantified, and a choice can be made between speed and economy. The easy access to computational resources with no up-front investments makes cloud computing an attractive alternative for scientists, especially for those without access to a supercomputer, and our study shows that it enables cost-efficient modeling of large data sets on demand within reasonable time.
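    The parallelization pattern described here can be sketched generically: independent model-building jobs are farmed out across the cores of one machine or, with the same code structure, across several cloud instances. The sketch below is an illustration of that pattern, not the authors' pipeline; the synthetic data stand in for molecular descriptors and measured logP values.

```python
from joblib import Parallel, delayed
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for a descriptor table and a measured end point.
X, y = make_regression(n_samples=2000, n_features=50, noise=0.5, random_state=0)

def build_model(n_estimators):
    """Train one candidate model and return its cross-validated R^2."""
    model = RandomForestRegressor(n_estimators=n_estimators, n_jobs=1, random_state=0)
    return n_estimators, cross_val_score(model, X, y, cv=5).mean()

# Each parameter setting is built in parallel; the same pattern maps onto
# multiple cores of a single cloud instance or onto several instances.
results = Parallel(n_jobs=-1)(delayed(build_model)(n) for n in (50, 100, 200))
print(results)
```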

  13. PSPP: a protein structure prediction pipeline for computing clusters.

    Directory of Open Access Journals (Sweden)

    Michael S Lee

    2009-07-01

    Full Text Available Protein structures are critical for understanding the mechanisms of biological systems and, subsequently, for drug and vaccine design. Unfortunately, protein sequence data exceed structural data by a factor of more than 200 to 1. This gap can be partially filled by using computational protein structure prediction. While structure prediction Web servers are a notable option, they often restrict the number of sequence queries and/or provide a limited set of prediction methodologies. Therefore, we present a standalone protein structure prediction software package suitable for high-throughput structural genomic applications that performs all three classes of prediction methodologies: comparative modeling, fold recognition, and ab initio. This software can be deployed on a user's own high-performance computing cluster. The pipeline consists of a Perl core that integrates more than 20 individual software packages and databases, most of which are freely available from other research laboratories. The query protein sequences are first divided into domains either by domain boundary recognition or Bayesian statistics. The structures of the individual domains are then predicted using template-based modeling or ab initio modeling. The predicted models are scored with a statistical potential and an all-atom force field. The top-scoring ab initio models are annotated by structural comparison against the Structural Classification of Proteins (SCOP) fold database. Furthermore, secondary structure, solvent accessibility, transmembrane helices, and structural disorder are predicted. The results are generated in text, tab-delimited, and hypertext markup language (HTML) formats. So far, the pipeline has been used to study viral and bacterial proteomes. The standalone pipeline that we introduce here, unlike protein structure prediction Web servers, allows users to devote their own computing assets to process a potentially unlimited number of queries as well as perform

  14. NovoPen Echo® insulin delivery device

    Directory of Open Access Journals (Sweden)

    Hyllested-Winge J

    2016-01-01

    Full Text Available Jacob Hyllested-Winge,1 Thomas Sparre,2 Line Kynemund Pedersen2 1Novo Nordisk Pharma Ltd, Tokyo, Japan; 2Novo Nordisk A/S, Søborg, Denmark Abstract: The introduction of insulin pen devices has provided easier, well-tolerated, and more convenient treatment regimens for patients with diabetes mellitus. When compared with vial and syringe regimens, insulin pens offer a greater clinical efficacy, improved quality of life, and increased dosing accuracy, particularly at low doses. The portable and discreet nature of pen devices reduces the burden on the patient, facilitates adherence, and subsequently contributes to the improvement in glycemic control. NovoPen Echo® is one of the latest members of the NovoPen® family that has been specifically designed for the pediatric population and is the first to combine half-unit increment (=0.5 U) insulin dosing with a simple memory function. The half-unit increment dosing amendments and accurate injection of 0.5 U of insulin are particularly beneficial for children (and insulin-sensitive adults/elders), who often require small insulin doses. The memory function can be used to record the time and amount of the last dose, reducing the fear of double dosing or missing a dose. The memory function also provides parents with extra confidence and security that their child is taking insulin at the correct doses and times. NovoPen Echo is a lightweight, durable insulin delivery pen; it is available in two different colors, which may help to distinguish between different types of insulin, providing more confidence for both users and caregivers. Studies have demonstrated a high level of patient satisfaction, with 80% of users preferring NovoPen Echo to other pediatric insulin pens. Keywords: NovoPen Echo®, memory function, half-unit increment dosing, adherence, children, adolescents

  15. Web Access to Digitised Content of the Exhibition Novo Mesto 1848-1918 at the Dolenjska Museum, Novo Mesto

    Directory of Open Access Journals (Sweden)

    Majda Pungerčar

    2013-09-01

    individually or in a group of visitors as a quick test at the end of a visit to the exhibition. The computer game Cray Fish Hunting on the River of Krka was tailored to young visitors. It described the geographical position of Novo mesto on the Krka peninsula and some objects on the riverside (wooden baths, the old bridge, the dam and the mill). Hunting noble crayfish was very extensive in the Novo mesto region until the outbreak of the crayfish plague. It is said that up to 100,000 crayfish were caught annually and transported to all major towns of the then Austrian Empire. The original museum collections were preserved by scanning precious items which were kept in depots. Digitised materials accessible on the museum website met the needs of the majority of users and provided expanded access to the museum collection through the presentation of graphic materials in a virtual exhibition on the web, which was much more comprehensive than a printed catalogue. There was also the possibility of remote access to the museum website, which was very important for tourists and museum visitors with special needs. It was possible to design customer-tailored presentations on specific themes (postcards from the period of national awakening, local societies, middle-class garments, etc.). Digital content was also available to school teachers to discuss the themes of interest with their pupils or students either before or after visiting the exhibition. There are many pitfalls and limitations of digitisation, the two biggest being staff shortages and a shrinking budget. Furthermore, there are difficulties with the technical support, gathering information on the implementation process, seeking partners to digitise materials, etc. Besides, the question of long-term digital archiving is of vital importance.

  16. A systematic investigation of computation models for predicting Adverse Drug Reactions (ADRs).

    Science.gov (United States)

    Kuang, Qifan; Wang, MinQi; Li, Rong; Dong, YongCheng; Li, Yizhou; Li, Menglong

    2014-01-01

    Early and accurate identification of adverse drug reactions (ADRs) is critically important for drug development and clinical safety. Computer-aided prediction of ADRs has attracted increasing attention in recent years, and many computational models have been proposed. However, because of the lack of systematic analysis and comparison of the different computational models, there remain limitations in designing more effective algorithms and selecting more useful features. There is therefore an urgent need to review and analyze previous computation models to obtain general conclusions that can provide useful guidance to construct more effective computational models to predict ADRs. In the current study, the main work is to compare and analyze the performance of existing computational methods to predict ADRs, by implementing and evaluating additional algorithms that have earlier been used for predicting drug targets. Our results indicated that topological and intrinsic features were complementary to an extent and the Jaccard coefficient had an important and general effect on the prediction of drug-ADR associations. By comparing the structure of each algorithm, the final formulas of these algorithms could all be converted to a linear model in form; based on this finding, we propose a new algorithm called the general weighted profile method, and it yielded the best overall performance among the algorithms investigated in this paper. Several meaningful conclusions and useful findings regarding the prediction of ADRs are provided for selecting optimal features and algorithms.
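    To make the role of the Jaccard coefficient concrete, the toy sketch below scores a candidate drug–ADR association by a similarity-weighted vote over drugs with known ADR status, in the spirit of a weighted-profile method. The binary fingerprints and the exact scoring rule are illustrative assumptions rather than the formulas from the paper.

```python
import numpy as np

def jaccard(a, b):
    """Jaccard coefficient of two binary feature vectors."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    union = np.logical_or(a, b).sum()
    return np.logical_and(a, b).sum() / union if union else 0.0

def score_adr(query_fp, known_fps, known_adr_labels):
    """Similarity-weighted vote of drugs with known ADR status.

    known_fps: binary fingerprints of drugs with known ADR annotations;
    known_adr_labels: 1 if the drug causes the ADR of interest, else 0.
    """
    sims = np.array([jaccard(query_fp, fp) for fp in known_fps])
    if sims.sum() == 0:
        return 0.0
    return float(np.dot(sims, known_adr_labels) / sims.sum())

# Toy example with 4-bit structural fingerprints.
known = [[1, 1, 0, 0], [1, 0, 1, 0], [0, 0, 1, 1]]
labels = [1, 1, 0]
print(score_adr([1, 1, 1, 0], known, labels))
```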

  17. Prediction of Software Reliability using Bio Inspired Soft Computing Techniques.

    Science.gov (United States)

    Diwaker, Chander; Tomar, Pradeep; Poonia, Ramesh C; Singh, Vijander

    2018-04-10

    Many models have been developed for predicting software reliability, but these models are restricted to particular types of methodologies and a restricted number of parameters. There are a number of techniques and methodologies that may be used for reliability prediction, and there is a need to focus on parameter selection when estimating reliability. The reliability of a system may increase or decrease depending on the parameters selected; thus there is a need to identify the factors that most heavily affect the reliability of the system. Reusability is now widely used across research areas and is the basis of Component-Based Systems (CBS). Cost, time and human skill can be saved using Component-Based Software Engineering (CBSE) concepts, and CBSE metrics may be used to assess which techniques are more suitable for estimating system reliability. Soft computing is used for small as well as large-scale problems where it is difficult to find accurate results due to uncertainty or randomness. Several possibilities exist for applying soft computing techniques to medicine-related problems: clinical medicine makes significant use of fuzzy-logic and neural-network methodologies, while basic medical science most frequently and preferably uses neural networks combined with genetic algorithms. Medical scientists have shown considerable interest in using the various soft computing methodologies in the genetics, physiology, radiology, cardiology and neurology disciplines. CBSE encourages users to reuse past and existing software when making new products, providing quality with a saving of time, memory space, and money. This paper focuses on the assessment of commonly used soft computing techniques such as the Genetic Algorithm (GA), Neural Network (NN), Fuzzy Logic, Support Vector Machine (SVM), Ant Colony Optimization (ACO), Particle Swarm Optimization (PSO), and Artificial Bee Colony (ABC). This paper presents the working of soft computing

  18. Analytical predictions of SGEMP response and comparisons with computer calculations

    International Nuclear Information System (INIS)

    de Plomb, E.P.

    1976-01-01

    An analytical formulation for the prediction of SGEMP surface current response is presented. Only two independent dimensionless parameters are required to predict the peak magnitude and rise time of SGEMP induced surface currents. The analysis applies to limited (high fluence) emission as well as unlimited (low fluence) emission. Cause-effect relationships for SGEMP response are treated quantitatively, and yield simple power law dependencies between several physical variables. Analytical predictions for a large matrix of SGEMP cases are compared with an array of about thirty-five computer solutions of similar SGEMP problems, which were collected from three independent research groups. The theoretical solutions generally agree with the computer solutions as well as the computer solutions agree with one another. Such comparisons typically show variations less than a ''factor of two.''

  19. Three-dimensional protein structure prediction: Methods and computational strategies.

    Science.gov (United States)

    Dorn, Márcio; E Silva, Mariel Barbachan; Buriol, Luciana S; Lamb, Luis C

    2014-10-12

    A long-standing problem in structural bioinformatics is to determine the three-dimensional (3-D) structure of a protein when only a sequence of amino acid residues is given. Many computational methodologies and algorithms have been proposed as a solution to the 3-D Protein Structure Prediction (3-D-PSP) problem. These methods can be divided into four main classes: (a) first principle methods without database information; (b) first principle methods with database information; (c) fold recognition and threading methods; and (d) comparative modeling methods and sequence alignment strategies. Deterministic computational techniques, optimization techniques, data mining and machine learning approaches are typically used in the construction of computational solutions for the PSP problem. Our main goal with this work is to review the methods and computational strategies that are currently used in 3-D protein prediction. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. A large-scale evaluation of computational protein function prediction

    NARCIS (Netherlands)

    Radivojac, P.; Clark, W.T.; Oron, T.R.; Schnoes, A.M.; Wittkop, T.; Kourmpetis, Y.A.I.; Dijk, van A.D.J.; Friedberg, I.

    2013-01-01

    Automated annotation of protein function is challenging. As the number of sequenced genomes rapidly grows, the overwhelming majority of protein products can only be annotated computationally. If computational predictions are to be relied upon, it is crucial that the accuracy of these methods be

  1. Algorithm for selection of optimized EPR distance restraints for de novo protein structure determination

    Science.gov (United States)

    Kazmier, Kelli; Alexander, Nathan S.; Meiler, Jens; Mchaourab, Hassane S.

    2010-01-01

    A hybrid protein structure determination approach combining sparse Electron Paramagnetic Resonance (EPR) distance restraints and Rosetta de novo protein folding has been previously demonstrated to yield high quality models (Alexander et al., 2008). However, widespread application of this methodology to proteins of unknown structure is hindered by the lack of a general strategy to place spin label pairs in the primary sequence. In this work, we report the development of an algorithm that optimally selects spin labeling positions for the purpose of distance measurements by EPR. For the α-helical subdomain of T4 lysozyme (T4L), simulated restraints that maximize sequence separation between the two spin labels while simultaneously ensuring pairwise connectivity of secondary structure elements yielded vastly improved models by Rosetta folding. 50% of all these models have the correct fold compared to only 21% and 8% correctly folded models when randomly placed restraints or no restraints are used, respectively. Moreover, the improvements in model quality require a limited number of optimized restraints, the number of which is determined by the pairwise connectivities of T4L α-helices. The predicted improvement in Rosetta model quality was verified by experimental determination of distances between spin label pairs selected by the algorithm. Overall, our results reinforce the rationale for the combined use of sparse EPR distance restraints and de novo folding. By alleviating the experimental bottleneck associated with restraint selection, this algorithm sets the stage for extending computational structure determination to larger, traditionally elusive protein topologies of critical structural and biochemical importance. PMID:21074624
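    A much-simplified sketch of the selection criterion described here (one restraint per pair of secondary-structure elements, choosing the residue pair with the largest sequence separation) is shown below; the helix boundaries are made-up inputs, and the real algorithm involves additional considerations such as label site suitability.

```python
from itertools import combinations

# Hypothetical helix boundaries (start, end) in the primary sequence.
helices = [(5, 20), (30, 45), (55, 70), (80, 95)]

def select_pairs(elements):
    """For every pair of secondary-structure elements, pick one residue pair
    that maximizes sequence separation between the two labels."""
    restraints = []
    for (s1, e1), (s2, e2) in combinations(elements, 2):
        # Candidate label positions on the two elements; keep the farthest apart.
        candidates = [(s1, e2), (e1, s2), (s1, s2), (e1, e2)]
        best = max(candidates, key=lambda p: abs(p[1] - p[0]))
        restraints.append(best)
    return restraints

print(select_pairs(helices))   # one restraint per helix pair, ensuring connectivity
```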

  2. A systematic investigation of computation models for predicting Adverse Drug Reactions (ADRs).

    Directory of Open Access Journals (Sweden)

    Qifan Kuang

    Full Text Available Early and accurate identification of adverse drug reactions (ADRs) is critically important for drug development and clinical safety. Computer-aided prediction of ADRs has attracted increasing attention in recent years, and many computational models have been proposed. However, because of the lack of systematic analysis and comparison of the different computational models, there remain limitations in designing more effective algorithms and selecting more useful features. There is therefore an urgent need to review and analyze previous computation models to obtain general conclusions that can provide useful guidance to construct more effective computational models to predict ADRs. In the current study, the main work is to compare and analyze the performance of existing computational methods to predict ADRs, by implementing and evaluating additional algorithms that have earlier been used for predicting drug targets. Our results indicated that topological and intrinsic features were complementary to an extent and the Jaccard coefficient had an important and general effect on the prediction of drug-ADR associations. By comparing the structure of each algorithm, the final formulas of these algorithms could all be converted to a linear model in form; based on this finding, we propose a new algorithm called the general weighted profile method, and it yielded the best overall performance among the algorithms investigated in this paper. Several meaningful conclusions and useful findings regarding the prediction of ADRs are provided for selecting optimal features and algorithms.

  3. Predictive Behavior of a Computational Foot/Ankle Model through Artificial Neural Networks.

    Science.gov (United States)

    Chande, Ruchi D; Hargraves, Rosalyn Hobson; Ortiz-Robinson, Norma; Wayne, Jennifer S

    2017-01-01

    Computational models are useful tools to study the biomechanics of human joints. Their predictive performance is heavily dependent on bony anatomy and soft tissue properties. Imaging data provides anatomical requirements while approximate tissue properties are implemented from literature data, when available. We sought to improve the predictive capability of a computational foot/ankle model by optimizing its ligament stiffness inputs using feedforward and radial basis function neural networks. While the former demonstrated better performance than the latter per mean square error, both networks provided reasonable stiffness predictions for implementation into the computational model.
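    As a schematic of the network-based stiffness optimization described above (not the authors' actual pipeline), the sketch below trains a feedforward network to recover ligament stiffnesses from joint kinematics on synthetic data; the data shapes and the linear toy relationship are assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical training data: each row pairs model-predicted kinematics
# (e.g., joint angles/displacements) with the ligament stiffnesses that produced them.
rng = np.random.default_rng(0)
stiffness = rng.uniform(20.0, 200.0, size=(500, 8))      # 8 ligaments [N/mm]
kinematics = stiffness @ rng.normal(size=(8, 5)) + rng.normal(scale=0.1, size=(500, 5))

# Learn the inverse mapping: kinematics -> stiffness.
net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
net.fit(kinematics, stiffness)

# Given kinematics measured experimentally, predict the stiffness inputs
# to feed back into the computational foot/ankle model.
measured = kinematics[:1]
print(net.predict(measured))
```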

  4. A prognostic scoring model for survival after locoregional therapy in de novo stage IV breast cancer.

    Science.gov (United States)

    Kommalapati, Anuhya; Tella, Sri Harsha; Goyal, Gaurav; Ganti, Apar Kishor; Krishnamurthy, Jairam; Tandra, Pavan Kumar

    2018-05-02

    The role of locoregional treatment (LRT) remains controversial in de novo stage IV breast cancer (BC). We sought to analyze the role of LRT and prognostic factors of overall survival (OS) in de novo stage IV BC patients treated with LRT utilizing the National Cancer Data Base (NCDB). The objective of the current study is to create and internally validate a prognostic scoring model to predict the long-term OS for de novo stage IV BC patients treated with LRT. We included de novo stage IV BC patients reported to the NCDB between 2004 and 2015. Patients were divided into LRT and no-LRT subsets. We randomized the LRT subset into training and validation cohorts. In the training cohort, a seventeen-point prognostic scoring system was developed based on the hazard ratios calculated using the Cox proportional hazards method. We stratified both training and validation cohorts into two "groups" [group 1 (0-7 points) and group 2 (7-17 points)]. The Kaplan-Meier method and log-rank test were used to compare OS between the two groups. Our prognostic score was validated internally by comparing the OS between the respective groups in both the training and validation cohorts. Among 67,978 patients, the LRT subset (21,200 patients) had better median OS than the no-LRT subset (45 vs. 24 months; p < 0.0001). Group 1 and group 2 in the training cohort showed a significant difference in the 3-year OS (p < 0.0001) (68 vs. 26%). On internal validation, comparable OS was seen between the respective groups in each cohort (p = 0.77). Our prognostic scoring system will help oncologists to predict the prognosis in de novo stage IV BC patients treated with LRT. Although firm treatment-related conclusions cannot be made due to the retrospective nature of the study, LRT appears to be associated with a better OS in specific subgroups.
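    The general scoring idea (points derived from Cox hazard ratios, summed and thresholded into risk groups) can be sketched as follows. The factors, hazard ratios, point formula, and cutoff are invented for illustration and are not the seventeen-point system from the paper.

```python
import numpy as np

# Hypothetical hazard ratios from a Cox model for a few prognostic factors.
hazard_ratios = {"triple_negative": 2.6, "lung_met": 1.8, "liver_met": 1.7,
                 "grade_3": 1.5, "no_chemo": 2.1}

# One common way to build an additive score: points proportional to log(HR),
# rounded to integers so they can be summed easily.
points = {k: max(1, int(round(2 * np.log(hr)))) for k, hr in hazard_ratios.items()}

def risk_group(patient_factors, cutoff=7):
    """Sum the points for the factors a patient has and assign a group."""
    score = sum(points[f] for f in patient_factors if f in points)
    return 1 if score <= cutoff else 2

print(points)
print(risk_group({"triple_negative", "liver_met"}))
```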

  5. Computer loss experience and predictions

    Science.gov (United States)

    Parker, Donn B.

    1996-03-01

    The types of losses organizations must anticipate have become more difficult to predict because of the eclectic nature of computers and the data communications and the decrease in news media reporting of computer-related losses as they become commonplace. Total business crime is conjectured to be decreasing in frequency and increasing in loss per case as a result of increasing computer use. Computer crimes are probably increasing, however, as their share of the decreasing business crime rate grows. Ultimately all business crime will involve computers in some way, and we could see a decline of both together. The important information security measures in high-loss business crime generally concern controls over authorized people engaged in unauthorized activities. Such controls include authentication of users, analysis of detailed audit records, unannounced audits, segregation of development and production systems and duties, shielding the viewing of screens, and security awareness and motivation controls in high-value transaction areas. Computer crimes that involve highly publicized intriguing computer misuse methods, such as privacy violations, radio frequency emanations eavesdropping, and computer viruses, have been reported in waves that periodically have saturated the news media during the past 20 years. We must be able to anticipate such highly publicized crimes and reduce the impact and embarrassment they cause. On the basis of our most recent experience, I propose nine new types of computer crime to be aware of: computer larceny (theft and burglary of small computers), automated hacking (use of computer programs to intrude), electronic data interchange fraud (business transaction fraud), Trojan bomb extortion and sabotage (code security inserted into others' systems that can be triggered to cause damage), LANarchy (unknown equipment in use), desktop forgery (computerized forgery and counterfeiting of documents), information anarchy (indiscriminate use of

  6. Coupling of EIT with computational lung modeling for predicting patient-specific ventilatory responses.

    Science.gov (United States)

    Roth, Christian J; Becher, Tobias; Frerichs, Inéz; Weiler, Norbert; Wall, Wolfgang A

    2017-04-01

    Providing optimal personalized mechanical ventilation for patients with acute or chronic respiratory failure is still a challenge within a clinical setting for each case anew. In this article, we integrate electrical impedance tomography (EIT) monitoring into a powerful patient-specific computational lung model to create an approach for personalizing protective ventilatory treatment. The underlying computational lung model is based on a single computed tomography scan and able to predict global airflow quantities, as well as local tissue aeration and strains for any ventilation maneuver. For validation, a novel "virtual EIT" module is added to our computational lung model, allowing to simulate EIT images based on the patient's thorax geometry and the results of our numerically predicted tissue aeration. Clinically measured EIT images are not used to calibrate the computational model. Thus they provide an independent method to validate the computational predictions at high temporal resolution. The performance of this coupling approach has been tested in an example patient with acute respiratory distress syndrome. The method shows good agreement between computationally predicted and clinically measured airflow data and EIT images. These results imply that the proposed framework can be used for numerical prediction of patient-specific responses to certain therapeutic measures before applying them to an actual patient. In the long run, definition of patient-specific optimal ventilation protocols might be assisted by computational modeling. NEW & NOTEWORTHY In this work, we present a patient-specific computational lung model that is able to predict global and local ventilatory quantities for a given patient and any selected ventilation protocol. For the first time, such a predictive lung model is equipped with a virtual electrical impedance tomography module allowing real-time validation of the computed results with the patient measurements. First promising results

  7. De Novo Construction of Redox Active Proteins.

    Science.gov (United States)

    Moser, C C; Sheehan, M M; Ennist, N M; Kodali, G; Bialas, C; Englander, M T; Discher, B M; Dutton, P L

    2016-01-01

    Relatively simple principles can be used to plan and construct de novo proteins that bind redox cofactors and participate in a range of electron-transfer reactions analogous to those seen in natural oxidoreductase proteins. These designed redox proteins are called maquettes. Hydrophobic/hydrophilic binary patterning of heptad repeats of amino acids linked together in a single-chain self-assemble into 4-alpha-helix bundles. These bundles form a robust and adaptable frame for uncovering the default properties of protein embedded cofactors independent of the complexities introduced by generations of natural selection and allow us to better understand what factors can be exploited by man or nature to manipulate the physical chemical properties of these cofactors. Anchoring of redox cofactors such as hemes, light active tetrapyrroles, FeS clusters, and flavins by His and Cys residues allow cofactors to be placed at positions in which electron-tunneling rates between cofactors within or between proteins can be predicted in advance. The modularity of heptad repeat designs facilitates the construction of electron-transfer chains and novel combinations of redox cofactors and new redox cofactor assisted functions. Developing de novo designs that can support cofactor incorporation upon expression in a cell is needed to support a synthetic biology advance that integrates with natural bioenergetic pathways. © 2016 Elsevier Inc. All rights reserved.

  8. From structure prediction to genomic screens for novel non-coding RNAs

    DEFF Research Database (Denmark)

    Gorodkin, Jan; Hofacker, Ivo L.

    2011-01-01

    Abstract: Non-coding RNAs (ncRNAs) are receiving more and more attention not only as an abundant class of genes, but also as regulatory structural elements (some located in mRNAs). A key feature of RNA function is its structure. Computational methods were developed early for folding and prediction....... This and the increased amount of available genomes have made it possible to employ structure-based methods for genomic screens. The field has moved from folding prediction of single sequences to computational screens for ncRNAs in genomic sequence using the RNA structure as the main characteristic feature. Whereas early...... upon some of the concepts in current methods that have been applied in genomic screens for de novo RNA structures in searches for novel ncRNA genes and regulatory RNA structure on mRNAs. We discuss the strengths and weaknesses of the different strategies and how they can complement each other....

  9. De novo origin of human protein-coding genes.

    Directory of Open Access Journals (Sweden)

    Dong-Dong Wu

    2011-11-01

    Full Text Available The de novo origin of a new protein-coding gene from non-coding DNA is considered to be a very rare occurrence in genomes. Here we identify 60 new protein-coding genes that originated de novo on the human lineage since divergence from the chimpanzee. The functionality of these genes is supported by both transcriptional and proteomic evidence. RNA-seq data indicate that these genes have their highest expression levels in the cerebral cortex and testes, which might suggest that these genes contribute to phenotypic traits that are unique to humans, such as improved cognitive ability. Our results are inconsistent with the traditional view that the de novo origin of new genes is very rare, thus there should be greater appreciation of the importance of the de novo origination of genes.

  10. De Novo Origin of Human Protein-Coding Genes

    Science.gov (United States)

    Wu, Dong-Dong; Irwin, David M.; Zhang, Ya-Ping

    2011-01-01

    The de novo origin of a new protein-coding gene from non-coding DNA is considered to be a very rare occurrence in genomes. Here we identify 60 new protein-coding genes that originated de novo on the human lineage since divergence from the chimpanzee. The functionality of these genes is supported by both transcriptional and proteomic evidence. RNA–seq data indicate that these genes have their highest expression levels in the cerebral cortex and testes, which might suggest that these genes contribute to phenotypic traits that are unique to humans, such as improved cognitive ability. Our results are inconsistent with the traditional view that the de novo origin of new genes is very rare, thus there should be greater appreciation of the importance of the de novo origination of genes. PMID:22102831

  11. Macromolecular target prediction by self-organizing feature maps.

    Science.gov (United States)

    Schneider, Gisbert; Schneider, Petra

    2017-03-01

    Rational drug discovery would greatly benefit from a more nuanced appreciation of the activity of pharmacologically active compounds against a diverse panel of macromolecular targets. Already, computational target-prediction models assist medicinal chemists in library screening, de novo molecular design, optimization of active chemical agents, drug re-purposing, in the spotting of potential undesired off-target activities, and in the 'de-orphaning' of phenotypic screening hits. The self-organizing map (SOM) algorithm has been employed successfully for these and other purposes. Areas covered: The authors recapitulate contemporary artificial neural network methods for macromolecular target prediction, and present the basic SOM algorithm at a conceptual level. Specifically, they highlight consensus target-scoring by the employment of multiple SOMs, and discuss the opportunities and limitations of this technique. Expert opinion: Self-organizing feature maps represent a straightforward approach to ligand clustering and classification. Some of the appeal lies in their conceptual simplicity and broad applicability domain. Despite known algorithmic shortcomings, this computational target prediction concept has been proven to work in prospective settings with high success rates. It represents a prototypic technique for future advances in the in silico identification of the modes of action and macromolecular targets of bioactive molecules.
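    For orientation, a minimal NumPy version of the basic SOM training loop (best-matching unit search, decaying learning rate, Gaussian neighbourhood) is sketched below; the map size, the decay schedule, and the idea of feeding in ligand descriptor vectors are generic assumptions, not details of the reviewed consensus-scoring systems.

```python
import numpy as np

def train_som(data, rows=10, cols=10, epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    """Train a 2-D self-organizing map on the row vectors in `data`."""
    rng = np.random.default_rng(seed)
    weights = rng.normal(size=(rows, cols, data.shape[1]))
    grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)
    n_steps = epochs * len(data)
    step = 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            # Best-matching unit: the node whose weight vector is closest to x.
            d = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(d), d.shape)
            # Learning rate and neighbourhood radius decay over training.
            frac = step / n_steps
            lr = lr0 * (1.0 - frac)
            sigma = sigma0 * (1.0 - frac) + 1e-3
            # Gaussian neighbourhood around the BMU pulls nearby nodes toward x.
            dist2 = ((grid - np.array(bmu)) ** 2).sum(axis=-1)
            h = np.exp(-dist2 / (2 * sigma ** 2))
            weights += lr * h[..., None] * (x - weights)
            step += 1
    return weights

# Ligands described by, e.g., molecular descriptors map to nearby nodes when
# they are similar, which is the basis for SOM-based target prediction.
# som = train_som(descriptor_matrix)   # descriptor_matrix is hypothetical input
```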

  12. Combined "de novo" and "ex novo" lipid fermentation in a mix-medium of corncob acid hydrolysate and soybean oil by Trichosporon dermatis.

    Science.gov (United States)

    Huang, Chao; Luo, Mu-Tan; Chen, Xue-Fang; Qi, Gao-Xiang; Xiong, Lian; Lin, Xiao-Qing; Wang, Can; Li, Hai-Long; Chen, Xin-De

    2017-01-01

    Microbial oil is an important bio-product owing to its role in the energy, chemical, and food industries. Finding suitable substrates is one key issue for its industrial application. Both hydrophilic and hydrophobic substrates can be utilized by oleaginous microorganisms through two different bio-pathways ("de novo" lipid fermentation and "ex novo" lipid fermentation). To date, most of the research on lipid fermentation has focused mainly on only one fermentation pathway and little work has been carried out on both "de novo" and "ex novo" lipid fermentation simultaneously; thus, the advantages of both lipid fermentations cannot be fulfilled comprehensively. In this study, corncob acid hydrolysate with soybean oil was used as a mix-medium for combined "de novo" and "ex novo" lipid fermentation by the oleaginous yeast Trichosporon dermatis. Both hydrophilic and hydrophobic substrates (sugars and soybean oil) in the medium can be utilized simultaneously and efficiently by T. dermatis. Different fermentation modes were compared and the batch mode was the most suitable for the combined fermentation. The influence of soybean oil concentration, inoculum size, and initial pH on the lipid fermentation was evaluated, and 20 g/L soybean oil, 5% inoculum size, and initial pH 6.0 were suitable for this bioprocess. By this technology, the lipid composition of the extracellular hydrophobic substrate (soybean oil) can be modified. Although adding emulsifier showed little beneficial effect on lipid production, it can modify the intracellular lipid composition of T. dermatis. The present study proves the potential and possibility of combined "de novo" and "ex novo" lipid fermentation. This technology can use hydrophilic and hydrophobic sustainable bio-resources to generate lipid feedstock for the production of biodiesel or other lipid-based chemical compounds and to treat some special wastes such as oil-containing wastewater.

  13. Predictive Behavior of a Computational Foot/Ankle Model through Artificial Neural Networks

    Directory of Open Access Journals (Sweden)

    Ruchi D. Chande

    2017-01-01

    Full Text Available Computational models are useful tools to study the biomechanics of human joints. Their predictive performance is heavily dependent on bony anatomy and soft tissue properties. Imaging data provides anatomical requirements while approximate tissue properties are implemented from literature data, when available. We sought to improve the predictive capability of a computational foot/ankle model by optimizing its ligament stiffness inputs using feedforward and radial basis function neural networks. While the former demonstrated better performance than the latter per mean square error, both networks provided reasonable stiffness predictions for implementation into the computational model.

  14. Genes from scratch--the evolutionary fate of de novo genes.

    Science.gov (United States)

    Schlötterer, Christian

    2015-04-01

    Although considered an extremely unlikely event, many genes emerge from previously noncoding genomic regions. This review covers the entire life cycle of such de novo genes. Two competing hypotheses about the process of de novo gene birth are discussed as well as the high death rate of de novo genes. Despite the high death rate, some de novo genes are retained and remain functional, even in distantly related species, through their integration into gene networks. Further studies combining gene expression with ribosome profiling in multiple populations across different species will be instrumental for an improved understanding of the evolutionary processes operating on de novo genes. Copyright © 2015 The Author. Published by Elsevier Ltd. All rights reserved.

  15. De novo transcriptome assembly of Setaria italica variety Taejin

    Directory of Open Access Journals (Sweden)

    Yeonhwa Jo

    2016-06-01

    Full Text Available Foxtail millet (Setaria italica), belonging to the family Poaceae, is an important millet that is widely cultivated in East Asia. Of the cultivated millets, foxtail millet has the longest history and is one of the main food crops in South India and China. Moreover, foxtail millet is a model plant system for biofuel generation utilizing the C4 photosynthetic pathway. In this study, we carried out de novo transcriptome assembly for the foxtail millet variety Taejin collected from Korea using next-generation sequencing. We obtained a total of 8.676 GB of raw data by paired-end sequencing. The raw data are available in the NCBI SRA database under accession number SRR3406552. The Trinity program was used to de novo assemble 145,332 transcripts. Using the TransDecoder program, we predicted 82,925 putative proteins. BLASTP searches were performed against the Swiss-Prot protein sequence database to annotate the functions of the identified proteins, resulting in 20,555 potentially novel proteins. Taken together, this study provides transcriptome data for the foxtail millet variety Taejin by RNA-Seq.

  16. Computational Prediction of Hot Spot Residues

    Science.gov (United States)

    Morrow, John Kenneth; Zhang, Shuxing

    2013-01-01

    Most biological processes involve multiple proteins interacting with each other. It has been recently discovered that certain residues in these protein-protein interactions, which are called hot spots, contribute more significantly to binding affinity than others. Hot spot residues have unique and diverse energetic properties that make them challenging yet important targets in the modulation of protein-protein complexes. Design of therapeutic agents that interact with hot spot residues has proven to be a valid methodology in disrupting unwanted protein-protein interactions. Using biological methods to determine which residues are hot spots can be costly and time consuming. Recent advances in computational approaches to predict hot spots have incorporated a myriad of features, and have shown increasing predictive successes. Here we review the state of knowledge around protein-protein interactions, hot spots, and give an overview of multiple in silico prediction techniques of hot spot residues. PMID:22316154

  17. Designing sgRNAs with CRISPy web

    DEFF Research Database (Denmark)

    Blin, Kai; Lee, Sang Yup; Weber, Tilmann

    2017-01-01

    Tilmann Weber’s group at the Novo Nordisk Foundation Center for Biosustainability developed a user-friendly, web server implementation of the sgRNA prediction software, CRISPy, for non-computer scientists.

  18. The origins of computer weather prediction and climate modeling

    International Nuclear Information System (INIS)

    Lynch, Peter

    2008-01-01

    Numerical simulation of an ever-increasing range of geophysical phenomena is adding enormously to our understanding of complex processes in the Earth system. The consequences for mankind of ongoing climate change will be far-reaching. Earth System Models are capable of replicating climate regimes of past millennia and are the best means we have of predicting the future of our climate. The basic ideas of numerical forecasting and climate modeling were developed about a century ago, long before the first electronic computer was constructed. There were several major practical obstacles to be overcome before numerical prediction could be put into practice. A fuller understanding of atmospheric dynamics allowed the development of simplified systems of equations; regular radiosonde observations of the free atmosphere and, later, satellite data, provided the initial conditions; stable finite difference schemes were developed; and powerful electronic computers provided a practical means of carrying out the prodigious calculations required to predict the changes in the weather. Progress in weather forecasting and in climate modeling over the past 50 years has been dramatic. In this presentation, we will trace the history of computer forecasting through the ENIAC integrations to the present day. The useful range of deterministic prediction is increasing by about one day each decade, and our understanding of climate change is growing rapidly as Earth System Models of ever-increasing sophistication are developed

  19. Verifying a computational method for predicting extreme ground motion

    Science.gov (United States)

    Harris, R.A.; Barall, M.; Andrews, D.J.; Duan, B.; Ma, S.; Dunham, E.M.; Gabriel, A.-A.; Kaneko, Y.; Kase, Y.; Aagaard, Brad T.; Oglesby, D.D.; Ampuero, J.-P.; Hanks, T.C.; Abrahamson, N.

    2011-01-01

    In situations where seismological data is rare or nonexistent, computer simulations may be used to predict ground motions caused by future earthquakes. This is particularly practical in the case of extreme ground motions, where engineers of special buildings may need to design for an event that has not been historically observed but which may occur in the far-distant future. Once the simulations have been performed, however, they still need to be tested. The SCEC-USGS dynamic rupture code verification exercise provides a testing mechanism for simulations that involve spontaneous earthquake rupture. We have performed this examination for the specific computer code that was used to predict maximum possible ground motion near Yucca Mountain. Our SCEC-USGS group exercises have demonstrated that the specific computer code that was used for the Yucca Mountain simulations produces similar results to those produced by other computer codes when tackling the same science problem. We also found that the 3D ground motion simulations produced smaller ground motions than the 2D simulations.

  20. Predicting Forearm Physical Exposures During Computer Work Using Self-Reports, Software-Recorded Computer Usage Patterns, and Anthropometric and Workstation Measurements.

    Science.gov (United States)

    Huysmans, Maaike A; Eijckelhof, Belinda H W; Garza, Jennifer L Bruno; Coenen, Pieter; Blatter, Birgitte M; Johnson, Peter W; van Dieën, Jaap H; van der Beek, Allard J; Dennerlein, Jack T

    2017-12-15

    Alternative techniques to assess physical exposures, such as prediction models, could facilitate more efficient epidemiological assessments in future large cohort studies examining physical exposures in relation to work-related musculoskeletal symptoms. The aim of this study was to evaluate two types of models that predict arm-wrist-hand physical exposures (i.e. muscle activity, wrist postures and kinematics, and keyboard and mouse forces) during computer use, which only differed with respect to the candidate predicting variables: (i) a full set of predicting variables, including self-reported factors, software-recorded computer usage patterns, and worksite measurements of anthropometrics and workstation set-up (full models); and (ii) a practical set of predicting variables, only including the self-reported factors and software-recorded computer usage patterns, that are relatively easy to assess (practical models). Prediction models were built using data from a field study among 117 office workers who were symptom-free at the time of measurement. Arm-wrist-hand physical exposures were measured for approximately two hours while workers performed their own computer work. Each worker's anthropometry and workstation set-up were measured by an experimenter, computer usage patterns were recorded using software, and self-reported factors (including individual factors, job characteristics, computer work behaviours, psychosocial factors, workstation set-up characteristics, and leisure-time activities) were collected by an online questionnaire. We determined the predictive quality of the models in terms of R2 and root mean squared (RMS) values and exposure classification agreement to low-, medium-, and high-exposure categories (in the practical model only). The full models had R2 values that ranged from 0.16 to 0.80, whereas for the practical models values ranged from 0.05 to 0.43. Interquartile ranges were not that different for the two models, indicating that only for some

  1. Computational prediction of protein-protein interactions in Leishmania predicted proteomes.

    Directory of Open Access Journals (Sweden)

    Antonio M Rezende

    Full Text Available The trypanosomatid parasites Leishmania braziliensis, Leishmania major and Leishmania infantum are important human pathogens. Despite years of study and genome availability, an effective vaccine has not been developed yet, and the available chemotherapy is highly toxic. Therefore, it is clear that only interdisciplinary, integrated studies will succeed in the search for new targets for the development of vaccines and drugs. An essential part of this rationale is the study of protein-protein interaction (PPI) networks, which can provide a better understanding of complex protein interactions in biological systems. Thus, we modeled PPIs for these trypanosomatids through computational methods, using sequence comparison against public databases of protein or domain interactions for interaction prediction (Interolog Mapping), and developed a dedicated combined system score to address the robustness of the predictions. The confidence of the network prediction approach was evaluated using gold-standard positive and negative datasets, and the AUC value obtained was 0.94. As a result, 39,420, 43,531 and 45,235 interactions were predicted for L. braziliensis, L. major and L. infantum, respectively. For each predicted network, the top 20 proteins were ranked by the MCC topological index. In addition, information related to immunological potential, the degree of protein sequence conservation among orthologs, and the degree of identity compared to proteins of potential parasite hosts was integrated. This integration provides a better understanding of the predicted networks and makes them more useful for selecting new potential biological targets for drug and vaccine development. Network modularity, which is key when one is interested in destabilizing PPIs for drug or vaccine purposes, was analyzed along with multiple alignments of the predicted PPIs, revealing patterns associated with protein turnover. In addition, around 50% of the hypothetical proteins present in the networks
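
    As a rough illustration of the Interolog Mapping idea described above, the sketch below transfers known template interactions to query proteins via their homologs and multiplies the similarity scores into a single confidence value. All identifiers, scores and the score-combination rule are invented for illustration and are not the scoring scheme used by the authors.

```python
# Schematic interolog mapping, assuming each query protein has already been
# mapped (e.g. by BLAST) to homologs in a public interaction database.
known_interactions = {            # template PPIs from a public database
    ("P12345", "P67890"): 0.9,
    ("P12345", "P11111"): 0.6,
}
homologs = {                      # query protein -> (template protein, similarity)
    "LbrM.01.0010": ("P12345", 0.82),
    "LbrM.02.0200": ("P67890", 0.75),
    "LbrM.03.0300": ("P11111", 0.40),
}

def predict_interologs(homologs, known_interactions, min_score=0.3):
    """Transfer an interaction A'-B' to A-B when A' and B' are homologs of A and B."""
    predictions = {}
    items = list(homologs.items())
    for i, (qa, (ta, sa)) in enumerate(items):
        for qb, (tb, sb) in items[i + 1:]:
            template_score = known_interactions.get((ta, tb)) or known_interactions.get((tb, ta))
            if template_score is None:
                continue
            combined = template_score * sa * sb   # simple combined confidence score
            if combined >= min_score:
                predictions[(qa, qb)] = round(combined, 3)
    return predictions

print(predict_interologs(homologs, known_interactions))
```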

  2. In silico toxicology: computational methods for the prediction of chemical toxicity

    KAUST Repository

    Raies, Arwa B.; Bajic, Vladimir B.

    2016-01-01

    Determining the toxicity of chemicals is necessary to identify their harmful effects on humans, animals, plants, or the environment. It is also one of the main steps in drug design. Animal models have been used for a long time for toxicity testing. However, in vivo animal tests are constrained by time, ethical considerations, and financial burden. Therefore, computational methods for estimating the toxicity of chemicals are considered useful. In silico toxicology is one type of toxicity assessment that uses computational methods to analyze, simulate, visualize, or predict the toxicity of chemicals. In silico toxicology aims to complement existing toxicity tests to predict toxicity, prioritize chemicals, guide toxicity tests, and minimize late-stage failures in drugs design. There are various methods for generating models to predict toxicity endpoints. We provide a comprehensive overview, explain, and compare the strengths and weaknesses of the existing modeling methods and algorithms for toxicity prediction with a particular (but not exclusive) emphasis on computational tools that can implement these methods and refer to expert systems that deploy the prediction models. Finally, we briefly review a number of new research directions in in silico toxicology and provide recommendations for designing in silico models.

  3. In silico toxicology: computational methods for the prediction of chemical toxicity

    KAUST Repository

    Raies, Arwa B.

    2016-01-06

    Determining the toxicity of chemicals is necessary to identify their harmful effects on humans, animals, plants, or the environment. It is also one of the main steps in drug design. Animal models have been used for a long time for toxicity testing. However, in vivo animal tests are constrained by time, ethical considerations, and financial burden. Therefore, computational methods for estimating the toxicity of chemicals are considered useful. In silico toxicology is one type of toxicity assessment that uses computational methods to analyze, simulate, visualize, or predict the toxicity of chemicals. In silico toxicology aims to complement existing toxicity tests to predict toxicity, prioritize chemicals, guide toxicity tests, and minimize late-stage failures in drugs design. There are various methods for generating models to predict toxicity endpoints. We provide a comprehensive overview, explain, and compare the strengths and weaknesses of the existing modeling methods and algorithms for toxicity prediction with a particular (but not exclusive) emphasis on computational tools that can implement these methods and refer to expert systems that deploy the prediction models. Finally, we briefly review a number of new research directions in in silico toxicology and provide recommendations for designing in silico models.

  4. A Computational Model Predicting Disruption of Blood Vessel Development

    Science.gov (United States)

    Vascular development is a complex process regulated by dynamic biological networks that vary in topology and state across different tissues and developmental stages. Signals regulating de novo blood vessel formation (vasculogenesis) and remodeling (angiogenesis) come from a varie...

  5. Genome sequencing of bacteria: sequencing, de novo assembly and rapid analysis using open source tools.

    Science.gov (United States)

    Kisand, Veljo; Lettieri, Teresa

    2013-04-01

    De novo genome sequencing of previously uncharacterized microorganisms has the potential to open up new frontiers in microbial genomics by providing insight into both functional capabilities and biodiversity. Until recently, Roche 454 pyrosequencing was the NGS method of choice for de novo assembly because it generates hundreds of thousands of long reads (tools for processing NGS data are increasingly free and open source and are often adopted for both their high quality and role in promoting academic freedom. The error rate of pyrosequencing the Alcanivorax borkumensis genome was such that thousands of insertions and deletions were artificially introduced into the finished genome. Despite a high coverage (~30 fold), it did not allow the reference genome to be fully mapped. Reads from regions with errors had low quality, low coverage, or were missing. The main defect of the reference mapping was the introduction of artificial indels into contigs through lower than 100% consensus and distracting gene calling due to artificial stop codons. No assembler was able to perform de novo assembly comparable to reference mapping. Automated annotation tools performed similarly on reference mapped and de novo draft genomes, and annotated most CDSs in the de novo assembled draft genomes. Free and open source software (FOSS) tools for assembly and annotation of NGS data are being developed rapidly to provide accurate results with less computational effort. Usability is not high priority and these tools currently do not allow the data to be processed without manual intervention. Despite this, genome assemblers now readily assemble medium short reads into long contigs (>97-98% genome coverage). A notable gap in pyrosequencing technology is the quality of base pair calling and conflicting base pairs between single reads at the same nucleotide position. Regardless, using draft whole genomes that are not finished and remain fragmented into tens of contigs allows one to characterize

  6. Microarray-based cancer prediction using soft computing approach.

    Science.gov (United States)

    Wang, Xiaosheng; Gotoh, Osamu

    2009-05-26

    One of the difficulties in using gene expression profiles to predict cancer is how to effectively select a few informative genes to construct accurate prediction models from thousands or tens of thousands of genes. We screen highly discriminative genes and gene pairs to create simple prediction models involving single genes or gene pairs on the basis of a soft computing approach and rough set theory. Accurate cancer prediction is obtained when we apply the simple prediction models to four cancer gene expression datasets: CNS tumor, colon tumor, lung cancer and DLBCL. Some genes closely correlated with the pathogenesis of specific or general cancers are identified. In contrast with other models, our models are simple, effective and robust. Meanwhile, our models are interpretable because they are based on decision rules. Our results demonstrate that very simple models may perform well in molecular prediction of cancer and that important gene markers of cancer can be detected if the gene selection approach is chosen reasonably.
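
    To make the notion of a "simple prediction model" concrete, the sketch below ranks genes by a signal-to-noise-style score and classifies samples with a one-gene threshold rule. It is a generic stand-in rather than the authors' rough-set procedure, and the expression matrix is synthetic.

```python
# Minimal gene-selection + single-gene decision-rule sketch on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 500))        # 60 samples x 500 genes
y = np.array([0] * 30 + [1] * 30)     # 0 = normal, 1 = tumour
X[y == 1, 42] += 2.0                  # make gene 42 informative on purpose

def gene_scores(X, y):
    # signal-to-noise style score, as used in many gene-selection schemes
    m0, m1 = X[y == 0].mean(0), X[y == 1].mean(0)
    s0, s1 = X[y == 0].std(0), X[y == 1].std(0)
    return np.abs(m1 - m0) / (s0 + s1 + 1e-9)

best = int(np.argmax(gene_scores(X, y)))
threshold = (X[y == 0, best].mean() + X[y == 1, best].mean()) / 2

def predict(sample):
    """Decision rule: call 'tumour' when the marker gene exceeds the threshold."""
    return int(sample[best] > threshold)

accuracy = np.mean([predict(x) == label for x, label in zip(X, y)])
print(f"selected gene: {best}, training accuracy: {accuracy:.2f}")
```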

  7. Computer Prediction of Air Quality in Livestock Buildings

    DEFF Research Database (Denmark)

    Svidt, Kjeld; Bjerg, Bjarne

    In modern livestock buildings the design of ventilation systems is important in order to obtain good air quality. The use of Computational Fluid Dynamics for predicting the air distribution makes it possible to include the effect of room geometry and heat sources in the design process. This paper presents numerical predictions of air flow in a livestock building compared with laboratory measurements. An example of the calculation of contaminant distribution is given, and the future possibilities of the method are discussed.

  8. Free energy minimization to predict RNA secondary structures and computational RNA design.

    Science.gov (United States)

    Churkin, Alexander; Weinbrand, Lina; Barash, Danny

    2015-01-01

    Determining the RNA secondary structure from sequence data by computational predictions is a long-standing problem. Its solution has been approached in two distinctive ways. If a multiple sequence alignment of a collection of homologous sequences is available, the comparative method uses phylogeny to determine conserved base pairs that are more likely to form as a result of billions of years of evolution than by chance. In the case of single sequences, recursive algorithms that compute free energy structures by using empirically derived energy parameters have been developed. This latter approach of RNA folding prediction by energy minimization is widely used to predict RNA secondary structure from sequence. For a significant number of RNA molecules, the secondary structure of the RNA molecule is indicative of its function and its computational prediction by minimizing its free energy is important for its functional analysis. A general method for free energy minimization to predict RNA secondary structures is dynamic programming, although other optimization methods have been developed as well along with empirically derived energy parameters. In this chapter, we introduce and illustrate by examples the approach of free energy minimization to predict RNA secondary structures.
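
    The recursive structure behind free-energy minimization can be illustrated with the classic base-pair-maximization recursion, a deliberately simplified stand-in that keeps the dynamic-programming form but ignores the empirical energy parameters. The sequence and the minimum loop length below are arbitrary toy choices.

```python
# Compact Nussinov-style dynamic programming: maximise the number of canonical
# base pairs instead of minimising a full empirical free-energy model.
def nussinov_pairs(seq, min_loop=3):
    pair = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"), ("G", "U"), ("U", "G")}
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for span in range(min_loop + 1, n):
        for i in range(n - span):
            j = i + span
            best = dp[i + 1][j]                       # case: base i stays unpaired
            for k in range(i + min_loop + 1, j + 1):  # case: base i pairs with base k
                if (seq[i], seq[k]) in pair:
                    left = dp[i + 1][k - 1]
                    right = dp[k + 1][j] if k + 1 <= j else 0
                    best = max(best, 1 + left + right)
            dp[i][j] = best
    return dp[0][n - 1]

print(nussinov_pairs("GGGAAAUCC"))  # small hairpin-like example
```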

  9. DEEP: a general computational framework for predicting enhancers

    KAUST Repository

    Kleftogiannis, Dimitrios A.

    2014-11-05

    Transcription regulation in multicellular eukaryotes is orchestrated by a number of DNA functional elements located at gene regulatory regions. Some regulatory regions (e.g. enhancers) are located far away from the gene they affect. Identification of distal regulatory elements is a challenge for the bioinformatics research. Although existing methodologies increased the number of computationally predicted enhancers, performance inconsistency of computational models across different cell-lines, class imbalance within the learning sets and ad hoc rules for selecting enhancer candidates for supervised learning, are some key questions that require further examination. In this study we developed DEEP, a novel ensemble prediction framework. DEEP integrates three components with diverse characteristics that streamline the analysis of enhancer's properties in a great variety of cellular conditions. In our method we train many individual classification models that we combine to classify DNA regions as enhancers or non-enhancers. DEEP uses features derived from histone modification marks or attributes coming from sequence characteristics. Experimental results indicate that DEEP performs better than four state-of-the-art methods on the ENCODE data. We report the first computational enhancer prediction results on FANTOM5 data where DEEP achieves 90.2% accuracy and 90% geometric mean (GM) of specificity and sensitivity across 36 different tissues. We further present results derived using in vivo-derived enhancer data from VISTA database. DEEP-VISTA, when tested on an independent test set, achieved GM of 80.1% and accuracy of 89.64%. DEEP framework is publicly available at http://cbrc.kaust.edu.sa/deep/.

  10. DEEP: a general computational framework for predicting enhancers

    KAUST Repository

    Kleftogiannis, Dimitrios A.; Kalnis, Panos; Bajic, Vladimir B.

    2014-01-01

    Transcription regulation in multicellular eukaryotes is orchestrated by a number of DNA functional elements located at gene regulatory regions. Some regulatory regions (e.g. enhancers) are located far away from the gene they affect. Identification of distal regulatory elements is a challenge for the bioinformatics research. Although existing methodologies increased the number of computationally predicted enhancers, performance inconsistency of computational models across different cell-lines, class imbalance within the learning sets and ad hoc rules for selecting enhancer candidates for supervised learning, are some key questions that require further examination. In this study we developed DEEP, a novel ensemble prediction framework. DEEP integrates three components with diverse characteristics that streamline the analysis of enhancer's properties in a great variety of cellular conditions. In our method we train many individual classification models that we combine to classify DNA regions as enhancers or non-enhancers. DEEP uses features derived from histone modification marks or attributes coming from sequence characteristics. Experimental results indicate that DEEP performs better than four state-of-the-art methods on the ENCODE data. We report the first computational enhancer prediction results on FANTOM5 data where DEEP achieves 90.2% accuracy and 90% geometric mean (GM) of specificity and sensitivity across 36 different tissues. We further present results derived using in vivo-derived enhancer data from VISTA database. DEEP-VISTA, when tested on an independent test set, achieved GM of 80.1% and accuracy of 89.64%. DEEP framework is publicly available at http://cbrc.kaust.edu.sa/deep/.
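
    The core ensemble idea, training several base classifiers on feature vectors such as histone-modification signals and combining their votes, can be sketched as below. This is a generic soft-voting ensemble on synthetic data, not the actual DEEP architecture, feature set, or training protocol.

```python
# Hedged sketch of ensemble enhancer classification on synthetic features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 12))               # 12 hypothetical histone-mark features
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
        ("lr", LogisticRegression(max_iter=1000)),
        ("svm", SVC(probability=True, random_state=0)),
    ],
    voting="soft",                             # average the predicted probabilities
)
ensemble.fit(X_tr, y_tr)
print("held-out accuracy:", round(ensemble.score(X_te, y_te), 3))
```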

  11. Massively parallel de novo protein design for targeted therapeutics

    KAUST Repository

    Chevalier, Aaron

    2017-09-26

    De novo protein design holds promise for creating small stable proteins with shapes customized to bind therapeutic targets. We describe a massively parallel approach for designing, manufacturing and screening mini-protein binders, integrating large-scale computational design, oligonucleotide synthesis, yeast display screening and next-generation sequencing. We designed and tested 22,660 mini-proteins of 37-43 residues that target influenza haemagglutinin and botulinum neurotoxin B, along with 6,286 control sequences to probe contributions to folding and binding, and identified 2,618 high-affinity binders. Comparison of the binding and non-binding design sets, which are two orders of magnitude larger than any previously investigated, enabled the evaluation and improvement of the computational model. Biophysical characterization of a subset of the binder designs showed that they are extremely stable and, unlike antibodies, do not lose activity after exposure to high temperatures. The designs elicit little or no immune response and provide potent prophylactic and therapeutic protection against influenza, even after extensive repeated dosing.

  12. Massively parallel de novo protein design for targeted therapeutics

    KAUST Repository

    Chevalier, Aaron; Silva, Daniel-Adriano; Rocklin, Gabriel J.; Hicks, Derrick R.; Vergara, Renan; Murapa, Patience; Bernard, Steffen M.; Zhang, Lu; Lam, Kwok-Ho; Yao, Guorui; Bahl, Christopher D.; Miyashita, Shin-Ichiro; Goreshnik, Inna; Fuller, James T.; Koday, Merika T.; Jenkins, Cody M.; Colvin, Tom; Carter, Lauren; Bohn, Alan; Bryan, Cassie M.; Fernández-Velasco, D. Alejandro; Stewart, Lance; Dong, Min; Huang, Xuhui; Jin, Rongsheng; Wilson, Ian A.; Fuller, Deborah H.; Baker, David

    2017-01-01

    De novo protein design holds promise for creating small stable proteins with shapes customized to bind therapeutic targets. We describe a massively parallel approach for designing, manufacturing and screening mini-protein binders, integrating large-scale computational design, oligonucleotide synthesis, yeast display screening and next-generation sequencing. We designed and tested 22,660 mini-proteins of 37-43 residues that target influenza haemagglutinin and botulinum neurotoxin B, along with 6,286 control sequences to probe contributions to folding and binding, and identified 2,618 high-affinity binders. Comparison of the binding and non-binding design sets, which are two orders of magnitude larger than any previously investigated, enabled the evaluation and improvement of the computational model. Biophysical characterization of a subset of the binder designs showed that they are extremely stable and, unlike antibodies, do not lose activity after exposure to high temperatures. The designs elicit little or no immune response and provide potent prophylactic and therapeutic protection against influenza, even after extensive repeated dosing.

  13. Massively parallel de novo protein design for targeted therapeutics

    Science.gov (United States)

    Chevalier, Aaron; Silva, Daniel-Adriano; Rocklin, Gabriel J.; Hicks, Derrick R.; Vergara, Renan; Murapa, Patience; Bernard, Steffen M.; Zhang, Lu; Lam, Kwok-Ho; Yao, Guorui; Bahl, Christopher D.; Miyashita, Shin-Ichiro; Goreshnik, Inna; Fuller, James T.; Koday, Merika T.; Jenkins, Cody M.; Colvin, Tom; Carter, Lauren; Bohn, Alan; Bryan, Cassie M.; Fernández-Velasco, D. Alejandro; Stewart, Lance; Dong, Min; Huang, Xuhui; Jin, Rongsheng; Wilson, Ian A.; Fuller, Deborah H.; Baker, David

    2018-01-01

    De novo protein design holds promise for creating small stable proteins with shapes customized to bind therapeutic targets. We describe a massively parallel approach for designing, manufacturing and screening mini-protein binders, integrating large-scale computational design, oligonucleotide synthesis, yeast display screening and next-generation sequencing. We designed and tested 22,660 mini-proteins of 37–43 residues that target influenza haemagglutinin and botulinum neurotoxin B, along with 6,286 control sequences to probe contributions to folding and binding, and identified 2,618 high-affinity binders. Comparison of the binding and non-binding design sets, which are two orders of magnitude larger than any previously investigated, enabled the evaluation and improvement of the computational model. Biophysical characterization of a subset of the binder designs showed that they are extremely stable and, unlike antibodies, do not lose activity after exposure to high temperatures. The designs elicit little or no immune response and provide potent prophylactic and therapeutic protection against influenza, even after extensive repeated dosing. PMID:28953867

  14. Genome-wide patterns and properties of de novo mutations in humans

    NARCIS (Netherlands)

    Francioli, Laurent C.; Polak, Paz P.; Koren, Amnon; Menelaou, Androniki; Chun, Sung; Renkens, Ivo; van Duijn, Cornelia M.; Swertz, Morris; Wijmenga, Cisca; van Ommen, Gertjan; Slagboom, P. Eline; Boomsma, Dorret I.; Ye, Kai; Guryev, Victor; Arndt, Peter F.; Kloosterman, Wigard P.; de Bakker, Paul I. W.; Sunyaev, Shamil R.

    Mutations create variation in the population, fuel evolution and cause genetic diseases. Current knowledge about de novo mutations is incomplete and mostly indirect(1-10). Here we analyze 11,020 de novo mutations from the whole genomes of 250 families. We show that de novo mutations in the offspring

  15. Genome-wide patterns and properties of de novo mutations in humans

    NARCIS (Netherlands)

    Francioli, L.C.; Polak, P.P.; Koren, A.; Menelaou, A.; Chun, S.; Renkens, I.; van Duijn, C.M.; Swertz, M.A.; Wijmenga, C.; van Ommen, G.J.; Slagboom, P.E.; Boomsma, D.I.; Ye, K.; Guryev, V.; Arndt, P.F.; Kloosterman, W.P.; Bakker, P.I.W.; Sunyaev, S.R.; Dijk, F.; Neerincx, P.B.T.; Pulit, S.L.; Deelen, P.; Elbers, C.C.; Palamara, P.F.; Pe'er, I.; Abdellaoui, A.; van Oven, M.; Vermaat, M.; Li, M.; Laros, J.F.J.; Stoneking, M.; de Knijff, P.; Kayser, M.; Veldink, J.H.; Van den Berg, L.H.; Byelas, H.; den Dunnen, J.T.; Dijkstra, M.; Amin, N.; van der Velde, K.J.; Hottenga, J.J.; van Setten, J.; van Leeuwen, E.M.; Kanterakis, A.; Kattenberg, V.M.; Karssen, L.C.; van Schaik, B.D.C.; Bot, J.; Nijman, I.J.; van Enckevort, D.; Mei, H.; Koval, V.; Estrada, K.; Medina-Gomez, C.; Lameijer, E.W.; Moed, M.H.; Hehir-Kwa, J.Y.; Handsaker, R.E.; McCarroll, S.A.; Vuzman, D.; Sohail, M.; Hormozdiari, F.; Marschall, T.; Schönhuth, A.; Beekman, M.; de Craen, A.J.; Suchiman, H.E.D.; Hofman, A.; Oostra, B.; Isaacs, A.; Rivadeneira, F.; Uitterlinden, A.G.; Willemsen, G.; Platteel, M.; Pitts, S.J.; Potluri, S.; Sundar, P.; Cox, D.R.; Li, Q.; Li, Y.; Du, Y.; Chen, R.; Cao, H.; Li, N.; Cao, S.; Wang, J.; Bovenberg, J.A.; Brandsma, M.

    2015-01-01

    Mutations create variation in the population, fuel evolution and cause genetic diseases. Current knowledge about de novo mutations is incomplete and mostly indirect. Here we analyze 11,020 de novo mutations from the whole genomes of 250 families. We show that de novo mutations in the offspring of

  16. ScanIndel: a hybrid framework for indel detection via gapped alignment, split reads and de novo assembly.

    Science.gov (United States)

    Yang, Rendong; Nelson, Andrew C; Henzler, Christine; Thyagarajan, Bharat; Silverstein, Kevin A T

    2015-12-07

    Comprehensive identification of insertions/deletions (indels) across the full size spectrum from second generation sequencing is challenging due to the relatively short read length inherent in the technology. Different indel calling methods exist but are limited in detection to specific sizes with varying accuracy and resolution. We present ScanIndel, an integrated framework for detecting indels with multiple heuristics including gapped alignment, split reads and de novo assembly. Using simulation data, we demonstrate ScanIndel's superior sensitivity and specificity relative to several state-of-the-art indel callers across various coverage levels and indel sizes. ScanIndel yields higher predictive accuracy with lower computational cost compared with existing tools for both targeted resequencing data from tumor specimens and high coverage whole-genome sequencing data from the human NIST standard NA12878. Thus, we anticipate ScanIndel will improve indel analysis in both clinical and research settings. ScanIndel is implemented in Python, and is freely available for academic use at https://github.com/cauyrd/ScanIndel.

  17. Clinicopathologic factors associated with de novo metastatic breast cancer.

    Science.gov (United States)

    Shen, Tiansheng; Siegal, Gene P; Wei, Shi

    2016-12-01

    While breast cancers with distant metastasis at presentation (de novo metastasis) harbor significantly inferior clinical outcomes, there have been limited studies analyzing the clinicopathologic characteristics in this subset of patients. In this study, we analyzed 6126 breast cancers diagnosed between 1998 and 2013 to identify factors associated with de novo metastatic breast cancer. When compared to patients without metastasis at presentation, race, histologic grade, estrogen/progesterone receptor (ER/PR) and HER2 statuses were significantly associated with de novo metastasis in the entire cohort, whereas age, histologic grade, PR and HER2 status were the significant parameters in the subset of patients with locally advanced breast cancer (Stage IIB/III). The patients with de novo metastatic breast cancer had a significantly older mean age and a lower proportion of HER2-positive tumors when compared to those with metastatic recurrence. Further, the HER2-rich subtype demonstrated a drastically higher incidence of de novo metastasis when compared to the luminal and triple-negative breast cancers in the entire cohort [odds ratio (OR)=5.68 and 2.27, respectively] and in the patients with locally advanced disease (OR=4.02 and 2.12, respectively), whereas no significant difference was seen between de novo metastatic cancers and those with metastatic recurrence. Moreover, the luminal and HER2-rich subtypes showed bone-seeking (OR=1.92) and liver-homing (OR=2.99) characteristics, respectively, for the sites of de novo metastasis, while the latter was not observed in those with metastatic recurrence. Our data suggest that an algorithm incorporating clinicopathologic factors, especially histologic grade and receptor profile, remains of significant benefit during decision making in newly diagnosed breast cancer in the pursuit of precision medicine. Copyright © 2016 Elsevier GmbH. All rights reserved.

  18. Adopting De Novo Programming Approach on IC Design Service Firms Resources Integration

    Directory of Open Access Journals (Sweden)

    James K. C. Chen

    2014-01-01

    Full Text Available The semiconductor industry occupies a very important position in the computer industry, the ICT field, and the development of new electronic technology. IC design service is one of the key factors in semiconductor industry development. More than 365 IC design service firms have been established around the Hsinchu Science Park in Taiwan. Building an efficient planning model for integrating the resources of IC design service firms is therefore an interesting issue. This study aims to construct a planning model for IC design service firms to implement resource integration. The De Novo programming approach is used as a criteria-alternative method to achieve optimal resource allocation in an IC design firm. Results show that an IC design service firm should adopt the open innovation concept and utilize design outsourcing to reduce costs and enhance its business performance. This De Novo programming planning model is not only applicable to IC design service firms but can also be applied to strategic alliances and resource integration in other industries; it is a universal model for other industrial fields.
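
    The resource-integration idea behind De Novo programming, choosing resource levels within a total budget rather than optimizing against fixed resource amounts, can be illustrated with a tiny linear program. The services, resource requirements, prices and budget below are invented, and the single-objective formulation is a simplification of the multi-criteria setting discussed in the paper.

```python
# Hedged numeric sketch of a De Novo programming style formulation.
import numpy as np
from scipy.optimize import linprog

profit = np.array([120.0, 90.0])            # profit per unit of each design service
A = np.array([[4.0, 2.0],                   # engineer-hours per unit
              [1.0, 3.0],                   # EDA-licence-hours per unit
              [2.0, 2.0]])                  # verification-hours per unit
price = np.array([50.0, 30.0, 40.0])        # unit cost of each resource
budget = 10000.0

# De Novo formulation: max profit.x  s.t.  price.(A x) <= budget, x >= 0
res = linprog(c=-profit, A_ub=[price @ A], b_ub=[budget], bounds=[(0, None)] * 2)
x = res.x
print("optimal service mix:", np.round(x, 2))
print("resources to acquire:", np.round(A @ x, 2))
print("total profit:", round(float(profit @ x), 2))
```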

  19. Foldability of a Natural De Novo Evolved Protein.

    Science.gov (United States)

    Bungard, Dixie; Copple, Jacob S; Yan, Jing; Chhun, Jimmy J; Kumirov, Vlad K; Foy, Scott G; Masel, Joanna; Wysocki, Vicki H; Cordes, Matthew H J

    2017-11-07

    The de novo evolution of protein-coding genes from noncoding DNA is emerging as a source of molecular innovation in biology. Studies of random sequence libraries, however, suggest that young de novo proteins will not fold into compact, specific structures typical of native globular proteins. Here we show that Bsc4, a functional, natural de novo protein encoded by a gene that evolved recently from noncoding DNA in the yeast S. cerevisiae, folds to a partially specific three-dimensional structure. Bsc4 forms soluble, compact oligomers with high β sheet content and a hydrophobic core, and undergoes cooperative, reversible denaturation. Bsc4 lacks a specific quaternary state, however, existing instead as a continuous distribution of oligomer sizes, and binds dyes indicative of amyloid oligomers or molten globules. The combination of native-like and non-native-like properties suggests a rudimentary fold that could potentially act as a functional intermediate in the emergence of new folded proteins de novo. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Particulated articular cartilage: CAIS and DeNovo NT.

    Science.gov (United States)

    Farr, Jack; Cole, Brian J; Sherman, Seth; Karas, Vasili

    2012-03-01

    Cartilage Autograft Implantation System (CAIS; DePuy/Mitek, Raynham, MA) and DeNovo Natural Tissue (NT; ISTO, St. Louis, MO) are novel treatment options for focal articular cartilage defects in the knee. These methods involve the implantation of particulated articular cartilage from either autograft or juvenile allograft donor, respectively. In the laboratory and in animal models, both CAIS and DeNovo NT have demonstrated the ability of the transplanted cartilage cells to "escape" from the extracellular matrix, migrate, multiply, and form a new hyaline-like cartilage tissue matrix that integrates with the surrounding host tissue. In clinical practice, the technique for both CAIS and DeNovo NT is straightforward, requiring only a single surgery to affect cartilage repair. Clinical experience is limited, with short-term studies demonstrating both procedures to be safe, feasible, and effective, with improvements in subjective patient scores, and with magnetic resonance imaging evidence of good defect fill. While these treatment options appear promising, prospective randomized controlled studies are necessary to refine the indications and contraindications for both CAIS and DeNovo NT.

  1. Thermal sensation prediction by soft computing methodology.

    Science.gov (United States)

    Jović, Srđan; Arsić, Nebojša; Vilimonović, Jovana; Petković, Dalibor

    2016-12-01

    Thermal comfort in open urban areas is a very important factor from an environmental point of view. There is therefore a need to fulfill demands for suitable thermal comfort during urban planning and design. Thermal comfort can be modeled based on climatic parameters and other factors. These factors are variables that change throughout the year and the day, so there is a need to establish an algorithm for thermal comfort prediction according to the input variables. The prediction results could be used for planning the time of usage of urban areas. Since this is a highly nonlinear task, a soft computing methodology was applied in this investigation in order to predict thermal comfort. The main goal was to apply an extreme learning machine (ELM) for forecasting physiological equivalent temperature (PET) values. Temperature, pressure, wind speed and irradiance were used as inputs. The prediction results are compared with some benchmark models. Based on the results, ELM can be used effectively in forecasting PET. Copyright © 2016 Elsevier Ltd. All rights reserved.
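
    A minimal sketch of the extreme learning machine idea used above: the hidden-layer weights are random and fixed, and only the linear readout is fitted by least squares. The synthetic inputs stand in for temperature, pressure, wind speed and irradiance, and the network size is an arbitrary choice rather than the configuration used by the authors.

```python
# Extreme learning machine (ELM) regression sketch on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))                          # 4 meteorological inputs
y = 20 + 3 * X[:, 0] - 2 * X[:, 2] + rng.normal(scale=0.5, size=500)

def elm_fit(X, y, n_hidden=50, seed=0):
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))        # random input weights (fixed)
    b = rng.normal(size=n_hidden)                      # random biases (fixed)
    H = np.tanh(X @ W + b)                             # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)       # output weights in one shot
    return W, b, beta

def elm_predict(X, model):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

model = elm_fit(X, y)
pred = elm_predict(X, model)
print("training RMSE:", round(float(np.sqrt(np.mean((pred - y) ** 2))), 3))
```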

  2. A survey on computational intelligence approaches for predictive modeling in prostate cancer

    OpenAIRE

    Cosma, G; Brown, D; Archer, M; Khan, M; Pockley, AG

    2017-01-01

    Predictive modeling in medicine involves the development of computational models which are capable of analysing large amounts of data in order to predict healthcare outcomes for individual patients. Computational intelligence approaches are suitable when the data to be modelled are too complex for conventional statistical techniques to process quickly and efficiently. These advanced approaches are based on mathematical models that have been especially developed for dealing with the uncertainty an...

  3. Computational prediction of protein hot spot residues.

    Science.gov (United States)

    Morrow, John Kenneth; Zhang, Shuxing

    2012-01-01

    Most biological processes involve multiple proteins interacting with each other. It has been recently discovered that certain residues in these protein-protein interactions, which are called hot spots, contribute more significantly to binding affinity than others. Hot spot residues have unique and diverse energetic properties that make them challenging yet important targets in the modulation of protein-protein complexes. Design of therapeutic agents that interact with hot spot residues has proven to be a valid methodology in disrupting unwanted protein-protein interactions. Using biological methods to determine which residues are hot spots can be costly and time consuming. Recent advances in computational approaches to predict hot spots have incorporated a myriad of features, and have shown increasing predictive successes. Here we review the state of knowledge around protein-protein interactions, hot spots, and give an overview of multiple in silico prediction techniques of hot spot residues.

  4. Prediction of intestinal absorption and blood-brain barrier penetration by computational methods.

    Science.gov (United States)

    Clark, D E

    2001-09-01

    This review surveys the computational methods that have been developed with the aim of identifying drug candidates likely to fail later on the road to market. The specifications for such computational methods are outlined, including factors such as speed, interpretability, robustness and accuracy. Then, computational filters aimed at predicting "drug-likeness" in a general sense are discussed before methods for the prediction of more specific properties--intestinal absorption and blood-brain barrier penetration--are reviewed. Directions for future research are discussed and, in concluding, the impact of these methods on the drug discovery process, both now and in the future, is briefly considered.
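
    As a concrete example of the fast "drug-likeness" filters surveyed above, the sketch below applies Lipinski's rule of five to precomputed descriptor values. The specific compound values are invented, and a real pipeline would compute the descriptors from structures rather than hard-code them.

```python
# Rule-of-five style drug-likeness filter on precomputed molecular descriptors.
def passes_rule_of_five(mol_weight, logp, h_donors, h_acceptors, max_violations=1):
    violations = sum([
        mol_weight > 500,     # molecular weight above 500 Da
        logp > 5,             # calculated logP above 5
        h_donors > 5,         # more than 5 hydrogen-bond donors
        h_acceptors > 10,     # more than 10 hydrogen-bond acceptors
    ])
    return violations <= max_violations

# hypothetical candidate compound
print(passes_rule_of_five(mol_weight=421.4, logp=3.2, h_donors=2, h_acceptors=6))
```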

  5. Towards pattern generation and chaotic series prediction with photonic reservoir computers

    Science.gov (United States)

    Antonik, Piotr; Hermans, Michiel; Duport, François; Haelterman, Marc; Massar, Serge

    2016-03-01

    Reservoir Computing is a bio-inspired computing paradigm for processing time dependent signals that is particularly well suited for analog implementations. Our team has demonstrated several photonic reservoir computers with performance comparable to digital algorithms on a series of benchmark tasks such as channel equalisation and speech recognition. Recently, we showed that our opto-electronic reservoir computer could be trained online with a simple gradient descent algorithm programmed on an FPGA chip. This setup makes it in principle possible to feed the output signal back into the reservoir, and thus greatly enrich the dynamics of the system. This will make it possible to tackle complex prediction tasks in hardware, such as pattern generation and chaotic and financial series prediction, which have so far only been studied in digital implementations. Here we report simulation results of our opto-electronic setup with an FPGA chip and output feedback applied to pattern generation and Mackey-Glass chaotic series prediction. The simulations take into account the major aspects of our experimental setup. We find that pattern generation can be easily implemented on the current setup with very good results. The Mackey-Glass series prediction task is more complex and requires a large reservoir and a more elaborate training algorithm. With these adjustments promising results are obtained, and we now know what improvements are needed to match previously reported numerical results. These simulation results will serve as a basis of comparison for experiments we will carry out in the coming months.
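
    The pattern-generation setup with output feedback can be sketched as a small software echo state network: the reservoir is driven by the teacher signal during training, a linear readout is fitted, and the readout is then fed back so the system runs autonomously. A sine wave stands in for the patterns or Mackey-Glass series, and all sizes and scalings are illustrative, not the parameters of the opto-electronic hardware.

```python
# Toy echo-state-network sketch with output feedback (software stand-in).
import numpy as np

rng = np.random.default_rng(0)
N, T_train, T_free = 200, 1000, 200
target = np.sin(np.arange(T_train + T_free + 1) * 0.1)    # toy pattern to (re)generate

W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))            # keep spectral radius below 1
W_fb = rng.uniform(-1, 1, N)                               # output-feedback weights

# teacher forcing: drive the reservoir with the true signal fed back as "output"
x = np.zeros(N)
states = []
for t in range(T_train):
    x = np.tanh(W @ x + W_fb * target[t])
    states.append(x.copy())
S = np.array(states[100:])                                 # drop the initial transient
Y = target[101:T_train + 1]                                # next-step targets
W_out = np.linalg.solve(S.T @ S + 1e-6 * np.eye(N), S.T @ Y)   # ridge-regularised readout

# free-running mode: the readout is fed back instead of the teacher signal
y = target[T_train]
preds = []
for _ in range(T_free):
    x = np.tanh(W @ x + W_fb * y)
    y = x @ W_out
    preds.append(y)
truth = target[T_train + 1:T_train + 1 + T_free]
print("free-running RMSE:", round(float(np.sqrt(np.mean((np.array(preds) - truth) ** 2))), 4))
```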

  6. SOFT COMPUTING SINGLE HIDDEN LAYER MODELS FOR SHELF LIFE PREDICTION OF BURFI

    Directory of Open Access Journals (Sweden)

    Sumit Goyal

    2012-05-01

    Full Text Available Burfi is an extremely popular sweetmeat, which is prepared by desiccating standardized water buffalo milk. Soft computing feedforward single-layer models were developed for predicting the shelf life of burfi stored at 30°C. The data of the product relating to moisture, titratable acidity, free fatty acids, tyrosine, and peroxide value were used as input variables, and the overall acceptability score as the output variable. The results showed excellent agreement between the experimental and the predicted data, suggesting that the developed soft computing model can alternatively be used for predicting the shelf life of burfi.

  7. Predictive Control of Networked Multiagent Systems via Cloud Computing.

    Science.gov (United States)

    Liu, Guo-Ping

    2017-01-18

    This paper studies the design and analysis of networked multiagent predictive control systems via cloud computing. A cloud predictive control scheme for networked multiagent systems (NMASs) is proposed to achieve consensus and stability simultaneously and to compensate for network delays actively. The design of the cloud predictive controller for NMASs is detailed. The analysis of the cloud predictive control scheme gives the necessary and sufficient conditions of stability and consensus of closed-loop networked multiagent control systems. The proposed scheme is verified to characterize the dynamical behavior and control performance of NMASs through simulations. The outcome provides a foundation for the development of cooperative and coordinative control of NMASs and its applications.

  8. Self-learning computers for surgical planning and prediction of postoperative alignment.

    Science.gov (United States)

    Lafage, Renaud; Pesenti, Sébastien; Lafage, Virginie; Schwab, Frank J

    2018-02-01

    In past decades, the role of sagittal alignment has been widely demonstrated in the setting of spinal conditions. As several parameters can be affected, identifying the driver of the deformity is the cornerstone of a successful treatment approach. Despite the importance of restoring sagittal alignment for optimizing outcome, this task remains challenging. Self-learning computers and optimized algorithms are of great interest in spine surgery in that they facilitate better planning and prediction of postoperative alignment. Nowadays, computer-assisted tools are part of surgeons' daily practice; however, the use of such tools remains time-consuming. NARRATIVE REVIEW AND RESULTS: Computer-assisted methods for the prediction of postoperative alignment consist of a three-step analysis: identification of anatomical landmarks, definition of alignment objectives, and simulation of surgery. Recently, complex rules for the prediction of alignment have been proposed. Even though this kind of work leads to more personalized objectives, the number of parameters involved renders it difficult for clinical use, stressing the importance of developing computer-assisted tools. The evolution of our current technology, including machine learning and other types of advanced algorithms, will provide powerful tools that could be useful in improving surgical outcomes and alignment prediction. These tools can combine different types of advanced technologies, such as image recognition and shape modeling, and using this technique, computer-assisted methods are able to predict spinal shape. The development of powerful computer-assisted methods involves the integration of several sources of information such as radiographic parameters (X-rays, MRI, CT scan, etc.), demographic information, and unusual non-osseous parameters (muscle quality, proprioception, gait analysis data). In using a larger set of data, these methods will aim to mimic what is actually done by spine surgeons, leading

  9. Predictive modelling-based design and experiments for synthesis and spinning of bioinspired silk fibres

    Science.gov (United States)

    Gronau, Greta; Jacobsen, Matthew M.; Huang, Wenwen; Rizzo, Daniel J.; Li, David; Staii, Cristian; Pugno, Nicola M.; Wong, Joyce Y.; Kaplan, David L.; Buehler, Markus J.

    2016-01-01

    Scalable computational modelling tools are required to guide the rational design of complex hierarchical materials with predictable functions. Here, we utilize mesoscopic modelling, integrated with genetic block copolymer synthesis and bioinspired spinning process, to demonstrate de novo materials design that incorporates chemistry, processing and material characterization. We find that intermediate hydrophobic/hydrophilic block ratios observed in natural spider silks and longer chain lengths lead to outstanding silk fibre formation. This design by nature is based on the optimal combination of protein solubility, self-assembled aggregate size and polymer network topology. The original homogeneous network structure becomes heterogeneous after spinning, enhancing the anisotropic network connectivity along the shear flow direction. Extending beyond the classical polymer theory, with insights from the percolation network model, we illustrate the direct proportionality between network conductance and fibre Young's modulus. This integrated approach provides a general path towards de novo functional network materials with enhanced mechanical properties and beyond (optical, electrical or thermal) as we have experimentally verified. PMID:26017575

  10. Predictive modelling-based design and experiments for synthesis and spinning of bioinspired silk fibres.

    Science.gov (United States)

    Lin, Shangchao; Ryu, Seunghwa; Tokareva, Olena; Gronau, Greta; Jacobsen, Matthew M; Huang, Wenwen; Rizzo, Daniel J; Li, David; Staii, Cristian; Pugno, Nicola M; Wong, Joyce Y; Kaplan, David L; Buehler, Markus J

    2015-05-28

    Scalable computational modelling tools are required to guide the rational design of complex hierarchical materials with predictable functions. Here, we utilize mesoscopic modelling, integrated with genetic block copolymer synthesis and bioinspired spinning process, to demonstrate de novo materials design that incorporates chemistry, processing and material characterization. We find that intermediate hydrophobic/hydrophilic block ratios observed in natural spider silks and longer chain lengths lead to outstanding silk fibre formation. This design by nature is based on the optimal combination of protein solubility, self-assembled aggregate size and polymer network topology. The original homogeneous network structure becomes heterogeneous after spinning, enhancing the anisotropic network connectivity along the shear flow direction. Extending beyond the classical polymer theory, with insights from the percolation network model, we illustrate the direct proportionality between network conductance and fibre Young's modulus. This integrated approach provides a general path towards de novo functional network materials with enhanced mechanical properties and beyond (optical, electrical or thermal) as we have experimentally verified.

  11. Customizable de novo design strategies for DOCK: Application to HIVgp41 and other therapeutic targets.

    Science.gov (United States)

    Allen, William J; Fochtman, Brian C; Balius, Trent E; Rizzo, Robert C

    2017-11-15

    De novo design can be used to explore vast areas of chemical space in computational lead discovery. As a complement to virtual screening, from-scratch construction of molecules is not limited to compounds in pre-existing vendor catalogs. Here, we present an iterative fragment growth method, integrated into the program DOCK, in which new molecules are built using rules for allowable connections based on known molecules. The method leverages DOCK's advanced scoring and pruning approaches and users can define very specific criteria in terms of properties or features to customize growth toward a particular region of chemical space. The code was validated using three increasingly difficult classes of calculations: (1) Rebuilding known X-ray ligands taken from 663 complexes using only their component parts (focused libraries), (2) construction of new ligands in 57 drug target sites using a library derived from ∼13M drug-like compounds (generic libraries), and (3) application to a challenging protein-protein interface on the viral drug target HIVgp41. The computational testing confirms that the de novo DOCK routines are robust and working as envisioned, and the compelling results highlight the potential utility for designing new molecules against a wide variety of important protein targets. © 2017 Wiley Periodicals, Inc.
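
    The grow-score-prune loop described above can be sketched generically as a beam search over fragment attachments. The fragment names, the random placeholder scoring function and the beam parameters below are all invented; they illustrate the control flow only, not DOCK's actual chemistry, attachment rules, or scoring.

```python
# Schematic (non-DOCK) iterative fragment growth with scoring and pruning.
import random

FRAGMENT_LIBRARY = ["benzene", "amide", "methyl", "hydroxyl", "pyridine"]

def score(molecule):
    # placeholder score: a real workflow would use grid/energy or knowledge-based scoring
    random.seed(hash(molecule) % (2 ** 32))
    return random.uniform(-10, 0) * len(molecule)

def grow(seeds, library, n_layers=3, beam_width=5, max_size=6):
    population = [(s,) for s in seeds]
    for _ in range(n_layers):
        candidates = []
        for mol in population:
            if len(mol) >= max_size:
                candidates.append(mol)       # stop growing molecules at the size limit
                continue
            for frag in library:             # try every allowed attachment
                candidates.append(mol + (frag,))
        # prune: keep only the best-scoring partial molecules (beam search)
        population = sorted(candidates, key=score)[:beam_width]
    return population

best = grow(seeds=["amide"], library=FRAGMENT_LIBRARY)
for mol in best:
    print(round(score(mol), 2), "-".join(mol))
```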

  12. Why do Reservoir Computing Networks Predict Chaotic Systems so Well?

    Science.gov (United States)

    Lu, Zhixin; Pathak, Jaideep; Girvan, Michelle; Hunt, Brian; Ott, Edward

    Recently a new type of artificial neural network, which is called a reservoir computing network (RCN), has been employed to predict the evolution of chaotic dynamical systems from measured data and without a priori knowledge of the governing equations of the system. The quality of these predictions has been found to be spectacularly good. Here, we present a dynamical-system-based theory for how RCN works. Basically a RCN is thought of as consisting of three parts, a randomly chosen input layer, a randomly chosen recurrent network (the reservoir), and an output layer. The advantage of the RCN framework is that training is done only on the linear output layer, making it computationally feasible for the reservoir dimensionality to be large. In this presentation, we address the underlying dynamical mechanisms of RCN function by employing the concepts of generalized synchronization and conditional Lyapunov exponents. Using this framework, we propose conditions on reservoir dynamics necessary for good prediction performance. By looking at the RCN from this dynamical systems point of view, we gain a deeper understanding of its surprising computational power, as well as insights on how to design a RCN. Supported by Army Research Office Grant Number W911NF1210101.

  13. Computational design and elaboration of a de novo heterotetrameric alpha-helical protein that selectively binds an emissive abiological (porphinato)zinc chromophore.

    Science.gov (United States)

    Fry, H Christopher; Lehmann, Andreas; Saven, Jeffery G; DeGrado, William F; Therien, Michael J

    2010-03-24

    The first example of a computationally de novo designed protein that binds an emissive abiological chromophore is presented, in which a sophisticated level of cofactor discrimination is pre-engineered. This heterotetrameric, C(2)-symmetric bundle, A(His):B(Thr), uniquely binds (5,15-di[(4-carboxymethyleneoxy)phenyl]porphinato)zinc [(DPP)Zn] via histidine coordination and complementary noncovalent interactions. The A(2)B(2) heterotetrameric protein reflects ligand-directed elements of both positive and negative design, including hydrogen bonds to second-shell ligands. Experimental support for the appropriate formulation of [(DPP)Zn:A(His):B(Thr)](2) is provided by UV/visible and circular dichroism spectroscopies, size exclusion chromatography, and analytical ultracentrifugation. Time-resolved transient absorption and fluorescence spectroscopic data reveal classic excited-state singlet and triplet PZn photophysics for the A(His):B(Thr):(DPP)Zn protein (k(fluorescence) = 4 x 10(8) s(-1); tau(triplet) = 5 ms). The A(2)B(2) apoprotein has immeasurably low binding affinities for related [porphinato]metal chromophores that include a (DPP)Fe(III) cofactor and the zinc metal ion hemin derivative [(PPIX)Zn], underscoring the exquisite active-site binding discrimination realized in this computationally designed protein. Importantly, elements of design in the A(His):B(Thr) protein ensure that interactions within the tetra-alpha-helical bundle are such that only the heterotetramer is stable in solution; corresponding homomeric bundles present unfavorable ligand-binding environments and thus preclude protein structural rearrangements that could lead to binding of (porphinato)iron cofactors.

  14. A COMPARISON BETWEEN THREE PREDICTIVE MODELS OF COMPUTATIONAL INTELLIGENCE

    Directory of Open Access Journals (Sweden)

    DUMITRU CIOBANU

    2013-12-01

    Full Text Available Time series prediction is an open problem, and many researchers are trying to find new predictive methods and improvements to existing ones. Lately, methods based on neural networks have been used extensively for time series prediction. Support vector machines have also solved some of the problems faced by neural networks and have begun to be widely used for time series prediction. The main drawback of those two methods is that they are global models, and in the case of a chaotic time series it is unlikely that such a model can be found. This paper presents a comparison between three predictive models from the computational intelligence field: one based on neural networks, one based on support vector machines, and another based on chaos theory. We show that the model based on chaos theory is an alternative to the other two methods.
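
    To make one of the three families concrete, a support vector machine can be turned into a time series predictor by regressing each value on a window of lagged values. The sketch below assumes scikit-learn and numpy; the logistic-map toy series, the lag, and the hyperparameters are illustrative and are not the data or settings used in the paper.

        import numpy as np
        from sklearn.svm import SVR

        # Toy chaotic-looking series (logistic map); the paper's data differ.
        x = np.empty(600)
        x[0] = 0.4
        for i in range(1, 600):
            x[i] = 3.9 * x[i - 1] * (1 - x[i - 1])

        # Embed the series: predict x[t] from the previous `lag` values.
        lag = 5
        X = np.array([x[i - lag:i] for i in range(lag, len(x))])
        y = x[lag:]

        split = 450
        model = SVR(C=10.0, gamma=1.0, epsilon=0.01)   # illustrative hyperparameters
        model.fit(X[:split], y[:split])
        pred = model.predict(X[split:])
        print("test RMSE:", np.sqrt(np.mean((pred - y[split:]) ** 2)))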

  15. Three-dimensional computed tomographic volumetry precisely predicts the postoperative pulmonary function.

    Science.gov (United States)

    Kobayashi, Keisuke; Saeki, Yusuke; Kitazawa, Shinsuke; Kobayashi, Naohiro; Kikuchi, Shinji; Goto, Yukinobu; Sakai, Mitsuaki; Sato, Yukio

    2017-11-01

    It is important to accurately predict the patient's postoperative pulmonary function. The aim of this study was to compare the accuracy of predictions of the postoperative residual pulmonary function obtained with three-dimensional computed tomographic (3D-CT) volumetry with that of predictions obtained with the conventional segment-counting method. Fifty-three patients scheduled to undergo lung cancer resection, pulmonary function tests, and computed tomography were enrolled in this study. The postoperative residual pulmonary function was predicted based on the segment-counting and 3D-CT volumetry methods. The predicted postoperative values were compared with the results of postoperative pulmonary function tests. Regarding the linear correlation coefficients between the predicted postoperative values and the measured values, those obtained using the 3D-CT volumetry method tended to be higher than those acquired using the segment-counting method. In addition, the variations between the predicted and measured values were smaller with the 3D-CT volumetry method than with the segment-counting method. These results were more obvious in COPD patients than in non-COPD patients. Our findings suggested that the 3D-CT volumetry was able to predict the residual pulmonary function more accurately than the segment-counting method, especially in patients with COPD. This method might lead to the selection of appropriate candidates for surgery among patients with a marginal pulmonary function.
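
    For context, the segment-counting estimate that 3D-CT volumetry is compared against is a simple proportional calculation. The sketch below assumes the common 19-functional-segment convention and shows the volumetric analogue alongside it; the numbers in the example are hypothetical.

        def ppo_segment_counting(preop_value, segments_resected, total_segments=19):
            """Predicted postoperative value by segment counting: the preoperative
            value scaled by the fraction of segments remaining after resection.
            The 19-segment convention is an assumption; some schemes weight segments."""
            return preop_value * (1 - segments_resected / total_segments)

        def ppo_volumetry(preop_value, resected_volume, total_functional_volume):
            """Volumetric analogue: scale by the fraction of functional lung volume kept."""
            return preop_value * (1 - resected_volume / total_functional_volume)

        # Example: FEV1 of 2.4 L before a right upper lobectomy (3 segments).
        print(ppo_segment_counting(2.4, 3))          # segment-counting estimate
        print(ppo_volumetry(2.4, 0.61, 4.2))         # hypothetical CT volumes in litres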

  16. Computer code to predict the heat of explosion of high energy materials

    International Nuclear Information System (INIS)

    Muthurajan, H.; Sivabalan, R.; Pon Saravanan, N.; Talawar, M.B.

    2009-01-01

    The computational approach to the thermochemical changes involved in the explosion of high energy materials (HEMs) vis-a-vis their molecular structure aids HEMs chemists/engineers in predicting important thermodynamic parameters such as the heat of explosion of the HEMs. Such computer-aided design is useful in predicting the performance of a given HEM as well as in conceiving futuristic high energy molecules that have significant potential in the field of explosives and propellants. The software code LOTUSES, developed by the authors, predicts various characteristics of HEMs such as explosion products including balanced explosion reactions, density of HEMs, velocity of detonation, CJ pressure, etc. The new computational approach described in this paper allows the prediction of the heat of explosion (ΔH_e) for different HEMs without any experimental data; the predictions are comparable with experimental results reported in the literature. The new algorithm, which does not require any complex input parameter, is incorporated in LOTUSES (version 1.5) and the results are presented in this paper. Linear regression analysis of all data points yields a correlation coefficient R^2 = 0.9721 with the linear equation y = 0.9262x + 101.45. The correlation coefficient of 0.9721 reveals that the computed values are in good agreement with experimental values and useful for rapid hazard assessment of energetic materials.

  17. De novo identification of viral pathogens from cell culture hologenomes

    Directory of Open Access Journals (Sweden)

    Patowary Ashok

    2012-01-01

    Full Text Available Abstract Background Fast, specific identification and surveillance of pathogens is the cornerstone of any outbreak response system, especially in the case of emerging infectious diseases and viral epidemics. This process is generally tedious and time-consuming thus making it ineffective in traditional settings. The added complexity in these situations is the non-availability of pure isolates of pathogens as they are present as mixed genomes or hologenomes. Next-generation sequencing approaches offer an attractive solution in this scenario as it provides adequate depth of sequencing at fast and affordable costs, apart from making it possible to decipher complex interactions between genomes at a scale that was not possible before. The widespread application of next-generation sequencing in this field has been limited by the non-availability of an efficient computational pipeline to systematically analyze data to delineate pathogen genomes from mixed population of genomes or hologenomes. Findings We applied next-generation sequencing on a sample containing mixed population of genomes from an epidemic with appropriate processing and enrichment. The data was analyzed using an extensive computational pipeline involving mapping to reference genome sets and de-novo assembly. In depth analysis of the data generated revealed the presence of sequences corresponding to Japanese encephalitis virus. The genome of the virus was also independently de-novo assembled. The presence of the virus was in addition, verified using standard molecular biology techniques. Conclusions Our approach can accurately identify causative pathogens from cell culture hologenome samples containing mixed population of genomes and in principle can be applied to patient hologenome samples without any background information. This methodology could be widely applied to identify and isolate pathogen genomes and understand their genomic variability during outbreaks.
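
    The core idea of the pipeline, separating host-derived reads from candidate pathogen reads before de novo assembly, can be caricatured as follows. This is only a toy illustration and not the authors' pipeline (which maps reads to reference genome sets and runs a full assembler); the k-mer threshold, reference, and reads are hypothetical.

        def kmers(seq, k=21):
            return {seq[i:i + k] for i in range(len(seq) - k + 1)}

        def split_reads(reads, host_reference, k=21, max_host_hits=2):
            """Crude host subtraction: a read sharing more than `max_host_hits`
            k-mers with the host reference is treated as host-derived; the rest
            are kept for de novo assembly and pathogen identification."""
            host_index = kmers(host_reference, k)
            host, candidate = [], []
            for read in reads:
                hits = sum(1 for km in kmers(read, k) if km in host_index)
                (host if hits > max_host_hits else candidate).append(read)
            return host, candidate

        # Hypothetical data; a real workflow indexes the full host genome and
        # passes the unmapped reads to a de novo assembler, then searches the
        # resulting contigs against viral reference sequences.
        host_ref = "ACGT" * 2000
        reads = ["ACGT" * 30, "TTAGGCATTCGAGGACCTTAGACCGTTAGGCA" * 3]
        host_reads, pathogen_candidates = split_reads(reads, host_ref)
        print(len(host_reads), "host-like,", len(pathogen_candidates), "candidate reads")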

  18. ''de novo'' aneurysms following endovascular procedures

    Energy Technology Data Exchange (ETDEWEB)

    Briganti, F.; Cirillo, S.; Caranci, F. [Department of Neurological Sciences, Services of Neuroradiology, ''Federico II'' University, Naples (Italy); Esposito, F.; Maiuri, F. [Department of Neurological Sciences, Services of Neurosurgery, ''Federico II'' University, Naples (Italy)

    2002-07-01

    Two personal cases of ''de novo'' aneurysms of the anterior communicating artery (ACoA) occurring 9 and 4 years, respectively, after endovascular carotid occlusion are described. A review of the 30 reported cases (including our own two) of ''de novo'' aneurysms after occlusion of the major cerebral vessels has shown some features, including a rather long time interval after the endovascular procedure of up to 20-25 years (average 9.6 years), a preferential ACoA (36.3%) and internal carotid artery-posterior communicating artery (ICA-PCoA) (33.3%) location of the ''de novo'' aneurysms, and a 10% rate of multiple aneurysms. These data are compared with those of the group of reported spontaneous ''de novo'' aneurysms after SAH or previous aneurysm clipping. We agree that the frequency of ''de novo'' aneurysms after major-vessel occlusion (two among ten procedures in our series, or 20%) is higher than commonly reported (0 to 11%). For this reason, we suggest that patients who have been submitted to endovascular major-vessel occlusion be followed up for up to 20-25 years after the procedure, using non-invasive imaging studies such as MR angiography and high-resolution CT angiography. On the other hand, periodic digital angiography has a questionable risk-benefit ratio; it may be used when a ''de novo'' aneurysm is detected or suspected on non-invasive studies. The progressive enlargement of the ACoA after carotid occlusion, as described in our case 1, must be considered a radiological finding of risk for ''de novo'' aneurysm formation. (orig.)

  19. Language and national identity in Novo Cinema Galego

    Directory of Open Access Journals (Sweden)

    Brais ROMERO SUÁREZ

    2015-12-01

    Full Text Available The talk of the town since its inception in 2010, the Novo Cinema Galego has been successful in all the competitions and festivals in which it has been present. From the FIPRESCI prize at Cannes to the Best Emerging Director at Locarno, this new wave of cinema places Galicia on the world film stage. But does Novo Cinema Galego offer an accurate representation of Galicia? What is the role of Galicia in this movement?

  20. The value of computed tomography-urography in predicting the ...

    African Journals Online (AJOL)

    Background The natural course of pelviureteric junction (PUJ) obstruction is variable. Of those who require surgical intervention, there is no definite reliable preoperative predictor of the likely postoperative outcome. We evaluated the value of preoperative computed tomography (CT)-urography in predicting the ...

  1. A de novo missense mutation of FGFR2 causes facial dysplasia syndrome in Holstein cattle

    DEFF Research Database (Denmark)

    Agerholm, Jørgen Steen; McEvoy, Fintan; Heegaard, Steffen

    2017-01-01

    was suspected as all recorded cases were progeny of the same sire. Detailed investigations were performed to characterize the syndrome and to reveal its cause. Results Seven malformed calves were submitted examination. All cases shared a common morphology with the most striking lesions being severe facial...... chromosome 26 where whole genome sequencing of a case-parent trio revealed two de novo variants perfectly associated with the disease: an intronic SNP in the DMBT1 gene and a single non-synonymous variant in the FGFR2 gene. This FGFR2 missense variant (c.927G>T) affects a gene encoding a member...... of the fibroblast growth factor receptor family, where amino acid sequence is highly conserved between members and across species. It is predicted to change an evolutionary conserved tryptophan into a cysteine residue (p.Trp309Cys). Both variant alleles were proven to result from de novo mutation events...

  2. Computer-aided and predictive models for design of controlled release of pesticides

    DEFF Research Database (Denmark)

    Suné, Nuria Muro; Gani, Rafiqul

    2004-01-01

    In the field of pesticide controlled release technology, a computer based model that can predict the delivery of the Active Ingredient (AI) from fabricated units is important for purposes of product design and marketing. A model for the release of an AI from a microcapsule device is presented...... in this paper, together with a specific case study application to highlight its scope and significance. The paper also addresses the need for predictive models and proposes a computer aided modelling framework for achieving it through the development and introduction of reliable and predictive constitutive...... models. A group-contribution based model for one of the constitutive variables (AI solubility in polymers) is presented together with examples of application and validation....

  3. De novo peptide design and experimental validation of histone methyltransferase inhibitors.

    Directory of Open Access Journals (Sweden)

    James Smadbeck

    Full Text Available Histones are small proteins critical to the efficient packaging of DNA in the nucleus. DNA–protein complexes, known as nucleosomes, are formed when the DNA winds itself around the surface of the histones. The methylation of histone residues by enhancer of zeste homolog 2 (EZH2) maintains gene repression over successive cell generations. Overexpression of EZH2 can silence important tumor suppressor genes leading to increased invasiveness of many types of cancers. This makes the inhibition of EZH2 an important target in the development of cancer therapeutics. We employed a three-stage computational de novo peptide design method to design inhibitory peptides of EZH2. The method consists of a sequence selection stage and two validation stages for fold specificity and approximate binding affinity. The sequence selection stage consists of an integer linear optimization model that was solved to produce a rank-ordered list of amino acid sequences with increased stability in the bound peptide-EZH2 structure. These sequences were validated through the calculation of the fold specificity and approximate binding affinity of the designed peptides. Here we report the discovery of novel EZH2 inhibitory peptides using the de novo peptide design method. The computationally discovered peptides were experimentally validated in vitro using dose titrations and mechanism of action enzymatic assays. The peptide with the highest in vitro response, SQ037, was validated in nucleo using quantitative mass spectrometry-based proteomics. This peptide had an IC50 of 13.5 μM, demonstrated greater potency as an inhibitor when compared to the native and K27A mutant control peptides, and demonstrated competitive inhibition versus the peptide substrate. Additionally, this peptide demonstrated high specificity to the EZH2 target in comparison to other histone methyltransferases. The validated peptides are the first computationally designed peptides that directly inhibit EZH2.
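
    The sequence-selection stage can be caricatured as an integer linear program that picks exactly one residue per design position so as to minimize a precomputed energy table. The sketch below, assuming the PuLP package, shows the structure of such a model only; the positions, residue alphabet, and energies are invented and this is not the authors' actual formulation, which scores sequences in the bound peptide-EZH2 structure.

        from pulp import LpProblem, LpVariable, LpMinimize, lpSum, value

        positions = [0, 1, 2]                      # design positions (illustrative)
        residues = ["A", "K", "R", "Q", "S"]
        # Hypothetical per-position energies; a real model would be derived from
        # the peptide-EZH2 structure and could include pairwise terms.
        aa_bias = {"A": 0.5, "K": -1.0, "R": -0.7, "Q": 0.1, "S": 0.4}
        energy = {(i, r): aa_bias[r] + 0.1 * i for i in positions for r in residues}

        x = LpVariable.dicts("x", [(i, r) for i in positions for r in residues],
                             cat="Binary")
        model = LpProblem("sequence_selection", LpMinimize)
        model += lpSum(energy[i, r] * x[i, r] for i in positions for r in residues)
        for i in positions:                        # exactly one residue per position
            model += lpSum(x[i, r] for r in residues) == 1

        model.solve()
        designed = "".join(r for i in positions for r in residues if value(x[i, r]) > 0.5)
        print("lowest-energy sequence:", designed)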

  5. On the Origin of De Novo Genes in Arabidopsis thaliana Populations.

    Science.gov (United States)

    Li, Zi-Wen; Chen, Xi; Wu, Qiong; Hagmann, Jörg; Han, Ting-Shen; Zou, Yu-Pan; Ge, Song; Guo, Ya-Long

    2016-08-03

    De novo genes, which originate from ancestral nongenic sequences, are one of the most important sources of protein-coding genes. This origination process is crucial for the adaptation of organisms. However, how de novo genes arise and become fixed in a population or species remains largely unknown. Here, we identified 782 de novo genes from the model plant Arabidopsis thaliana and divided them into three types based on the availability of translational evidence, transcriptional evidence, and neither transcriptional nor translational evidence for their origin. Importantly, by integrating multiple types of omics data, including data from genomes, epigenomes, transcriptomes, and translatomes, we found that epigenetic modifications (DNA methylation and histone modification) play an important role in the origination process of de novo genes. Intriguingly, using the transcriptomes and methylomes from the same population of 84 accessions, we found that de novo genes that are transcribed in approximately half of the total accessions within the population are highly methylated, with lower levels of transcription than those transcribed at other frequencies within the population. We hypothesized that, during the origin of de novo gene alleles, those neutralized to low expression states via DNA methylation have relatively high probabilities of spreading and becoming fixed in a population. Our results highlight the process underlying the origin of de novo genes at the population level, as well as the importance of DNA methylation in this process. © The Author 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  6. BayesMotif: de novo protein sorting motif discovery from impure datasets.

    Science.gov (United States)

    Hu, Jianjun; Zhang, Fan

    2010-01-18

    Protein sorting is the process by which newly synthesized proteins are transported to their target locations within or outside of the cell. This process is precisely regulated by protein sorting signals in different forms. A major category of sorting signals consists of amino acid sub-sequences usually located at the N-terminus or C-terminus of protein sequences. Genome-wide experimental identification of protein sorting signals is extremely time-consuming and costly. Effective computational algorithms for de novo discovery of protein sorting signals are needed to improve the understanding of protein sorting mechanisms. We formulated the protein sorting motif discovery problem as a classification problem and proposed a Bayesian classifier based algorithm (BayesMotif) for de novo identification of a common type of protein sorting motifs in which a highly conserved anchor is present along with less conserved motif regions. A false positive removal procedure is developed to iteratively remove sequences that are unlikely to contain true motifs so that the algorithm can identify motifs from impure input sequences. Experiments on both implanted motif datasets and real-world datasets showed that the enhanced BayesMotif algorithm can identify anchored sorting motifs from pure or impure protein sequence datasets. It also shows that the false positive removal procedure can help to identify true motifs even when only 20% of the input sequences contain true motif instances. We proposed BayesMotif, a novel Bayesian classification based algorithm for de novo discovery of a special category of anchored protein sorting motifs from impure datasets. Compared to conventional motif discovery algorithms such as MEME, our algorithm can find less-conserved motifs with short highly conserved anchors. Our algorithm also has the advantage of easy incorporation of additional meta-sequence features such as hydrophobicity or charge of the motifs which may help to overcome the limitations of
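
    The scoring idea underlying probabilistic motif classifiers of this kind can be sketched as a position-specific probability model for the motif compared against a background model, with candidate windows ranked by a log-likelihood ratio. This is not the BayesMotif implementation; the aligned "motif instances", the conserved SKL-like anchor, and the uniform background are invented for illustration.

        import math

        ALPHABET = "ACDEFGHIKLMNPQRSTVWY"

        def train_pssm(motif_instances, pseudocount=0.5):
            """Position-specific residue probabilities from aligned motif instances."""
            length = len(motif_instances[0])
            pssm = []
            for pos in range(length):
                counts = {a: pseudocount for a in ALPHABET}
                for seq in motif_instances:
                    counts[seq[pos]] += 1
                total = sum(counts.values())
                pssm.append({a: c / total for a, c in counts.items()})
            return pssm

        def log_odds(window, pssm, background=1 / 20):
            """Log-likelihood ratio of the motif model vs. a uniform background."""
            return sum(math.log(pssm[i][a] / background) for i, a in enumerate(window))

        # Invented C-terminal sorting-like instances with a conserved anchor 'SKL'.
        instances = ["ASKL", "GSKL", "TSKL", "ASKL"]
        pssm = train_pssm(instances)
        for candidate in ["MTSKL", "MAAAA"]:
            window = candidate[-4:]
            print(candidate, round(log_odds(window, pssm), 2))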

  7. Glucagon infusion increases rate of purine synthesis de novo in rat liver

    International Nuclear Information System (INIS)

    Itakura, Mitsuo; Maeda, Noriaki; Tsuchiya, Masami; Yamashita, Kamejiro

    1987-01-01

    Based on the parallel increases of glucagon, the second peak of hepatic cAMP, and the rate of purine synthesis de novo in the prereplicative period in regenerating rat liver after a 70% hepatectomy, it was hypothesized that glucagon is responsible for the increased rate of purine synthesis de novo. To test this hypothesis, the effect of glucagon or dibutyryl cAMP infusion on the rate of purine synthesis de novo in rat liver was studied. Glucagon infusion but not insulin or glucose infusion increased the rate of purine synthesis de novo, which was assayed by [14C]glycine or [14C]formate incorporation, by 2.7- to 4.3-fold. Glucagon infusion increased cAMP concentrations by 4.9-fold and 5-phosphoribosyl-1-pyrophosphate concentrations by 1.5-fold in liver but did not change the specific activity of amidophosphoribosyltransferase or purine ribonucleotide concentrations. Dibutyryl cAMP infusion also increased the rate of purine synthesis de novo by 2.2- to 4.0-fold. Because glucagon infusion increased the rate of purine synthesis de novo in the presence of unchanged purine ribonucleotide concentrations, it is concluded that glucagon, after infusion or in animals after a 70% hepatectomy, plays an anabolic role, increasing the rate of purine synthesis de novo by increasing cAMP and 5-phosphoribosyl-1-pyrophosphate concentrations.

  8. Computational Approaches for Prediction of Pathogen-Host Protein-Protein Interactions

    Directory of Open Access Journals (Sweden)

    Esmaeil eNourani

    2015-02-01

    Full Text Available Infectious diseases are still among the major and prevalent health problems, mostly because of the drug resistance of novel variants of pathogens. Molecular interactions between pathogens and their hosts are the key part of the infection mechanisms. Developing novel antimicrobial therapeutics to fight drug resistance is only possible with a thorough understanding of pathogen-host interaction (PHI) systems. Existing databases, which contain experimentally verified PHI data, suffer from scarcity of reported interactions due to the technically challenging and time consuming process of experiments. This has motivated many researchers to address the problem by proposing computational approaches for analysis and prediction of PHIs. The computational methods primarily utilize sequence information, protein structure and known interactions. Classic machine learning techniques are used when there are sufficient known interactions to be used as training data. Otherwise, transfer and multi-task learning methods are preferred. Here, we present an overview of these computational approaches for PHI prediction, discussing their weaknesses and abilities, along with future directions.

  9. Icarus: visualizer for de novo assembly evaluation.

    Science.gov (United States)

    Mikheenko, Alla; Valin, Gleb; Prjibelski, Andrey; Saveliev, Vladislav; Gurevich, Alexey

    2016-11-01

    Data visualization plays an increasingly important role in NGS data analysis. With advances in both sequencing and computational technologies, it has become a new bottleneck in genomics studies. Indeed, evaluation of de novo genome assemblies is one of the areas that can benefit from visualization. However, even though multiple quality assessment methods are now available, existing visualization tools are hardly suitable for this purpose. Here, we present Icarus, a novel genome visualizer for accurate assessment and analysis of genomic draft assemblies, which is based on the tool QUAST. Icarus can be used in studies where a related reference genome is available, as well as for non-model organisms. The tool is available online and as a standalone application at http://cab.spbu.ru/software/icarus. CONTACT: aleksey.gurevich@spbu.ru. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  10. High frequencies of de novo CNVs in bipolar disorder and schizophrenia.

    LENUS (Irish Health Repository)

    Malhotra, Dheeraj

    2011-12-22

    While it is known that rare copy-number variants (CNVs) contribute to risk for some neuropsychiatric disorders, the role of CNVs in bipolar disorder is unclear. Here, we reasoned that a contribution of CNVs to mood disorders might be most evident for de novo mutations. We performed a genome-wide analysis of de novo CNVs in a cohort of 788 trios. Diagnoses of offspring included bipolar disorder (n = 185), schizophrenia (n = 177), and healthy controls (n = 426). Frequencies of de novo CNVs were significantly higher in bipolar disorder as compared with controls (OR = 4.8 [1.4,16.0], p = 0.009). De novo CNVs were particularly enriched among cases with an age at onset younger than 18 (OR = 6.3 [1.7,22.6], p = 0.006). We also confirmed a significant enrichment of de novo CNVs in schizophrenia (OR = 5.0 [1.5,16.8], p = 0.007). Our results suggest that rare spontaneous mutations are an important contributor to risk for bipolar disorder and other major neuropsychiatric diseases.

  11. Traffic Flow Prediction Model for Large-Scale Road Network Based on Cloud Computing

    Directory of Open Access Journals (Sweden)

    Zhaosheng Yang

    2014-01-01

    Full Text Available To increase the efficiency and precision of large-scale road network traffic flow prediction, a genetic algorithm-support vector machine (GA-SVM) model based on cloud computing is proposed in this paper, which is based on the analysis of the characteristics and defects of the genetic algorithm and support vector machine. In a cloud computing environment, firstly, SVM parameters are optimized by the parallel genetic algorithm, and then this optimized parallel SVM model is used to predict traffic flow. On the basis of the traffic flow data of Haizhu District in Guangzhou City, the proposed model was verified and compared with the serial GA-SVM model and a parallel GA-SVM model based on MPI (message passing interface). The results demonstrate that the parallel GA-SVM model based on cloud computing has higher prediction accuracy, shorter running time, and higher speedup.
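
    A serial sketch of the underlying idea, assuming scikit-learn and numpy: a small genetic algorithm searches over the SVM hyperparameters (C, gamma), with each individual scored by validation error on a lagged-input regression task. The synthetic "traffic flow" series, population size, and mutation scale are illustrative; the cloud/MPI versions in the paper essentially evaluate the population in parallel.

        import numpy as np
        from sklearn.svm import SVR

        rng = np.random.default_rng(1)

        # Synthetic traffic-flow-like series: daily pattern plus noise (illustrative).
        t = np.arange(1000)
        flow = 50 + 30 * np.sin(2 * np.pi * t / 96) + rng.normal(0, 3, t.size)
        lag = 8
        X = np.array([flow[i - lag:i] for i in range(lag, flow.size)])
        y = flow[lag:]
        split = 800

        def fitness(params):
            """Validation RMSE of an SVR with the given (log C, log gamma) genes."""
            c, g = np.exp(params)
            model = SVR(C=c, gamma=g).fit(X[:split], y[:split])
            err = model.predict(X[split:]) - y[split:]
            return np.sqrt(np.mean(err ** 2))

        pop = rng.uniform(-3, 3, (12, 2))             # initial population of genes
        for generation in range(10):
            scores = np.array([fitness(ind) for ind in pop])
            parents = pop[np.argsort(scores)[:4]]     # selection: keep the best 4
            children = []
            for _ in range(len(pop) - len(parents)):  # crossover + mutation
                a, b = parents[rng.integers(4)], parents[rng.integers(4)]
                child = np.where(rng.random(2) < 0.5, a, b) + rng.normal(0, 0.3, 2)
                children.append(child)
            pop = np.vstack([parents, children])

        best = pop[np.argmin([fitness(ind) for ind in pop])]
        print("best C, gamma:", np.exp(best))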

  12. De Novo Human Cardiac Myocytes for Medical Research: Promises and Challenges

    Directory of Open Access Journals (Sweden)

    Veronique Hamel

    2017-01-01

    Full Text Available The advent of cellular reprogramming technology has revolutionized biomedical research. De novo human cardiac myocytes can now be obtained from direct reprogramming of somatic cells (such as fibroblasts), from induced pluripotent stem cells (iPSCs), which are reprogrammed from somatic cells, and from human embryonic stem cells (hESCs). Such de novo human cardiac myocytes hold great promise for in vitro disease modeling and drug screening and in vivo cell therapy of heart disease. Here, we review the technique advancements for generating de novo human cardiac myocytes. We also discuss several challenges for the use of such cells in research and regenerative medicine, such as the immature phenotype and heterogeneity of de novo cardiac myocytes obtained with existing protocols. We focus on the recent advancements in addressing such challenges.

  13. A computational environment for long-term multi-feature and multi-algorithm seizure prediction.

    Science.gov (United States)

    Teixeira, C A; Direito, B; Costa, R P; Valderrama, M; Feldwisch-Drentrup, H; Nikolopoulos, S; Le Van Quyen, M; Schelter, B; Dourado, A

    2010-01-01

    The daily life of epilepsy patients is constrained by the possibility of occurrence of seizures. To date, seizures cannot be predicted with sufficient sensitivity and specificity. Most seizure prediction studies have focused on a small number of patients, frequently under unrealistic assumptions. This paper adopts the view that for an appropriate development of reliable predictors one should consider long-term recordings and several features and algorithms integrated in one software tool. A computational environment, based on Matlab(®), is presented, aiming to be an innovative tool for seizure prediction. It results from the need for a powerful and flexible tool for long-term EEG/ECG analysis by multiple features and algorithms. After being extracted, features can be subjected to several reduction and selection methods, and then used for prediction. The predictions can be conducted based on optimized thresholds or by applying computational intelligence methods. One important aspect is the integrated evaluation of the seizure prediction characteristic of the developed predictors.
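
    The simplest of the prediction schemes mentioned, a feature computed over sliding EEG windows and compared against an optimized threshold, can be sketched as follows with numpy. The synthetic signal, the line-length feature, and the mean-plus-three-standard-deviations threshold are illustrative only and are not the features or rules used in the described environment.

        import numpy as np

        rng = np.random.default_rng(2)
        fs = 256                                   # sampling rate (Hz), illustrative
        eeg = rng.normal(0, 1, fs * 600)           # 10 minutes of synthetic "EEG"
        eeg[fs * 480:] *= 3                        # pretend the last 2 minutes are pre-ictal

        def sliding_feature(signal, win, step):
            """Line-length feature per window: sum of absolute sample differences."""
            starts = range(0, len(signal) - win, step)
            return np.array([np.abs(np.diff(signal[s:s + win])).sum() for s in starts])

        feature = sliding_feature(eeg, win=fs * 10, step=fs * 5)

        # "Train" a threshold on a calibration segment, then raise alarms above it.
        threshold = feature[:60].mean() + 3 * feature[:60].std()
        alarms = np.where(feature > threshold)[0]
        print("windows above threshold:", alarms)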

  14. Computational prediction of miRNA genes from small RNA sequencing data

    Directory of Open Access Journals (Sweden)

    Wenjing eKang

    2015-01-01

    Full Text Available Next-generation sequencing now for the first time allows researchers to gauge the depth and variation of entire transcriptomes. However, now that rare transcripts present in cells at single copies can be detected, more advanced computational tools are needed to accurately annotate and profile them. miRNAs are 22 nucleotide small RNAs (sRNAs) that post-transcriptionally reduce the output of protein coding genes. They have established roles in numerous biological processes, including cancers and other diseases. During miRNA biogenesis, the sRNAs are sequentially cleaved from precursor molecules that have a characteristic hairpin RNA structure. The vast majority of new miRNA genes that are discovered are mined from small RNA sequencing (sRNA-seq), which can detect more than a billion RNAs in a single run. However, given that many of the detected RNAs are degradation products from all types of transcripts, the accurate identification of miRNAs remains a non-trivial computational problem. Here we review the tools available to predict animal miRNAs from sRNA sequencing data. We present tools for generalist and specialist use cases, including prediction from massively pooled data or in species without a reference genome. We also present wet-lab methods used to validate predicted miRNAs, and approaches to computationally benchmark prediction accuracy. For each tool, we reference validation experiments and benchmarking efforts. Last, we discuss the future of the field.

  15. De novo assembly of the perennial ryegrass transcriptome using an RNA-Seq strategy.

    Directory of Open Access Journals (Sweden)

    Jacqueline D Farrell

    Full Text Available Perennial ryegrass is a highly heterozygous outbreeding grass species used for turf and forage production. Heterozygosity can affect de-Bruijn graph assembly making de novo transcriptome assembly of species such as perennial ryegrass challenging. Creating a reference transcriptome from a homozygous perennial ryegrass genotype can circumvent the challenge of heterozygosity. The goals of this study were to perform RNA-sequencing on multiple tissues from a highly inbred genotype to develop a reference transcriptome. This was complemented with RNA-sequencing of a highly heterozygous genotype for SNP calling.De novo transcriptome assembly of the inbred genotype created 185,833 transcripts with an average length of 830 base pairs. Within the inbred reference transcriptome 78,560 predicted open reading frames were found of which 24,434 were predicted as complete. Functional annotation found 50,890 transcripts with a BLASTp hit from the Swiss-Prot non-redundant database, 58,941 transcripts with a Pfam protein domain and 1,151 transcripts encoding putative secreted peptides. To evaluate the reference transcriptome we targeted the high-affinity K+ transporter gene family and found multiple orthologs. Using the longest unique open reading frames as the reference sequence, 64,242 single nucleotide polymorphisms were found. One thousand sixty one open reading frames from the inbred genotype contained heterozygous sites, confirming the high degree of homozygosity.Our study has developed an annotated, comprehensive transcriptome reference for perennial ryegrass that can aid in determining genetic variation, expression analysis, genome annotation, and gene mapping.

  16. Coffee breeding: IV - Mundo Novo coffee

    Directory of Open Access Journals (Sweden)

    A. Carvalho

    1952-06-01

    Full Text Available In a stand of coffee trees at Mundo Novo, now Urupês, in the Araraquarense region of the State of São Paulo, selections of several trees were made based on their vegetative appearance, their yield at the time of selection, and their probable yield in the following year. The origin of the initial planting of this coffee, both in Urupês and in Jaú, was investigated, leading to the conclusion that it probably originated in the latter locality. Progenies of the "Mundo Novo" coffee, formerly known as "Sumatra" and derived from plants selected in Urupês and Jaú, are under study at six localities in the State: Campinas, Ribeirão Prêto, Pindorama, Mococa, Jaú and Monte Alegre do Sul. The present work uses only data on the morphological variability and yield characteristics of the progenies of the first trees selected in Urupês and studied at Campinas, Jaú, Pindorama and Mococa. In all localities, variation in the morphological characters of the progenies was observed, including the occurrence of nearly unproductive plants. Most progenies, however, are characterized by marked vegetative vigor. The total yields of the progenies and of individual plants were studied for the period 1946-1951, and some progenies stood out for their high yield in all localities. The "moca" (peaberry), "concha" (shell) and "chato" (flat) seed types were determined in samples from all plants over a period of three years, and the variation observed was of the same order as that found in other coffee trees under selection. Selection sought to eliminate trees with a high production of fruits lacking seeds in one or two locules, a trait that appears to be hereditary. The results of crosses between the best "Mundo Novo" trees of Campinas and plants of the murta variety indicated that these trees are of the bourbon type. Probably

  17. De novo transcriptome assembly of the mycoheterotrophic plant Monotropa hypopitys

    Directory of Open Access Journals (Sweden)

    Alexey V. Beletsky

    2017-03-01

    Full Text Available Monotropa hypopitys (pinesap) is a non-photosynthetic, obligately mycoheterotrophic plant of the family Ericaceae. It obtains carbon and other nutrients from the roots of surrounding autotrophic trees through the associated mycorrhizal fungi. In order to understand the evolutionary changes in the plant genome associated with the transition to a heterotrophic lifestyle, we performed de novo transcriptomic analysis of M. hypopitys using next-generation sequencing. We obtained RNA-Seq data from flowers, flower bracts and roots with haustoria using the Illumina HiSeq2500 platform. The raw data obtained in this study are available in the NCBI SRA database under accession number SRP069226. A total of 10.3 GB of raw sequence data were obtained, corresponding to 103,357,809 raw reads. A total of 103,025,683 reads were retained after removing low-quality reads and trimming the adapter sequences. The Trinity program was used to de novo assemble 98,349 unigenes with an N50 of 1342 bp. Using the TransDecoder program, we predicted 43,505 putative proteins. 38,416 unigenes were annotated in the Swiss-Prot protein sequence database using BLASTX. The obtained transcriptomic data will be useful for further studies of the evolution of plant genomes upon transition to a non-photosynthetic lifestyle and the loss of photosynthesis-related functions.

  18. Failure of Noninvasive Ventilation for De Novo Acute Hypoxemic Respiratory Failure: Role of Tidal Volume.

    Science.gov (United States)

    Carteaux, Guillaume; Millán-Guilarte, Teresa; De Prost, Nicolas; Razazi, Keyvan; Abid, Shariq; Thille, Arnaud W; Schortgen, Frédérique; Brochard, Laurent; Brun-Buisson, Christian; Mekontso Dessap, Armand

    2016-02-01

    A low or moderate expired tidal volume can be difficult to achieve during noninvasive ventilation for de novo acute hypoxemic respiratory failure (i.e., not due to exacerbation of chronic lung disease or cardiac failure). We assessed expired tidal volume and its association with noninvasive ventilation outcome. Prospective observational study. Twenty-four bed university medical ICU. Consecutive patients receiving noninvasive ventilation for acute hypoxemic respiratory failure between August 2010 and February 2013. Noninvasive ventilation was uniformly delivered using a simple algorithm targeting the expired tidal volume between 6 and 8 mL/kg of predicted body weight. Expired tidal volume was averaged and respiratory and hemodynamic variables were systematically recorded at each noninvasive ventilation session. Sixty-two patients were enrolled, including 47 meeting criteria for acute respiratory distress syndrome, and 32 failed noninvasive ventilation (51%). Pneumonia (n = 51, 82%) was the main etiology of acute hypoxemic respiratory failure. The median (interquartile range) expired tidal volume averaged over all noninvasive ventilation sessions (mean expired tidal volume) was 9.8 mL/kg predicted body weight (8.1-11.1 mL/kg predicted body weight). The mean expired tidal volume was significantly higher in patients who failed noninvasive ventilation as compared with those who succeeded (10.6 mL/kg predicted body weight [9.6-12.0] vs 8.5 mL/kg predicted body weight [7.6-10.2]; p = 0.001), and expired tidal volume was independently associated with noninvasive ventilation failure in multivariate analysis. This effect was mainly driven by patients with PaO2/FIO2 up to 200 mm Hg. In these patients, the expired tidal volume above 9.5 mL/kg predicted body weight predicted noninvasive ventilation failure with a sensitivity of 82% and a specificity of 87%. A low expired tidal volume is almost impossible to achieve in the majority of patients receiving noninvasive ventilation

  19. De Novo Collapsing Glomerulopathy in a Renal Allograft Recipient

    Directory of Open Access Journals (Sweden)

    Kanodia K

    2008-01-01

    Full Text Available Collapsing glomerulopathy (CG), characterized histologically by segmental/global glomerular capillary collapse, podocyte hypertrophy and hypercellularity, and tubulo-interstitial injury, is characterized clinically by massive proteinuria and rapidly progressive renal failure. CG is known to recur in renal allografts and, rarely, to occur de novo. We report de novo CG 3 years post-transplant in a patient who received a renal allograft from a haplo-identical donor.

  20. Purine biosynthesis de novo by lymphocytes in gout

    International Nuclear Information System (INIS)

    Kamoun, P.; Chanard, J.; Brami, M.; Funck-Brentano, J.L.

    1978-01-01

    A method of measurement in vitro of purine biosynthesis de novo in human circulating blood lymphocytes is proposed. The rate of early reactions of purine biosynthesis de novo was determined by the incorporation of [14C]formate into N-formyl glycinamide ribonucleotide when the subsequent reactions of the metabolic pathway were completely inhibited by the antibiotic azaserine. Synthesis of 14C-labelled N-formyl glycinamide ribonucleotide by lymphocytes was measured in healthy control subjects and patients with primary gout or hyperuricaemia secondary to renal failure, with or without allopurinol therapy. The average synthesis was higher in gouty patients without therapy than in control subjects, but the values obtained overlap the normal range. In secondary hyperuricaemia the synthesis was at the same level as in control subjects. These results are in agreement with the inconstant acceleration of purine biosynthesis de novo in gouty patients as seen by others with measurement of [14C]glycine incorporation into urinary uric acid. (author)

  1. Osteotomy simulation and soft tissue prediction using computer tomography scans

    International Nuclear Information System (INIS)

    Teschner, M.; Girod, S.; Girod, B.

    1999-01-01

    In this paper, a system is presented that can be used to simulate osteotomies of the skull and to estimate the resulting soft tissue changes. Thus, the three-dimensional, photorealistic, postoperative appearance of a patient can be assessed. The system is based on a computed tomography scan and a photorealistic laser scan of the patient's face. In order to predict the postoperative appearance of a patient, the soft tissue must follow the movement of the underlying bone. In this paper, a multi-layer soft tissue model is proposed that is based on springs. It incorporates features like skin turgor, gravity and sliding bone contact. The prediction of soft tissue changes due to bone realignments is computed using a very efficient and robust optimization method. The system can handle individual patient data sets and has been tested with several clinical cases. (author)
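
    A minimal sketch of how a spring-based soft tissue layer can be relaxed after the underlying bone is moved, assuming numpy: each surface node is pulled by a spring toward its bone-attached rest position and by gravity, with zero-rest-length neighbour springs coupling adjacent nodes, and positions are updated iteratively. The node count, stiffnesses, and bone displacement are illustrative; the paper's model additionally includes skin turgor and sliding bone contact.

        import numpy as np

        n = 20                                        # surface nodes along a 1D strip
        rest = np.stack([np.linspace(0, 1, n), np.zeros(n)], axis=1)
        pos = rest.copy()
        bone_offset = np.array([0.0, 0.05])           # simulated osteotomy displacement
        anchor = rest + bone_offset                   # bone-attached rest positions

        k_bone, k_neigh = 40.0, 20.0                  # spring stiffnesses (illustrative)
        gravity = np.array([0.0, -0.5])
        dt, damping = 0.01, 0.9
        vel = np.zeros_like(pos)

        for _ in range(500):                          # damped explicit relaxation
            force = k_bone * (anchor - pos) + gravity
            force[1:] += k_neigh * (pos[:-1] - pos[1:])    # pull toward left neighbour
            force[:-1] += k_neigh * (pos[1:] - pos[:-1])   # pull toward right neighbour
            vel = damping * (vel + dt * force)
            pos = pos + dt * vel

        print("mean predicted soft-tissue displacement:", (pos - rest).mean(axis=0))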

  2. Prediction of quantitative phenotypes based on genetic networks: a case study in yeast sporulation

    Directory of Open Access Journals (Sweden)

    Shen Li

    2010-09-01

    Full Text Available Abstract Background An exciting application of genetic networks is to predict phenotypic consequences of environmental cues or genetic perturbations. However, de novo prediction of quantitative phenotypes based on network topology is always a challenging task. Results Using yeast sporulation as a model system, we have assembled a genetic network from the literature and exploited a Boolean network to predict sporulation efficiency changes upon deleting individual genes. We observe that predictions based on the curated network correlate well with the experimentally measured values. In addition, computational analysis reveals the robustness and hysteresis of the yeast sporulation network and uncovers several patterns of sporulation efficiency change caused by double gene deletion. These discoveries may guide future investigation of underlying mechanisms. We have also shown that a hybridized genetic network reconstructed from both temporal microarray data and literature is able to achieve a satisfactory prediction accuracy of the same quantitative phenotypes. Conclusions This case study illustrates the value of predicting quantitative phenotypes based on genetic networks and provides a generic approach.
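
    A toy Boolean network makes the in silico deletion idea concrete: nodes are updated synchronously from logical rules, a gene deletion is modeled by clamping that node to OFF, and a phenotype is read off from the final state. The genes and rules below are invented for illustration (they loosely echo sporulation regulator names) and are not the curated yeast network or its readout.

        # Toy Boolean network (invented rules, not the curated sporulation network).
        RULES = {
            "IME1": lambda s: s["NUTRIENT_LOW"],
            "IME2": lambda s: s["IME1"] and not s["SOK2"],
            "NDT80": lambda s: s["IME2"],
            "SOK2": lambda s: not s["NUTRIENT_LOW"],
            "NUTRIENT_LOW": lambda s: s["NUTRIENT_LOW"],      # external input, constant
        }

        def simulate(deleted=(), steps=10):
            state = {g: False for g in RULES}
            state["NUTRIENT_LOW"] = True                       # starvation signal on
            for _ in range(steps):                             # synchronous update
                state = {g: (False if g in deleted else rule(state))
                         for g, rule in RULES.items()}
            return state

        def sporulation_readout(state):
            """Crude phenotype: sporulation proceeds if the late regulator is ON."""
            return 1.0 if state["NDT80"] else 0.0

        for deletion in [(), ("IME1",), ("SOK2",)]:
            eff = sporulation_readout(simulate(deleted=deletion))
            print(deletion or ("wild type",), "->", eff)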

  3. De novo giant A2 aneurysm following anterior communicating artery occlusion.

    Science.gov (United States)

    Ibrahim, Tarik F; Hafez, Ahmad; Andrade-Barazarte, Hugo; Raj, Rahul; Niemela, Mika; Lehto, Hanna; Numminen, Jussi; Jarvelainen, Juha; Hernesniemi, Juha

    2015-01-01

    De novo intracranial aneurysms are reported to occur with varying incidence after intracranial aneurysm treatment. They are purported to be observed, however, with increased incidence after Hunterian ligation; particularly in cases of carotid artery occlusion for giant or complex aneurysms deemed unclippable. We report a case of right-sided de novo giant A2 aneurysm 6 years after an anterior communicating artery (ACoA) aneurysm clipping. We believe this de novo aneurysm developed in part due to patient-specific risk factors but also a significant change in cerebral hemodynamics. The ACoA became occluded after surgery that likely altered the cerebral hemodynamics and contributed to the de novo aneurysm. We believe this to be the first reported case of a giant de novo aneurysm in this location. Following parent vessel occlusion (mostly of the carotid artery), there are no reports of any de novo aneurysms in the pericallosal arteries let alone a giant one. The patient had a dominant right A1 and the sudden increase in A2 blood flow likely resulted in increased wall shear stress, particularly in the medial wall of the A2 where the aneurysm occurred 2 mm distal to the A1-2 junction. ACoA preservation is a key element of aneurysm surgery in this location. Suspected occlusion of this vessel may warrant closer radiographic follow-up in patients with other risk factors for aneurysm development.

  4. Prediction of the Thermal Conductivity of Refrigerants by Computational Methods and Artificial Neural Network.

    Science.gov (United States)

    Ghaderi, Forouzan; Ghaderi, Amir H; Ghaderi, Noushin; Najafi, Bijan

    2017-01-01

    Background: The thermal conductivity of fluids can be calculated by several computational methods. However, these methods are reliable only at the confined levels of density, and there is no specific computational method for calculating thermal conductivity in the wide ranges of density. Methods: In this paper, two methods, an Artificial Neural Network (ANN) approach and a computational method established upon the Rainwater-Friend theory, were used to predict the value of thermal conductivity in all ranges of density. The thermal conductivity of six refrigerants, R12, R14, R32, R115, R143, and R152 was predicted by these methods and the effectiveness of models was specified and compared. Results: The results show that the computational method is a usable method for predicting thermal conductivity at low levels of density. However, the efficiency of this model is considerably reduced in the mid-range of density. It means that this model cannot be used at density levels which are higher than 6. On the other hand, the ANN approach is a reliable method for thermal conductivity prediction in all ranges of density. The best accuracy of ANN is achieved when the number of units is increased in the hidden layer. Conclusion: The results of the computational method indicate that the regular dependence between thermal conductivity and density at higher densities is eliminated. It can develop a nonlinear problem. Therefore, analytical approaches are not able to predict thermal conductivity in wide ranges of density. Instead, a nonlinear approach such as, ANN is a valuable method for this purpose.

  5. Reservoir computer predictions for the Three Meter magnetic field time evolution

    Science.gov (United States)

    Perevalov, A.; Rojas, R.; Lathrop, D. P.; Shani, I.; Hunt, B. R.

    2017-12-01

    The source of the Earth's magnetic field is the turbulent flow of liquid metal in the outer core. Our experiment's goal is to create an Earth-like dynamo, to explore the mechanisms and to understand the dynamics of the magnetic and velocity fields. Since it is a complicated system, prediction of the magnetic field is a challenging problem. We present results of mimicking the Three Meter experiment by a reservoir computer deep learning algorithm. The experiment is a three-meter diameter outer sphere and a one-meter diameter inner sphere with the gap filled with liquid sodium. The spheres can rotate up to 4 and 14 Hz respectively, giving a Reynolds number near 10^8. Two external electromagnets apply magnetic fields, while an array of 31 external and 2 internal Hall sensors measure the resulting induced fields. We use this magnetic probe data to train a reservoir computer to predict the 3M time evolution and mimic waves in the experiment. Surprisingly accurate predictions can be made for several magnetic dipole time scales. This shows that such a complicated MHD system's behavior can be predicted. We gratefully acknowledge support from NSF EAR-1417148.

  6. Integrated Computational Solution for Predicting Skin Sensitization Potential of Molecules.

    Directory of Open Access Journals (Sweden)

    Konda Leela Sarath Kumar

    Full Text Available Skin sensitization forms a major toxicological endpoint for dermatology and cosmetic products. The recent ban on animal testing for cosmetics demands alternative methods. We developed an integrated computational solution (SkinSense) that offers a robust solution and addresses the limitations of existing computational tools, i.e. high false positive rates and/or limited coverage. The key components of our solution include: QSAR models selected from a combinatorial set, similarity information and literature-derived sub-structure patterns of known skin protein reactive groups. Its prediction performance on a challenge set of molecules showed accuracy = 75.32%, CCR = 74.36%, sensitivity = 70.00% and specificity = 78.72%, which is better than several existing tools including VEGA (accuracy = 45.00% and CCR = 54.17% with 'High' reliability scoring), DEREK (accuracy = 72.73% and CCR = 71.44%) and TOPKAT (accuracy = 60.00% and CCR = 61.67%). Although TIMES-SS showed higher predictive power (accuracy = 90.00% and CCR = 92.86%), the coverage was very low (only 10 out of 77 molecules were predicted reliably). Owing to improved prediction performance and coverage, our solution can serve as a useful expert system towards Integrated Approaches to Testing and Assessment for skin sensitization. It would be invaluable to the cosmetic/dermatology industry for pre-screening their molecules, and reducing time, cost and animal testing.

  7. Direct Visualization of De novo Lipogenesis in Single Living Cells

    Science.gov (United States)

    Li, Junjie; Cheng, Ji-Xin

    2014-10-01

    Increased de novo lipogenesis is being increasingly recognized as a hallmark of cancer. Despite recent advances in fluorescence microscopy, autoradiography and mass spectrometry, direct observation of de novo lipogenesis in living systems remains to be challenging. Here, by coupling stimulated Raman scattering (SRS) microscopy with isotope labeled glucose, we were able to trace the dynamic metabolism of glucose in single living cells with high spatial-temporal resolution. As the first direct visualization, we observed that glucose was largely utilized for lipid synthesis in pancreatic cancer cells, which occurs at a much lower rate in immortalized normal pancreatic epithelial cells. By inhibition of glycolysis and fatty acid synthase (FAS), the key enzyme for fatty acid synthesis, we confirmed the deuterium labeled lipids in cancer cells were from de novo lipid synthesis. Interestingly, we also found that prostate cancer cells exhibit relatively lower level of de novo lipogenesis, but higher fatty acid uptake compared to pancreatic cancer cells. Together, our results demonstrate a valuable tool to study dynamic lipid metabolism in cancer and other disorders.

  8. De Novo Ultrascale Atomistic Simulations On High-End Parallel Supercomputers

    Energy Technology Data Exchange (ETDEWEB)

    Nakano, A; Kalia, R K; Nomura, K; Sharma, A; Vashishta, P; Shimojo, F; van Duin, A; Goddard, III, W A; Biswas, R; Srivastava, D; Yang, L H

    2006-09-04

    We present a de novo hierarchical simulation framework for first-principles based predictive simulations of materials and their validation on high-end parallel supercomputers and geographically distributed clusters. In this framework, high-end chemically reactive and non-reactive molecular dynamics (MD) simulations explore a wide solution space to discover microscopic mechanisms that govern macroscopic material properties, into which highly accurate quantum mechanical (QM) simulations are embedded to validate the discovered mechanisms and quantify the uncertainty of the solution. The framework includes an embedded divide-and-conquer (EDC) algorithmic framework for the design of linear-scaling simulation algorithms with minimal bandwidth complexity and tight error control. The EDC framework also enables adaptive hierarchical simulation with automated model transitioning assisted by graph-based event tracking. A tunable hierarchical cellular decomposition parallelization framework then maps the O(N) EDC algorithms onto Petaflops computers, while achieving performance tunability through a hierarchy of parameterized cell data/computation structures, as well as its implementation using hybrid Grid remote procedure call + message passing + threads programming. High-end computing platforms such as IBM BlueGene/L, SGI Altix 3000 and the NSF TeraGrid provide an excellent test grounds for the framework. On these platforms, we have achieved unprecedented scales of quantum-mechanically accurate and well validated, chemically reactive atomistic simulations--1.06 billion-atom fast reactive force-field MD and 11.8 million-atom (1.04 trillion grid points) quantum-mechanical MD in the framework of the EDC density functional theory on adaptive multigrids--in addition to 134 billion-atom non-reactive space-time multiresolution MD, with the parallel efficiency as high as 0.998 on 65,536 dual-processor BlueGene/L nodes. We have also achieved an automated execution of hierarchical QM

  9. Response monitoring in de novo patients with Parkinson's disease.

    Directory of Open Access Journals (Sweden)

    Rita Willemssen

    Full Text Available BACKGROUND: Parkinson's disease (PD) is accompanied by dysfunctions in a variety of cognitive processes. One of these is error processing, which depends upon phasic decreases of medial prefrontal dopaminergic activity. Until now, no study has evaluated these processes in newly diagnosed, untreated patients with PD ("de novo PD"). METHODOLOGY/PRINCIPAL FINDINGS: Here we report large changes in performance monitoring processes using event-related potentials (ERPs) in de novo PD patients. The results suggest that increases in medial frontal dopaminergic activity after an error (Ne) are decreased, relative to age-matched controls. In contrast, neurophysiological processes reflecting general motor response monitoring (Nc) are enhanced in de novo patients. CONCLUSIONS/SIGNIFICANCE: It may be hypothesized that the Nc increase comes at the cost of dopaminergic activity after an error; on a functional level errors may not always be detected and correct responses may sometimes be misinterpreted as errors. This pattern differs from studies examining patients with a longer history of PD and may reflect compensatory processes, frequently occurring in pre-manifest stages of PD. From a clinical point of view, the clearly attenuated Ne in the de novo PD patients may prove a useful additional tool for the early diagnosis of basal ganglia dysfunction in PD.

  10. Improving Computational Efficiency of Prediction in Model-Based Prognostics Using the Unscented Transform

    Science.gov (United States)

    Daigle, Matthew John; Goebel, Kai Frank

    2010-01-01

    Model-based prognostics captures system knowledge in the form of physics-based models of components, and how they fail, in order to obtain accurate predictions of end of life (EOL). EOL is predicted based on the estimated current state distribution of a component and expected profiles of future usage. In general, this requires simulations of the component using the underlying models. In this paper, we develop a simulation-based prediction methodology that achieves computational efficiency by performing only the minimal number of simulations needed in order to accurately approximate the mean and variance of the complete EOL distribution. This is performed through the use of the unscented transform, which predicts the means and covariances of a distribution passed through a nonlinear transformation. In this case, the EOL simulation acts as that nonlinear transformation. In this paper, we review the unscented transform, and describe how this concept is applied to efficient EOL prediction. As a case study, we develop a physics-based model of a solenoid valve, and perform simulation experiments to demonstrate improved computational efficiency without sacrificing prediction accuracy.
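
    The unscented transform itself is compact enough to sketch with numpy: sigma points are drawn from the current state distribution, passed through a nonlinear function (standing in here for the EOL simulation), and their weighted statistics approximate the transformed mean and covariance. The scaling parameters and the toy damage-to-EOL nonlinearity below are illustrative, not the valve model from the paper.

        import numpy as np

        def unscented_transform(mean, cov, f, alpha=1.0, beta=2.0, kappa=0.0):
            """Propagate (mean, cov) through nonlinear f using 2n+1 sigma points."""
            n = mean.size
            lam = alpha ** 2 * (n + kappa) - n
            sqrt_cov = np.linalg.cholesky((n + lam) * cov)
            sigma = np.vstack([mean, mean + sqrt_cov.T, mean - sqrt_cov.T])  # (2n+1, n)

            w_m = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
            w_c = w_m.copy()
            w_m[0] = lam / (n + lam)
            w_c[0] = lam / (n + lam) + (1 - alpha ** 2 + beta)

            y = np.array([f(p) for p in sigma])
            y_mean = w_m @ y
            diff = y - y_mean
            y_cov = (w_c[:, None] * diff).T @ diff
            return y_mean, y_cov

        # Toy "EOL simulation": remaining life shrinks nonlinearly with damage state.
        f = lambda x: np.array([100.0 * np.exp(-3.0 * x[0]) + x[1]])

        mean = np.array([0.4, 2.0])            # estimated damage level and bias
        cov = np.diag([0.01, 0.25])            # current state uncertainty
        eol_mean, eol_cov = unscented_transform(mean, cov, f)
        print("predicted EOL mean:", eol_mean, "variance:", eol_cov)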

  11. A community computational challenge to predict the activity of pairs of compounds.

    Science.gov (United States)

    Bansal, Mukesh; Yang, Jichen; Karan, Charles; Menden, Michael P; Costello, James C; Tang, Hao; Xiao, Guanghua; Li, Yajuan; Allen, Jeffrey; Zhong, Rui; Chen, Beibei; Kim, Minsoo; Wang, Tao; Heiser, Laura M; Realubit, Ronald; Mattioli, Michela; Alvarez, Mariano J; Shen, Yao; Gallahan, Daniel; Singer, Dinah; Saez-Rodriguez, Julio; Xie, Yang; Stolovitzky, Gustavo; Califano, Andrea

    2014-12-01

    Recent therapeutic successes have renewed interest in drug combinations, but experimental screening approaches are costly and often identify only small numbers of synergistic combinations. The DREAM consortium launched an open challenge to foster the development of in silico methods to computationally rank 91 compound pairs, from the most synergistic to the most antagonistic, based on gene-expression profiles of human B cells treated with individual compounds at multiple time points and concentrations. Using scoring metrics based on experimental dose-response curves, we assessed 32 methods (31 community-generated approaches and SynGen), four of which performed significantly better than random guessing. We highlight similarities between the methods. Although the accuracy of predictions was not optimal, we find that computational prediction of compound-pair activity is possible, and that community challenges can be useful to advance the field of in silico compound-synergy prediction.

  12. Computational Efficient Upscaling Methodology for Predicting Thermal Conductivity of Nuclear Waste forms

    International Nuclear Information System (INIS)

    Li, Dongsheng; Sun, Xin; Khaleel, Mohammad A.

    2011-01-01

    This study evaluated different upscaling methods to predict the thermal conductivity of loaded nuclear waste form, a heterogeneous material system, and compared the efficiency and accuracy of these methods. The thermal conductivity of loaded nuclear waste form is an important property for the waste form Integrated Performance and Safety Code (IPSC). The effective thermal conductivity obtained from microstructure information and the local thermal conductivity of the different components is critical in predicting the life and performance of waste form during storage, because how the heat generated during storage is dissipated is directly related to thermal conductivity, which in turn determines the mechanical deformation behavior, corrosion resistance and aging performance. Several methods, including the Taylor model, Sachs model, self-consistent model, and statistical upscaling models, were developed and implemented. In the absence of experimental data, prediction results from the finite element method (FEM) were used as a reference to determine the accuracy of the different upscaling models. Micrographs from different loadings of nuclear waste were used in the prediction of thermal conductivity. The prediction results demonstrated that, in terms of efficiency, the boundary models (Taylor and Sachs models) are better than the self-consistent model, the statistical upscaling method and FEM. Balancing computational resources and accuracy, statistical upscaling is a computationally efficient method for predicting the effective thermal conductivity of nuclear waste forms.
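
    For readers unfamiliar with the Taylor- and Sachs-type boundary models mentioned above, the minimal sketch below shows the analogous arithmetic (parallel) and harmonic (series) mixing rules for the effective thermal conductivity of a two-phase composite; the phase conductivities and volume fractions are assumed illustrative values, not data from the study.

```python
# Bound-type mixing rules for effective thermal conductivity of a two-phase waste form.
# Phase conductivities and volume fractions below are assumed, purely illustrative values.

def parallel_bound(fractions, conductivities):
    """Arithmetic (Taylor/Voigt-like) rule: phases conduct heat in parallel (upper estimate)."""
    return sum(f * k for f, k in zip(fractions, conductivities))

def series_bound(fractions, conductivities):
    """Harmonic (Sachs/Reuss-like) rule: phases conduct heat in series (lower estimate)."""
    return 1.0 / sum(f / k for f, k in zip(fractions, conductivities))

fractions = [0.7, 0.3]          # glass matrix, crystalline waste loading (assumed)
conductivities = [1.1, 4.0]     # W/(m.K), assumed phase values

print("upper bound:", parallel_bound(fractions, conductivities))
print("lower bound:", series_bound(fractions, conductivities))
```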

  13. Computational prediction of drug-drug interactions based on drugs functional similarities.

    Science.gov (United States)

    Ferdousi, Reza; Safdari, Reza; Omidi, Yadollah

    2017-06-01

    Therapeutic activities of drugs are often influenced by co-administration of drugs that may cause inevitable drug-drug interactions (DDIs) and inadvertent side effects. Prediction and identification of DDIs are extremely vital for patient safety and the success of treatment modalities. A number of computational methods have been employed for the prediction of DDIs based on drug structures and/or functions. Here, we report on a computational method for DDI prediction based on the functional similarity of drugs. The model was built on key biological elements including carriers, transporters, enzymes and targets (CTET). The model was applied to 2189 approved drugs. For each drug, all the associated CTETs were collected, and the corresponding binary vectors were constructed to determine the DDIs. Various similarity measures were evaluated to detect DDIs. Of the examined similarity methods, the inner product-based similarity measures (IPSMs) were found to provide improved prediction values. Altogether, 2,394,766 potential drug-pair interactions were studied. The model was able to predict over 250,000 unknown potential DDIs. Based on our findings, we propose the current method as a robust, yet simple and fast, universal in silico approach for the identification of DDIs. We envision that this proposed method can be used as a practical technique for the detection of possible DDIs based on the functional similarities of drugs. Copyright © 2017. Published by Elsevier Inc.
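
    The sketch below illustrates, under stated assumptions, the general idea of representing each drug as a binary CTET vector and scoring drug pairs with an inner-product-based similarity; the feature names, drug names and annotations are invented, and the code is not the authors' implementation.

```python
import numpy as np

# Hypothetical CTET feature space and drug annotations (all names are invented).
ctet_features = ["CYP3A4", "CYP2D6", "P-gp", "OATP1B1", "target_A", "target_B"]
drug_profiles = {
    "drugX": {"CYP3A4", "P-gp", "target_A"},
    "drugY": {"CYP3A4", "OATP1B1", "target_A"},
    "drugZ": {"CYP2D6", "target_B"},
}

def to_vector(elements):
    """Encode a drug's carriers/transporters/enzymes/targets as a binary vector."""
    return np.array([1 if f in elements else 0 for f in ctet_features])

def cosine_similarity(a, b):
    """Normalized inner product; higher values flag candidate interacting pairs."""
    return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

vectors = {name: to_vector(profile) for name, profile in drug_profiles.items()}
names = sorted(vectors)
for i, d1 in enumerate(names):
    for d2 in names[i + 1:]:
        print(d1, d2, round(cosine_similarity(vectors[d1], vectors[d2]), 3))
```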

  14. De novo mutations in synaptic transmission genes including DNM1 cause epileptic encephalopathies

    DEFF Research Database (Denmark)

    2014-01-01

    We analyzed exome-sequencing data of 356 trios with the "classical" epileptic encephalopathies, infantile spasms and Lennox Gastaut syndrome, including 264 trios previously analyzed by the Epi4K/EPGP consortium. In this expanded cohort, we find 429 de novo mutations, including de novo mutations in DNM1 in five individuals and de novo mutations in GABBR2, FASN, and RYR3 in two individuals each. Unlike previous studies, this cohort is sufficiently large to show a significant excess of de novo mutations in epileptic encephalopathy probands compared to the general population using a likelihood analysis (p = 8.2 × 10(-4)), supporting a prominent role for de novo mutations in epileptic encephalopathies. We bring statistical evidence that mutations in DNM1 cause epileptic encephalopathy, find suggestive evidence for a role of three additional genes, and show that at least 12% of analyzed individuals have an identifiable causal de novo mutation.

  15. Proposal of computation chart for general use for diffusion prediction of discharged warm water

    International Nuclear Information System (INIS)

    Wada, Akira; Kadoyu, Masatake

    1976-01-01

    The authors have developed a unique simulation analysis method using numerical models for the prediction of discharged warm water diffusion. At the present stage, the method is adopted for precise analysis computations to predict the diffusion of discharged warm water at each survey point; besides this method, however, it is strongly requested that some simple and easy prediction methods be established. To meet this demand, this report presents a computation chart for general use for simply predicting the diffusion range of discharged warm water, obtained by classifying the semi-infinite sea region into several flow patterns according to the sea conditions and conducting a systematic simulation analysis with a numerical model for each pattern. (1) Establishment of the computation conditions: from the many sea regions surrounding Japan, a representative sea region was selected as the area to be investigated, namely a semi-infinite region facing the outer sea along a rectilinear coastline, and from the viewpoint of the flow and diffusion characteristics the sea region was classified into three patterns. A total of 51 cases with various parameters were set up, and the simulation analysis was performed for each. (2) Drawing up the general-use chart: 28 sheets of the computation chart for general use were drawn, which are available for computing the approximate temperature rise caused by discharged warm water diffusion. The example of the Anegasaki Thermal Power Station is given. (Kako, I.)

  16. Computational thermofracture mechanics and life prediction

    International Nuclear Information System (INIS)

    Hsu Tairan

    1992-01-01

    This paper will present computational techniques used for the prediction of the thermofracture behaviour of structures subject to either monotonic or cyclic combined thermal and mechanical loadings. Two specific areas will be dealt with in the paper. (1) Time-invariant thermofracture of leaking pipelines with non-uniform temperature fields: in this case, the induced non-uniform temperature fields near leaking cracks have been shown to be significant. The severity of these temperature fields for the thermofracture behaviour of the pipeline will be demonstrated by a numerical example. (2) Thermomechanical creep fracture of structures: recent developments, including those of the author's own work, on cyclic creep fracture using damage theory will be presented. Long 'hold' and 'dwell' times, which occur in the actual operation of nuclear power plant components, have been shown to have a significant effect on the overall creep-fracture behaviour of the material. Constitutive laws, which include most of these effects, have been incorporated into the existing TEPSAC code for the prediction of crack growth in solids under cyclic creep loadings. The effectiveness of using the damage parameters as fracture criteria, and the influence of plastic deformation on the overall results, will be assessed. (orig.)

  17. De novo transcriptome assembly of shrimp Palaemon serratus

    Directory of Open Access Journals (Sweden)

    Alejandra Perina

    2017-03-01

    Full Text Available The shrimp Palaemon serratus is a coastal decapod crustacean with a high commercial value. It is harvested for human consumption. In this study, we used Illumina sequencing technology (HiSeq 2000) to sequence, assemble and annotate the transcriptome of P. serratus. RNA was isolated from muscle of adult individuals and from a pool of larvae. A total of four cDNA libraries were constructed using the TruSeq RNA Sample Preparation Kit v2. The raw data in this study were deposited in the NCBI SRA database under study accession number SRP090769. The obtained data were subjected to de novo transcriptome assembly using the Trinity software, and coding regions were predicted by TransDecoder. We used Blastp and Sma3s to annotate the identified proteins. The transcriptome data could provide some insight into the genes involved in larval development and metamorphosis.
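
    The assembly-and-annotation workflow described here (Trinity, then TransDecoder, then protein-level annotation) could be scripted roughly as below, assuming the Trinity, TransDecoder and BLAST+ executables are installed and a Swiss-Prot BLAST database has already been built; file names, thread counts and memory limits are placeholders rather than the authors' settings.

```python
import subprocess

def run(cmd):
    """Echo and run an external command, stopping on failure."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# 1) De novo assembly of paired-end RNA-Seq reads with Trinity (placeholder file names).
run(["Trinity", "--seqType", "fq",
     "--left", "reads_1.fq.gz", "--right", "reads_2.fq.gz",
     "--CPU", "8", "--max_memory", "20G", "--output", "trinity_out"])

# 2) Predict likely coding regions in the assembled transcripts with TransDecoder.
run(["TransDecoder.LongOrfs", "-t", "trinity_out/Trinity.fasta"])
run(["TransDecoder.Predict", "-t", "trinity_out/Trinity.fasta"])

# 3) Annotate the predicted proteins against a pre-built Swiss-Prot BLAST database.
run(["blastp", "-query", "Trinity.fasta.transdecoder.pep",
     "-db", "swissprot", "-evalue", "1e-5",
     "-outfmt", "6", "-out", "blastp_swissprot.tsv"])
```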

  18. De Novo Assembly and Characterization of the Transcriptome of Grasshopper Shirakiacris shirakii

    Directory of Open Access Journals (Sweden)

    Zhongying Qiu

    2016-07-01

    Full Text Available Background: The grasshopper Shirakiacris shirakii is an important agricultural pest and feeds mainly on gramineous plants, thereby causing economic damage to a wide range of crops. However, genomic information on this species is extremely limited thus far, and transcriptome data relevant to insecticide resistance and pest control are also not available. Methods: The transcriptome of S. shirakii was sequenced using the Illumina HiSeq platform, and we de novo assembled the transcriptome. Results: Sequencing produced a total of 105,408,878 clean reads, and the de novo assembly revealed 74,657 unigenes with an average length of 680 bp and N50 of 1057 bp. A total of 28,173 unigenes were annotated against the NCBI non-redundant protein sequences (Nr), NCBI non-redundant nucleotide sequences (Nt), a manually annotated and reviewed protein sequence database (Swiss-Prot), Gene Ontology (GO) and Kyoto Encyclopedia of Genes and Genomes (KEGG) databases. Based on the Nr annotation results, we manually identified 79 unigenes encoding cytochrome P450 monooxygenases (P450s), 36 unigenes encoding carboxylesterases (CarEs) and 36 unigenes encoding glutathione S-transferases (GSTs) in S. shirakii. Core RNAi components relevant to the microRNA, siRNA and piRNA pathways, including Pasha, Loquacious, Argonaute-1, Argonaute-2, Argonaute-3, Zucchini, Aubergine, enhanced RNAi-1 and Piwi, were expressed in S. shirakii. We also identified five unigenes that were homologous to the Sid-1 gene. In addition, the analysis of differential gene expression revealed that a total of 19,764 unigenes were up-regulated and 4185 unigenes were down-regulated in larvae. In total, we predicted 7504 simple sequence repeats (SSRs) from the 74,657 unigenes. Conclusions: The comprehensive de novo transcriptome data of S. shirakii will offer a series of valuable molecular resources for better studying insecticide resistance, RNAi and molecular marker discovery.

  19. Defining the maize transcriptome de novo using deep RNA-Seq

    Energy Technology Data Exchange (ETDEWEB)

    Martin, Jeffrey; Gross, Stephen; Choi, Cindy; Zhang, Tao; Lindquist, Erika; Wei, Chia-Lin; Wang, Zhong

    2011-06-01

    De novo assembly of the transcriptome is crucial for functional genomics studies in bioenergy research, since many of the organisms lack high quality reference genomes. In a previous study we successfully de novo assembled simple eukaryote transcriptomes exclusively from short Illumina RNA-Seq reads [1]. However, extensive alternative splicing, present in most of the higher eukaryotes, poses a significant challenge for current short read assembly processes. Furthermore, the size of next-generation datasets, often large for plant genomes, presents an informatics challenge. To tackle these challenges we present a combined experimental and informatics strategy for de novo assembly in higher eukaryotes. Using maize as a test case, preliminary results suggest our approach can resolve transcript variants and improve gene annotations.

  1. Novel computational methods to predict drug–target interactions using graph mining and machine learning approaches

    KAUST Repository

    Olayan, Rawan S.

    2017-12-01

    Computational drug repurposing aims at finding new medical uses for existing drugs. The identification of novel drug-target interactions (DTIs) can be a useful part of such a task. Computational determination of DTIs is a convenient strategy for systematic screening of a large number of drugs in the attempt to identify new DTIs at low cost and with reasonable accuracy. This necessitates the development of accurate computational methods that can help focus the follow-up experimental validation on a smaller number of highly likely targets for a drug. Although many methods have been proposed for computational DTI prediction, they suffer from high false positive prediction rates or they do not predict the effect that drugs exert on targets in DTIs. In this report, first, we present a comprehensive review of the recent progress in the field of DTI prediction from data-centric and algorithm-centric perspectives. The aim is to provide a comprehensive review of computational methods for identifying DTIs, which could help in constructing more reliable methods. Then, we present DDR, an efficient method to predict the existence of DTIs. DDR achieves significantly more accurate results compared to the other state-of-the-art methods. As supported by independent evidence, we verified as correct 22 out of the top 25 DDR DTI predictions. This validation proves the practical utility of DDR, suggesting that DDR can be used as an efficient method to identify correct DTIs. Finally, we present the DDR-FE method, which predicts the effect types of a drug on its target. On different representative datasets, under various test setups, and using different performance measures, we show that DDR-FE achieves extremely good performance. Using blind test data, we verified as correct 2,300 out of 3,076 DTI effects predicted by DDR-FE. This suggests that DDR-FE can be used as an efficient method to identify correct effects of a drug on its target.

  2. Prediction of monthly regional groundwater levels through hybrid soft-computing techniques

    Science.gov (United States)

    Chang, Fi-John; Chang, Li-Chiu; Huang, Chien-Wei; Kao, I.-Feng

    2016-10-01

    Groundwater systems are intrinsically heterogeneous with dynamic temporal-spatial patterns, which cause great difficulty in quantifying their complex processes, while reliable predictions of regional groundwater levels are commonly needed for managing water resources to ensure proper service of water demands within a region. In this study, we proposed a novel and flexible soft-computing technique that could effectively extract the complex high-dimensional input-output patterns of basin-wide groundwater-aquifer systems in an adaptive manner. The soft-computing models combined the Self-Organizing Map (SOM) and the Nonlinear Autoregressive with Exogenous Inputs (NARX) network for predicting monthly regional groundwater levels based on hydrologic forcing data. The SOM could effectively classify the temporal-spatial patterns of regional groundwater levels, the NARX could accurately predict the mean of regional groundwater levels for adjusting the selected SOM, and Kriging was then used to interpolate the predictions of the adjusted SOM onto finer grids of locations, so that the prediction of a monthly regional groundwater level map could be obtained. The Zhuoshui River basin in Taiwan was the study case, and its monthly data sets collected from 203 groundwater stations, 32 rainfall stations and 6 flow stations between 2000 and 2013 were used for modelling purposes. The results demonstrated that the hybrid SOM-NARX model could reliably and suitably predict monthly basin-wide groundwater levels with high correlations (R2 > 0.9 in both training and testing cases). The proposed methodology presents a milestone in modelling regional environmental issues and offers an insightful and promising way to predict monthly basin-wide groundwater levels, which is beneficial to authorities for sustainable water resources management.
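
    A very loose sketch of the SOM-plus-NARX idea is given below, assuming the third-party minisom package for the SOM and using scikit-learn's MLPRegressor on lagged inputs as a stand-in for a true NARX network; the synthetic data and all settings are illustrative, not the authors' model of the Zhuoshui River basin.

```python
import numpy as np
from minisom import MiniSom                      # third-party SOM implementation (assumed available)
from sklearn.neural_network import MLPRegressor  # stand-in for the NARX network

rng = np.random.default_rng(0)
months, stations = 168, 20
levels = np.cumsum(rng.normal(size=(months, stations)), axis=0)   # synthetic station levels
rain = rng.gamma(2.0, 50.0, size=months)                          # synthetic monthly rainfall

# SOM: classify the temporal-spatial patterns of monthly groundwater levels.
som = MiniSom(3, 3, stations, sigma=1.0, learning_rate=0.5, random_seed=0)
som.train_random(levels, 1000)

# NARX-like step: regress the regional mean level on its own lags plus rainfall.
mean_level = levels.mean(axis=1)
X = np.column_stack([mean_level[1:-1], mean_level[:-2], rain[2:]])
y = mean_level[2:]
narx_like = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0).fit(X, y)

next_features = np.array([[mean_level[-1], mean_level[-2], rain[-1]]])  # assume rainfall persists
print("pattern cluster of last month:", som.winner(levels[-1]))
print("one-step-ahead regional mean level:", narx_like.predict(next_features))
```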

  3. Novos paradigmas literários

    Directory of Open Access Journals (Sweden)

    Denise Azevedo Duarte Guimarães

    2005-12-01

    Full Text Available This article investigates the emergence of new literary paradigms as it tries to understand new contemporary textualities. It analyses some hypertexts and multimedia poetry trying to trace how new expressive procedures are being created. How can these new languages be identified and what are their relations to previous theories which dealt with the literary printed text? This study approaches questions linked to the reading of different types of signs and the modes they function towards the fabrication of these new hybrid languages.

  4. De novo transcriptome assembly of two Vigna angularis varieties collected from Korea

    Directory of Open Access Journals (Sweden)

    Yeonhwa Jo

    2016-06-01

    Full Text Available The adzuki bean (Vigna angularis), a member of the family Fabaceae, is widely grown in Asia, from East Asia to the Himalayas. The adzuki bean is known as an ingredient that adds sweetness to diverse desserts made in East Asian countries. Libraries prepared from two V. angularis varieties, referred to as Taejin Black and Taejin Red, were paired-end sequenced using the Illumina HiSeq 2000 system. The raw data in this study are available in the NCBI SRA database with accession numbers SRR3406660 and SRR3406553. After de novo transcriptome assembly using Trinity, we obtained 324,219 and 280,056 transcripts from Taejin Black and Taejin Red, respectively. We predicted a total of 238,321 proteins and 179,519 proteins for Taejin Black and Taejin Red, respectively, using the TransDecoder program. We carried out BLASTP on the predicted proteins against the Swiss-Prot protein sequence database to predict the putative functions of the identified proteins. Taken together, we provide transcriptomes of two adzuki bean varieties by RNA-Seq, which might be usefully applied to generate molecular markers.

  5. A Public Trial De Novo

    DEFF Research Database (Denmark)

    Vedel, Jane Bjørn; Gad, Christopher

    2011-01-01

    This article addresses the concept of "industrial interests" and examines its role in a topical controversy about a large research grant from a private foundation, the Novo Nordisk Foundation, to the University of Copenhagen. The authors suggest that the debate took the form of a "public trial" where the grant and close(r) intermingling between industry and public research was prosecuted and defended. First, the authors address how the grant was framed in the media. Second, they redescribe the case by introducing new "evidence" that, because of this framing, did not reach "the court." The article ends with a discussion of some implications of the analysis, including that policy making, academic research, and public debates might benefit from more detailed accounts of interests and stakes.

  6. Use of computer-assisted prediction of toxic effects of chemical substances

    International Nuclear Information System (INIS)

    Simon-Hettich, Brigitte; Rothfuss, Andreas; Steger-Hartmann, Thomas

    2006-01-01

    The current revision of the European policy for the evaluation of chemicals (REACH) has led to a controversy with regard to the need for additional animal safety testing. To avoid increases in animal testing but also to save time and resources, alternative in silico or in vitro tests for the assessment of toxic effects of chemicals are advocated. The draft of the original document, issued on 29 October 2003 by the European Commission, foresees the use of alternative methods but does not give further specification on which methods should be used. Computer-assisted prediction models, so-called predictive tools, besides in vitro models, will likely play an essential role in the proposed repertoire of 'alternative methods'. The current discussion has urged the Advisory Committee of the German Toxicology Society to present its position on the use of predictive tools in toxicology. Acceptable prediction models already exist for those toxicological endpoints which are based on well-understood mechanisms, such as mutagenicity and skin sensitization, whereas mechanistically more complex endpoints such as acute, chronic or organ toxicities currently cannot be satisfactorily predicted. A potential strategy to assess such complex toxicities will lie in their dissection into models for the different steps or pathways leading to the final endpoint. Integration of these models should result in a higher predictivity. Despite these limitations, computer-assisted prediction tools already play a complementary role today in the assessment of chemicals for which no data are available or for which toxicological testing is impractical due to the lack of availability of sufficient compounds for testing. Furthermore, predictive tools offer support in the screening and subsequent prioritization of compounds for further toxicological testing, as expected within the scope of the European REACH program. This program will also lead to the collection of high-quality data, which will broaden the basis for the further development of such prediction models.

  7. Anterior abdominal wall leiomyoma arising de novo in a fertile women: A case report

    International Nuclear Information System (INIS)

    Cho, Je Young; Woo, Ji Young; Hong, Hye Suk; Yang, Ik; Lee, Yul; Hwang, Ji Young; Kim, Han Myun; Shin, Mi Kyung

    2016-01-01

    Abdominal wall leiomyoma arising de novo is very rare, hence the reported imaging findings of this disease are also rare. We reported the case of a 33-year-old woman who presented with an abdominal wall mass without antecedent gynecological surgeries. The initial abdominal computed tomography (CT) showed thickening of the left rectus abdominis and the loss of intervening fat between the rectus abdominis and the lateral abdominal muscles. After 8 months, the follow-up contrast-enhanced CT and ultrasonography (US) showed a lentiform-shaped mass with isodensity to the adjacent muscles. The US-guided biopsy was consistent with leiomyoma

  8. Anterior abdominal wall leiomyoma arising de novo in a fertile women: A case report

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Je Young; Woo, Ji Young; Hong, Hye Suk; Yang, Ik; Lee, Yul; Hwang, Ji Young; Kim, Han Myun; Shin, Mi Kyung [Hallym University College of Medicine, Kangnam Sacred Heart Hospital, Seoul (Korea, Republic of)

    2016-01-15

    Abdominal wall leiomyoma arising de novo is very rare, hence the reported imaging findings of this disease are also rare. We reported the case of a 33-year-old woman who presented with an abdominal wall mass without antecedent gynecological surgeries. The initial abdominal computed tomography (CT) showed thickening of the left rectus abdominis and the loss of intervening fat between the rectus abdominis and the lateral abdominal muscles. After 8 months, the follow-up contrast-enhanced CT and ultrasonography (US) showed a lentiform-shaped mass with isodensity to the adjacent muscles. The US-guided biopsy was consistent with leiomyoma.

  9. The genome of flax (Linum usitatissimum) assembled de novo from short shotgun sequence reads.

    Science.gov (United States)

    Wang, Zhiwen; Hobson, Neil; Galindo, Leonardo; Zhu, Shilin; Shi, Daihu; McDill, Joshua; Yang, Linfeng; Hawkins, Simon; Neutelings, Godfrey; Datla, Raju; Lambert, Georgina; Galbraith, David W; Grassa, Christopher J; Geraldes, Armando; Cronk, Quentin C; Cullis, Christopher; Dash, Prasanta K; Kumar, Polumetla A; Cloutier, Sylvie; Sharpe, Andrew G; Wong, Gane K-S; Wang, Jun; Deyholos, Michael K

    2012-11-01

    Flax (Linum usitatissimum) is an ancient crop that is widely cultivated as a source of fiber, oil and medicinally relevant compounds. To accelerate crop improvement, we performed whole-genome shotgun sequencing of the nuclear genome of flax. Seven paired-end libraries ranging in size from 300 bp to 10 kb were sequenced using an Illumina genome analyzer. A de novo assembly, comprised exclusively of deep-coverage (approximately 94× raw, approximately 69× filtered) short-sequence reads (44-100 bp), produced a set of scaffolds with N50 = 694 kb, including contigs with N50 = 20.1 kb. The contig assembly contained 302 Mb of non-redundant sequence representing an estimated 81% genome coverage. Up to 96% of published flax ESTs aligned to the whole-genome shotgun scaffolds. However, comparisons with independently sequenced BACs and fosmids showed some mis-assembly of regions at the genome scale. A total of 43384 protein-coding genes were predicted in the whole-genome shotgun assembly, and up to 93% of published flax ESTs, and 86% of A. thaliana genes aligned to these predicted genes, indicating excellent coverage and accuracy at the gene level. Analysis of the synonymous substitution rates (Ks) observed within duplicate gene pairs was consistent with a recent (5-9 MYA) whole-genome duplication in flax. Within the predicted proteome, we observed enrichment of many conserved domains (Pfam-A) that may contribute to the unique properties of this crop, including agglutinin proteins. Together these results show that de novo assembly, based solely on whole-genome shotgun short-sequence reads, is an efficient means of obtaining nearly complete genome sequence information for some plant species. © 2012 The Authors. The Plant Journal © 2012 Blackwell Publishing Ltd.
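
    Since assembly contiguity above is summarized with N50 values, the small worked example below shows how the statistic is computed: N50 is the length L such that contigs of length at least L together cover at least half of the total assembly. The contig lengths are made up.

```python
def n50(lengths):
    """Return the N50 of a list of contig/scaffold lengths."""
    total = sum(lengths)
    running = 0
    for length in sorted(lengths, reverse=True):
        running += length
        if running * 2 >= total:       # reached half of the total assembly length
            return length

print(n50([100, 200, 300, 400, 500]))  # -> 400, since 500 + 400 = 900 >= 1500 / 2
```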

  10. Sensitivity-Informed De Novo Programming for Many-Objective Water Portfolio Planning Under Uncertainty

    Science.gov (United States)

    Kasprzyk, J. R.; Reed, P. M.; Kirsch, B. R.; Characklis, G. W.

    2009-12-01

    Risk-based water supply management presents severe cognitive, computational, and social challenges to planning in a changing world. Decision aiding frameworks must confront the cognitive biases implicit to risk, the severe uncertainties associated with long term planning horizons, and the consequent ambiguities that shape how we define and solve water resources planning and management problems. This paper proposes and demonstrates a new interactive framework for sensitivity informed de novo programming. The theoretical focus of our many-objective de novo programming is to promote learning and evolving problem formulations to enhance risk-based decision making. We have demonstrated our proposed de novo programming framework using a case study for a single city's water supply in the Lower Rio Grande Valley (LRGV) in Texas. Key decisions in this case study include the purchase of permanent rights to reservoir inflows and anticipatory thresholds for acquiring transfers of water through optioning and spot leases. A 10-year Monte Carlo simulation driven by historical data is used to provide performance metrics for the supply portfolios. The three major components of our methodology include Sobol global sensitivity analysis, many-objective evolutionary optimization and interactive tradeoff visualization. The interplay between these components allows us to evaluate alternative design metrics, their decision variable controls and the consequent system vulnerabilities. Our LRGV case study measures water supply portfolios' efficiency, reliability, and utilization of transfers in the water supply market. The sensitivity analysis is used interactively over interannual, annual, and monthly time scales to indicate how the problem controls change as a function of the timescale of interest. These results have then been used to improve our exploration and understanding of LRGV costs, vulnerabilities, and the water portfolios' critical reliability constraints.
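
    As an illustration of the Sobol global sensitivity analysis component mentioned above (not the LRGV case study itself), the sketch below uses the SALib package on a toy portfolio-cost function; the variable names, bounds and model are assumptions.

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

# Hypothetical decision variables for a water supply portfolio (names and bounds assumed).
problem = {
    "num_vars": 3,
    "names": ["permanent_rights", "option_threshold", "lease_threshold"],
    "bounds": [[0.0, 1.0], [0.0, 1.0], [0.0, 1.0]],
}

def portfolio_cost(x):
    """Toy stand-in for a Monte Carlo portfolio performance metric."""
    rights, opt, lease = x
    return 2.0 * rights + 0.5 * opt * lease + np.sin(3.0 * lease)

X = saltelli.sample(problem, 1024)                   # Saltelli sampling scheme
Y = np.array([portfolio_cost(x) for x in X])
Si = sobol.analyze(problem, Y)
print(dict(zip(problem["names"], np.round(Si["S1"], 3))))   # first-order indices
print(dict(zip(problem["names"], np.round(Si["ST"], 3))))   # total-order indices
```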

  11. Optimizing and benchmarking de novo transcriptome sequencing: from library preparation to assembly evaluation.

    Science.gov (United States)

    Hara, Yuichiro; Tatsumi, Kaori; Yoshida, Michio; Kajikawa, Eriko; Kiyonari, Hiroshi; Kuraku, Shigehiro

    2015-11-18

    RNA-seq enables gene expression profiling in selected spatiotemporal windows and yields massive sequence information with relatively low cost and time investment, even for non-model species. However, there remains considerable room for optimizing its workflow in order to take full advantage of continuously developing sequencing capacity. Transcriptome sequencing for three embryonic stages of the Madagascar ground gecko (Paroedura picta) was performed with the Illumina platform. The output reads were assembled de novo for reconstructing transcript sequences. In order to evaluate the completeness of transcriptome assemblies, we prepared a reference gene set consisting of vertebrate one-to-one orthologs. To take advantage of increased read lengths of >150 nt, we demonstrated shortened RNA fragmentation time, which resulted in a dramatic shift of the insert size distribution. To evaluate the products of multiple de novo assembly runs incorporating reads with different RNA sources, read lengths, and insert sizes, we introduce a new reference gene set, core vertebrate genes (CVG), consisting of 233 genes that are shared as one-to-one orthologs by all vertebrate genomes examined (29 species). The completeness assessment performed by the computational pipelines CEGMA and BUSCO referring to CVG demonstrated higher accuracy and resolution than with the gene set previously established for this purpose. As a result of the assessment with CVG, we have derived the most comprehensive transcript sequence set of the Madagascar ground gecko by means of assembling individual libraries followed by clustering the assembled sequences based on their overall similarities. Our results provide several insights into optimizing the de novo RNA-seq workflow, including the coordination between library insert size and read length, which manifested in improved connectivity of assemblies. The approach and assembly assessment with CVG demonstrated here would be applicable to transcriptome analysis of other species as well.

  12. Enzyme-like replication de novo in a microcontroller environment.

    Science.gov (United States)

    Tangen, Uwe

    2010-01-01

    The desire to start evolution from scratch inside a computer memory is as old as computing. Here we demonstrate how viable computer programs can be established de novo in a Precambrian environment without supplying any specific instantiation, just starting with random bit sequences. These programs are not self-replicators, but act much more like catalysts. The microcontrollers used in the end are the result of a long series of simplifications. The objective of this simplification process was to produce universal machines with a human-readable interface, allowing software and/or hardware evolution to be studied. The power of the instruction set can be modified by introducing a secondary structure-folding mechanism, which is a state machine, allowing nontrivial replication to emerge with an instruction width of only a few bits. This state-machine approach not only attenuates the problems of brittleness and encoding functionality (too few bits available for coding, and too many instructions needed); it also enables the study of hardware evolution as such. Furthermore, the instruction set is sufficiently powerful to permit external signals to be processed. This information-theoretic approach forms one vertex of a triangle alongside artificial cell research and experimental research on the creation of life. Hopefully this work helps develop an understanding of how information—in a similar sense to the account of functional information described by Hazen et al.—is created by evolution and how this information interacts with or is embedded in its physico-chemical environment.

  13. Vehicular traffic noise prediction using soft computing approach.

    Science.gov (United States)

    Singh, Daljeet; Nigam, S P; Agrawal, V P; Kumar, Maneek

    2016-12-01

    A new approach for the development of vehicular traffic noise prediction models is presented. Four different soft computing methods, namely, Generalized Linear Model, Decision Trees, Random Forests and Neural Networks, have been used to develop models to predict the hourly equivalent continuous sound pressure level, Leq, at different locations in the Patiala city in India. The input variables include the traffic volume per hour, percentage of heavy vehicles and average speed of vehicles. The performance of the four models is compared on the basis of performance criteria of coefficient of determination, mean square error and accuracy. 10-fold cross validation is done to check the stability of the Random Forest model, which gave the best results. A t-test is performed to check the fit of the model with the field data. Copyright © 2016 Elsevier Ltd. All rights reserved.
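
    A minimal sketch of the Random Forest variant described above is given below, using scikit-learn and a synthetic data generator in place of the Patiala measurements; the noise relation used to fabricate the data is purely illustrative.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n = 500
volume = rng.uniform(200, 3000, n)           # vehicles per hour (synthetic)
heavy = rng.uniform(0, 40, n)                # % heavy vehicles (synthetic)
speed = rng.uniform(20, 80, n)               # average speed, km/h (synthetic)

# Fabricated Leq values with a plausible logarithmic dependence on traffic volume.
leq = 40 + 10 * np.log10(volume) + 0.1 * heavy + 0.05 * speed + rng.normal(0, 1.0, n)

X = np.column_stack([volume, heavy, speed])
model = RandomForestRegressor(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, leq, cv=10, scoring="r2")   # 10-fold cross-validation
print("mean R^2 across folds:", scores.mean().round(3))
```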

  14. Recurrence risk in de novo structural chromosomal rearrangements.

    Science.gov (United States)

    Röthlisberger, Benno; Kotzot, Dieter

    2007-08-01

    According to the textbook of Gardner and Sutherland [2004], the standard on genetic counseling for chromosome abnormalities, the recurrence risk of de novo structural or combined structural and numeric chromosome rearrangements is less than 0.5-2% and takes into account recurrence by chance, gonadal mosaicism, and somatic-gonadal mosaicism. However, these figures are roughly estimated and neither any systematic study nor exact or evidence-based risk calculations are available. To address this question, an extensive literature search was performed and surprisingly only 29 case reports of recurrence of de novo structural or combined structural and numeric chromosomal rearrangements were found. Thirteen of them were with a trisomy 21 due to an i(21q) replacing one normal chromosome 21. In eight of them low-level mosaicism in one of the parents was found either in fibroblasts or in blood or in both. As a consequence of the low number of cases and theoretical considerations (clinical consequences, mechanisms of formation, etc.), the recurrence risk should be reduced to less than 1% for a de novo i(21q) and to even less than 0.3% for all other de novo structural or combined structural and numeric chromosomal rearrangements. As the latter is lower than the commonly accepted risk of approximately 0.3% for indicating an invasive prenatal diagnosis and as the risk of abortion of a healthy fetus after chorionic villous sampling or amniocentesis is higher than approximately 0.5%, invasive prenatal investigation in most cases is not indicated and should only be performed if explicitly asked by the parents subsequent to appropriate genetic counseling. (c) 2007 Wiley-Liss, Inc.

  15. CIPESC® Curitiba: o trabalho da enfermagem no Distrito Bairro Novo CIPESC® Curitiba: el trabajo de enfermería en el Districto Bairro Novo CIPESC® Curitiba: the work of nursing at Bairro Novo District

    Directory of Open Access Journals (Sweden)

    Denise Maria Altino

    2006-08-01

    The study was carried out in the health units of the Bairro Novo district of Curitiba City. We interviewed nurses and nursing auxiliaries using a checklist. The results showed that sheltering is done by everyone; the auxiliaries' activities consist of care procedures; all the nurses carry out nursing consultations, and most of them use the computer-based CIPESC® system. In conclusion, except for research activities, the nursing staff works throughout according to the municipal programme, under an organized and scientifically based system for giving nursing care, with emphasis on women's health. Activities such as planning and education have also been improved.

  16. NovoTTF™-100A System (Tumor Treating Fields) transducer array layout planning for glioblastoma: a NovoTAL™ system user study.

    Science.gov (United States)

    Chaudhry, Aafia; Benson, Laura; Varshaver, Michael; Farber, Ori; Weinberg, Uri; Kirson, Eilon; Palti, Yoram

    2015-11-11

    Optune™, previously known as the NovoTTF-100A System™, generates Tumor Treating Fields (TTFields), an effective anti-mitotic therapy for glioblastoma. The system delivers intermediate frequency, alternating electric fields to the supratentorial brain. Patient therapy is personalized by configuring transducer array layout placement on the scalp to the tumor site using MRI measurements and the NovoTAL System. Transducer array layout mapping optimizes therapy by maximizing electric field intensity to the tumor site. This study evaluated physician performance in conducting transducer array layout mapping using the NovoTAL System compared with mapping performed by the Novocure in-house clinical team. Fourteen physicians (7 neuro-oncologists, 4 medical oncologists, and 3 neurosurgeons) evaluated five blinded cases of recurrent glioblastoma and performed head size and tumor location measurements using a standard Digital Imaging and Communications in Medicine reader. Concordance with Novocure measurement and intra- and inter-rater reliability were assessed using relevant correlation coefficients. The study criterion for success was a concordance correlation coefficient (CCC) >0.80. CCC for each physician versus Novocure on 20 MRI measurements was 0.96 (standard deviation, SD ± 0.03, range 0.90-1.00), indicating very high agreement between the two groups. Intra- and inter-rater reliability correlation coefficients were similarly high: 0.83 (SD ±0.15, range 0.54-1.00) and 0.80 (SD ±0.18, range 0.48-1.00), respectively. This user study demonstrated an excellent level of concordance between prescribing physicians and Novocure in-house clinical teams in performing transducer array layout planning. Intra-rater reliability was very high, indicating reproducible performance. Physicians prescribing TTFields, when trained on the NovoTAL System, can independently perform transducer array layout mapping required for the initiation and maintenance of patients on TTFields

  17. A Terra em Transe: o cosmopolitismo às avessas do cinema novo

    Directory of Open Access Journals (Sweden)

    Angela Prysthon

    2008-11-01

    Full Text Available Using the cultural studies theoretical framework, this paper analyzes the cinema novo movement in Brazil as a part of the Third World conception of culture. Following the creation of the term "Third World" and the international politics of colonial independence of the 1950s and 1960s, a cosmopolitan attitude was seen by the intellectuals of the left as a cultural version of the alliance with the hegemonic forces of Europe and North America. Even though the cinema novo project can be associated with the ideology of a united Third World, it brings about, paradoxically, a very cosmopolitan politics of the periphery. Key words: cinema novo, identity, Brazilian culture, third world, cultural studies.

  18. Eculizumab for drug-induced de novo posttransplantation thrombotic microangiopathy: A case report.

    Science.gov (United States)

    Safa, Kassem; Logan, Merranda S; Batal, Ibrahim; Gabardi, Steven; Rennke, Helmut G; Abdi, Reza

    2015-02-01

    De novo thrombotic microangiopathy (TMA) following renal transplantation is a severe complication associated with high rates of allograft failure. Several immunosuppressive agents are associated with TMA. Conventional approaches to managing this entity, such as withdrawal of the offending agent and/or plasmapheresis, often offer limited help, with high rates of treatment failure and graft loss. We herein report a case of drug induced de novo TMA successfully treated using the C5a inhibitor eculizumab in a renal transplant patient. This report highlights a potentially important role for eculizumab in settings where drug-induced de novo TMA is refractory to conventional therapies.

  19. De novo autoimmune hepatitis after liver transplantation.

    Science.gov (United States)

    Lohse, Ansgar W; Weiler-Norman, Christina; Burdelski, Martin

    2007-10-01

    The King's College group was the first to describe a clinical syndrome similar to autoimmune hepatitis in children and young adults transplanted for non-immune-mediated liver diseases. They coined the term "de novo autoimmune hepatitis". Several other liver transplant centres confirmed this observation. Even though the condition is uncommon, patients with de novo AIH are now seen in most of the major transplant centres. The disease is usually characterized by features of acute hepatitis in otherwise stable transplant recipients. The most characteristic laboratory hallmark is a marked hypergammaglobulinaemia. Autoantibodies are common, mostly ANA. We also described a case of LKM1 positivity in a patient transplanted for Wilson's disease; however, this patient did not develop clinical or histological features of AIH. Development of SLA/LP autoantibodies has not been described either. Serologically, therefore, de novo AIH appears to correspond to type 1 AIH. Like patients with classical AIH, patients respond promptly to treatment with increased doses of prednisolone and azathioprine, while the calcineurin inhibitors cyclosporine and tacrolimus are of very limited value, which is not surprising, as almost all patients develop de novo AIH while receiving these drugs. Despite the good response to treatment, most patients remain a clinical challenge, as complete stable remissions are uncommon and flares, relapses and chronic disease activity often occur. Pathogenetically, this syndrome is intriguing. It is not clear whether the immune response is directed against allo-antigens, neo-antigens in the liver, or self-antigens possibly shared by donor and host cells. It is very likely that the inflammatory milieu due to alloreactive cells in the transplanted organ contributes to the disease process, either leading to aberrant antigen presentation or providing co-stimulatory signals leading to the breaking of self-tolerance. The development of this disease in the presence of treatment with calcineurin inhibitors is particularly intriguing.

  20. A randomized, double-blind, cross-over, phase IV trial of oros-methylphenidate (CONCERTA(®)) and generic novo-methylphenidate ER-C (NOVO-generic).

    Science.gov (United States)

    Fallu, Angelo; Dabouz, Farida; Furtado, Melissa; Anand, Leena; Katzman, Martin A

    2016-08-01

    Attention-deficit/hyperactivity disorder (ADHD) is a common neurobehavioral disorder with onset during childhood. Multiple aspects of a child's development are hindered, in both home and school settings, with negative impacts on social, emotional, and cognitive functioning. If left untreated, ADHD is commonly associated with poor academic achievement and low occupational status, as well as increased risk of substance abuse and delinquency. The objective of this study was to evaluate adult ADHD subject-reported outcomes when switched from a stable dose of CONCERTA(®) to the same dose of generic Novo-methylphenidate ER-C(®). This randomized, double-blind, cross-over, phase IV trial consisted of two phases, in which participants with a primary diagnosis of ADHD were randomized in a 1:1 ratio to 3 weeks of treatment with CONCERTA or generic Novo-Methylphenidate ER-C. Following 3 weeks of treatment, participants were crossed over to receive the other treatment for an additional 3 weeks. Primary efficacy was assessed through the use of the Treatment Satisfaction Questionnaire for Medication, Version II (TSQM-II). Participants with ADHD treated with CONCERTA were more satisfied in terms of efficacy and side effects compared to those receiving an equivalent dose of generic Novo-Methylphenidate ER-C. All participants chose to continue with CONCERTA treatment at the conclusion of the study. Although CONCERTA and generic Novo-Methylphenidate ER-C have been deemed bioequivalent, the present findings demonstrate clinically and statistically significant differences between the generic and branded formulations. Further investigation of these differences is warranted.

  1. Arginine de novo and nitric oxide production in disease states

    OpenAIRE

    Luiking, Yvette C.; Ten Have, Gabriella A. M.; Wolfe, Robert R.; Deutz, Nicolaas E. P.

    2012-01-01

    Arginine is derived from dietary protein intake, body protein breakdown, or endogenous de novo arginine production. The latter may be linked to the availability of citrulline, which is the immediate precursor of arginine and limiting factor for de novo arginine production. Arginine metabolism is highly compartmentalized due to the expression of the enzymes involved in arginine metabolism in various organs. A small fraction of arginine enters the NO synthase (NOS) pathway. Tetrahydrobiopterin ...

  2. Experimental and computational prediction of glass transition temperature of drugs.

    Science.gov (United States)

    Alzghoul, Ahmad; Alhalaweh, Amjad; Mahlin, Denny; Bergström, Christel A S

    2014-12-22

    Glass transition temperature (Tg) is an important inherent property of an amorphous solid material which is usually determined experimentally. In this study, the relation between Tg and melting temperature (Tm) was evaluated using a data set of 71 structurally diverse druglike compounds. Further, in silico models for prediction of Tg were developed based on calculated molecular descriptors and linear (multilinear regression, partial least-squares, principal component regression) and nonlinear (neural network, support vector regression) modeling techniques. The models based on Tm predicted Tg with an RMSE of 19.5 K for the test set. Among the five computational models developed herein the support vector regression gave the best result with RMSE of 18.7 K for the test set using only four chemical descriptors. Hence, two different models that predict Tg of drug-like molecules with high accuracy were developed. If Tm is available, a simple linear regression can be used to predict Tg. However, the results also suggest that support vector regression and calculated molecular descriptors can predict Tg with equal accuracy, already before compound synthesis.
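
    The two modelling routes reported above could look roughly like the sketch below: a simple linear fit of Tg against Tm, and a support vector regression on a handful of calculated descriptors. The synthetic data, descriptor values and hyperparameters are assumptions, not the published models.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
n = 71
tm = rng.uniform(350, 550, n)                 # melting temperatures (K), synthetic
tg = 0.7 * tm + rng.normal(0, 10, n)          # toy Tg/Tm relation with noise

# Route 1: simple linear regression Tg ~ Tm.
slope, intercept = np.polyfit(tm, tg, 1)
print("linear rule: Tg ≈", round(slope, 2), "* Tm +", round(intercept, 1))

# Route 2: support vector regression on four hypothetical molecular descriptors.
descriptors = rng.normal(size=(n, 4))
tg2 = 250 + 30 * descriptors[:, 0] - 15 * descriptors[:, 1] + rng.normal(0, 5, n)
svr = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=1.0)).fit(descriptors, tg2)
print("SVR prediction for first compound:", round(float(svr.predict(descriptors[:1])[0]), 1))
```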

  3. Integrating Crop Growth Models with Whole Genome Prediction through Approximate Bayesian Computation.

    Directory of Open Access Journals (Sweden)

    Frank Technow

    Full Text Available Genomic selection, enabled by whole genome prediction (WGP) methods, is revolutionizing plant breeding. Existing WGP methods have been shown to deliver accurate predictions in the most common settings, such as prediction of across environment performance for traits with additive gene effects. However, prediction of traits with non-additive gene effects and prediction of genotype by environment interaction (G×E) continues to be challenging. Previous attempts to increase prediction accuracy for these particularly difficult tasks employed prediction methods that are purely statistical in nature. Augmenting the statistical methods with biological knowledge has been largely overlooked thus far. Crop growth models (CGMs) attempt to represent the impact of functional relationships between plant physiology and the environment in the formation of yield and similar output traits of interest. Thus, they can explain the impact of G×E and certain types of non-additive gene effects on the expressed phenotype. Approximate Bayesian computation (ABC), a novel and powerful computational procedure, allows the incorporation of CGMs directly into the estimation of whole genome marker effects in WGP. Here we provide a proof of concept study for this novel approach and demonstrate its use with synthetic data sets. We show that this novel approach can be considerably more accurate than the benchmark WGP method GBLUP in predicting performance in environments represented in the estimation set as well as in previously unobserved environments for traits determined by non-additive gene effects. We conclude that this proof of concept demonstrates that using ABC for incorporating biological knowledge in the form of CGMs into WGP is a very promising and novel approach to improving prediction accuracy for some of the most challenging scenarios in plant breeding and applied genetics.
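
    A toy rejection-ABC sketch of the idea, not the authors' implementation, is given below: candidate marker effects drive a tiny stand-in "crop growth model", and prior draws are accepted only when the simulated phenotypes lie close to the observed ones. All data, the model and the tolerance are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)
n_lines, n_markers = 20, 3
genotypes = rng.integers(0, 2, size=(n_lines, n_markers)).astype(float)
true_effects = np.array([1.0, -0.5, 0.8])                # assumed "true" marker effects

def crop_growth_model(genetic_value, water=1.0):
    """Hypothetical CGM: saturating yield response scaled by an environmental covariate."""
    return water * 10.0 / (1.0 + np.exp(-genetic_value))

observed = crop_growth_model(genotypes @ true_effects) + rng.normal(0.0, 0.1, n_lines)

accepted = []
for _ in range(20000):
    candidate = rng.normal(0.0, 1.0, n_markers)          # draw marker effects from the prior
    simulated = crop_growth_model(genotypes @ candidate)
    if np.sqrt(np.mean((simulated - observed) ** 2)) < 1.0:   # loose ABC tolerance, for illustration
        accepted.append(candidate)

print(len(accepted), "accepted draws")
if accepted:
    print("posterior mean effects:", np.mean(accepted, axis=0).round(2))
```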

  4. Prediction of velocity and attitude of a yacht sailing upwind by computational fluid dynamics

    OpenAIRE

    Lee, Heebum; Park, Mi Yeon; Park, Sunho; Rhee, Shin Hyung

    2016-01-01

    One of the most important factors in sailing yacht design is accurate velocity prediction. Velocity prediction programs (VPPs) are widely used to predict the velocity of sailing yachts. VPPs, which are primarily based on experimental data and long years of experience, however, suffer limitations when applied in realistic conditions. Thus, in the present study, a high-fidelity velocity prediction method using computational fluid dynamics (CFD) was proposed. Using the developed method, velocity an...

  5. Computer predictions on Rh-based double perovskites with unusual electronic and magnetic properties

    Science.gov (United States)

    Halder, Anita; Nafday, Dhani; Sanyal, Prabuddha; Saha-Dasgupta, Tanusri

    2018-03-01

    In search of new magnetic materials, we make computer predictions of the structural, electronic and magnetic properties of yet-to-be-synthesized Rh-based double perovskite compounds, Sr(Ca)2BRhO6 (B=Cr, Mn, Fe). We use a combination of an evolutionary algorithm, density functional theory, and statistical-mechanical tools for this purpose. We find that the unusual valence of Rh5+ may be stabilized in these compounds through formation of an oxygen ligand hole. Interestingly, while the Cr-Rh and Mn-Rh compounds are predicted to be ferromagnetic half-metals, the Fe-Rh compounds are found to be rare examples of an antiferromagnetic and metallic transition-metal oxide with a three-dimensional electronic structure. The computed magnetic transition temperatures of the predicted compounds, obtained from a finite-temperature Monte Carlo study of the first-principles-derived model Hamiltonian, are found to be reasonably high. The favorable growth conditions of the compounds predicted in our study through extensive thermodynamic analysis should be useful for the future synthesis of this interesting class of materials with intriguing properties.

  6. De novo FBXO11 mutations are associated with intellectual disability and behavioural anomalies.

    Science.gov (United States)

    Fritzen, Daniel; Kuechler, Alma; Grimmel, Mona; Becker, Jessica; Peters, Sophia; Sturm, Marc; Hundertmark, Hela; Schmidt, Axel; Kreiß, Martina; Strom, Tim M; Wieczorek, Dagmar; Haack, Tobias B; Beck-Wödl, Stefanie; Cremer, Kirsten; Engels, Hartmut

    2018-05-01

    Intellectual disability (ID) has an estimated prevalence of 1.5-2%. In most affected individuals, its genetic basis remains unclear. Whole exome sequencing (WES) studies have identified a multitude of novel causative gene defects and have shown that a large proportion of sporadic ID cases results from de novo mutations. Here, we present two unrelated individuals with similar clinical features and deleterious de novo variants in FBXO11 detected by WES. Individual 1, a 14-year-old boy, has mild ID as well as mild microcephaly, corrected cleft lip and alveolus, hyperkinetic disorder, mild brain atrophy and minor facial dysmorphism. WES detected a heterozygous de novo 1 bp insertion in the splice donor site of exon 3. Individual 2, a 3-year-old boy, showed ID and pre- and postnatal growth retardation, postnatal mild microcephaly, hyperkinetic and restless behaviour, as well as mild dysmorphism. WES detected a heterozygous de novo frameshift mutation. While ten individuals with ID and de novo variants in FBXO11 have been reported as part of larger studies, only one of the reports has some additional clinical data. Interestingly, the latter individual carries the identical mutation as our individual 2 and also displays ID, intrauterine growth retardation, microcephaly, behavioural anomalies, and dysmorphisms. Thus, we confirm deleterious de novo mutations in FBXO11 as a cause of ID and start the delineation of the associated clinical picture which may also comprise postnatal microcephaly or borderline small head size and behavioural anomalies.

  7. Prediction of surgical view of neurovascular decompression using interactive computer graphics.

    Science.gov (United States)

    Kin, Taichi; Oyama, Hiroshi; Kamada, Kyousuke; Aoki, Shigeki; Ohtomo, Kuni; Saito, Nobuhito

    2009-07-01

    To assess the value of an interactive visualization method for detecting the offending vessels in neurovascular compression syndrome in patients with facial spasm and trigeminal neuralgia. Computer graphics models are created by fusion of fast imaging employing steady-state acquisition and magnetic resonance angiography. High-resolution magnetic resonance angiography and fast imaging employing steady-state acquisition were performed preoperatively in 17 patients with neurovascular compression syndromes (facial spasm, n = 10; trigeminal neuralgia, n = 7) using a 3.0-T magnetic resonance imaging scanner. Computer graphics models were created with computer software and observed interactively for detection of offending vessels by rotation, enlargement, reduction, and retraction on a graphic workstation. Two-dimensional images were reviewed by 2 radiologists blinded to the clinical details, and 2 neurosurgeons predicted the offending vessel with the interactive visualization method before surgery. Predictions from the 2 imaging approaches were compared with surgical findings. The vessels identified during surgery were assumed to be the true offending vessels. Offending vessels were identified correctly in 16 of 17 patients (94%) using the interactive visualization method and in 10 of 17 patients using 2-dimensional images. These data demonstrated a significant difference (P = 0.015 by Fisher's exact method). The interactive visualization method data corresponded well with surgical findings (surgical field, offending vessels, and nerves). Virtual reality 3-dimensional computer graphics using fusion magnetic resonance angiography and fast imaging employing steady-state acquisition may be helpful for preoperative simulation.

  8. Assessing the suitability of soft computing approaches for forest fires prediction

    Directory of Open Access Journals (Sweden)

    Samaher Al_Janabi

    2018-07-01

    Full Text Available Forest fires are among the main causes of environmental hazards and have many negative consequences for different aspects of life. Therefore, early prediction, fast detection and rapid action are the key elements for controlling such phenomena and saving lives. In this work, 517 entries recorded at different times for Montesinho Natural Park (MNP) in Portugal were used to determine the best predictor for detecting forest fires. Principal component analysis (PCA) was applied to find the critical patterns, and the particle swarm optimization (PSO) technique was used to segment the fire regions (clusters). In the next stage, five soft computing (SC) techniques based on neural networks were used in parallel to identify the technique giving the most accurate and optimal results in predicting forest fires, namely cascade correlation network (CCN), multilayer perceptron neural network (MPNN), polynomial neural network (PNN), radial basis function (RBF) and support vector machine (SVM). In the final stage, the predictors and their performance were evaluated based on five quality measures: root mean squared error (RMSE), mean squared error (MSE), relative absolute error (RAE), mean absolute error (MAE) and information gain (IG). The results indicate that the SVM technique was more effective and efficient than the RBF, MPNN, PNN and CCN predictors, providing more precise predictions with smaller estimation error. The obtained results confirm that SVM improves prediction accuracy and is suitable for forest fire prediction compared to the other methods. Keywords: Forest fires, Soft computing, Prediction, Principal component analysis, Particle swarm optimization, Cascade correlation network, Multilayer perceptron neural network, Polynomial neural networks, Radial basis function, Support vector machine
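
    As a rough illustration of the final modelling stage described above, the sketch below trains an RBF-kernel support vector regressor and scores it with RMSE and MAE; the synthetic weather-like features are placeholders for the real MNP records, and the PCA and PSO steps of the study are omitted.

```python
# Hedged sketch: SVM (RBF-kernel) regression for burned-area prediction, scored
# with RMSE and MAE. The synthetic features below are placeholders, not the MNP
# dataset, and this is not the paper's pipeline (PCA/PSO steps are omitted).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error, mean_absolute_error

rng = np.random.default_rng(0)
n = 517                                    # same number of entries as the study
temp = rng.normal(20, 6, n)                # deg C
rh = rng.uniform(20, 90, n)                # relative humidity, %
wind = rng.uniform(0, 9, n)                # km/h
area = np.maximum(0, 0.4 * temp - 0.1 * rh + 0.8 * wind + rng.normal(0, 2, n))

X = np.column_stack([temp, rh, wind])
X_tr, X_te, y_tr, y_te = train_test_split(X, area, test_size=0.3, random_state=0)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(X_tr, y_tr)
pred = model.predict(X_te)

print("RMSE:", round(float(np.sqrt(mean_squared_error(y_te, pred))), 3))
print("MAE :", round(float(mean_absolute_error(y_te, pred)), 3))
```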

  9. Computational Model-Based Prediction of Human Episodic Memory Performance Based on Eye Movements

    Science.gov (United States)

    Sato, Naoyuki; Yamaguchi, Yoko

    Subjects' episodic memory performance is not simply reflected by eye movements. We use a ‘theta phase coding’ model of the hippocampus to predict subjects' memory performance from their eye movements. Results demonstrate the ability of the model to predict subjects' memory performance. These studies provide a novel approach to computational modeling in the human-machine interface.

  10. Role of computer graphics in space telerobotics - Preview and predictive displays

    Science.gov (United States)

    Bejczy, Antal K.; Venema, Steven; Kim, Won S.

    1991-01-01

    The application of computer graphics in space telerobotics research and development work is briefly reviewed and illustrated by specific examples implemented in real time operation. The applications are discussed under the following four major categories: preview displays, predictive displays, sensor data displays, and control system status displays.

  11. Computational prediction and experimental validation of Ciona intestinalis microRNA genes

    Directory of Open Access Journals (Sweden)

    Pasquinelli Amy E

    2007-11-01

    Full Text Available Abstract Background This study reports the first collection of validated microRNA genes in the sea squirt, Ciona intestinalis. MicroRNAs are processed from hairpin precursors to ~22 nucleotide RNAs that base pair to target mRNAs and inhibit expression. As a member of the subphylum Urochordata (Tunicata), whose larval form has a notochord, the sea squirt is situated at the emergence of vertebrates, and therefore may provide information about the evolution of molecular regulators of early development. Results In this study, computational methods were used to predict 14 microRNA gene families in Ciona intestinalis. The microRNA prediction algorithm utilizes configurable microRNA sequence conservation and stem-loop specificity parameters, grouping by miRNA family, and phylogenetic conservation to the related species, Ciona savignyi. The expression for 8, out of 9 attempted, of the putative microRNAs in the adult tissue of Ciona intestinalis was validated by Northern blot analyses. Additionally, a target prediction algorithm was implemented, which identified a high confidence list of 240 potential target genes. Over half of the predicted targets can be grouped into the gene ontology categories of metabolism, transport, regulation of transcription, and cell signaling. Conclusion The computational techniques implemented in this study can be applied to other organisms and serve to increase the understanding of the origins of non-coding RNAs, embryological and cellular developmental pathways, and the mechanisms for microRNA-controlled gene regulatory networks.

  12. Airline Maintenance Manpower Optimization from the De Novo Perspective

    Science.gov (United States)

    Liou, James J. H.; Tzeng, Gwo-Hshiung

    Human resource management (HRM) is an important issue for today’s competitive airline market. In this paper, we discuss a multi-objective model designed from the De Novo perspective to help airlines optimize their maintenance manpower portfolio. The effectiveness of the model and solution algorithm is demonstrated in an empirical study of the optimization of the human resources needed for airline line maintenance. Both De Novo and traditional multiple objective programming (MOP) methods are analyzed. A comparison of the results with those of traditional MOP indicates that the proposed model and solution algorithm provide better performance and an improved human resource portfolio.
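
    To make the "De Novo perspective" concrete, here is a minimal single-objective sketch in the spirit of De Novo programming: resource levels are not fixed in advance but purchased within a total budget, so the individual resource constraints collapse into one budget inequality. All quantities, prices and staffing categories below are invented for illustration and are not taken from the paper.

```python
# Hedged sketch of a De Novo-style allocation: resources are purchased within a
# budget rather than fixed in advance. All numbers are illustrative only.
import numpy as np
from scipy.optimize import linprog

# Two maintenance "products" (e.g. line checks of type A and B), three resources
# (e.g. mechanic-hours, avionics-hours, hangar slots) consumed per unit of work.
A = np.array([[2.0, 1.0],    # mechanic-hours per unit
              [1.0, 3.0],    # avionics-hours per unit
              [0.5, 0.5]])   # hangar slots per unit
prices = np.array([40.0, 55.0, 200.0])    # unit price of each resource
budget = 10000.0
value = np.array([300.0, 450.0])          # value per unit of each product

# De Novo formulation: max value'x  s.t.  prices'(A x) <= budget, x >= 0.
# linprog minimizes, so the objective is negated.
res = linprog(c=-value,
              A_ub=(prices @ A).reshape(1, -1),
              b_ub=[budget],
              bounds=[(0, None)] * 2)

x = res.x
print("optimal work mix:", np.round(x, 2))
print("resources to acquire:", np.round(A @ x, 2))
print("total value:", round(float(value @ x), 2))
```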

  13. POLYAR, a new computer program for prediction of poly(A) sites in human sequences

    Directory of Open Access Journals (Sweden)

    Qamar Raheel

    2010-11-01

    Full Text Available Abstract Background mRNA polyadenylation is an essential step of pre-mRNA processing in eukaryotes. Accurate prediction of pre-mRNA 3'-end cleavage/polyadenylation sites is important for defining gene boundaries and understanding gene expression mechanisms. Results 28,761 mapped human poly(A) sites have been classified into three classes containing different known forms of the polyadenylation signal (PAS) or none of them (PAS-strong, PAS-weak and PAS-less, respectively), and a new computer program, POLYAR, was developed for the prediction of poly(A) sites of each class. In comparison with polya_svm (to date the most accurate computer program for prediction of poly(A) sites) when searching for PAS-strong poly(A) sites in human sequences, POLYAR had significantly higher prediction sensitivity (80.8% versus 65.7%) and specificity (66.4% versus 51.7%). However, when a similar search was conducted for PAS-weak and PAS-less poly(A) sites, both programs had very low prediction accuracy, which indicates that our knowledge of the factors involved in the determination of poly(A) sites is not yet sufficient to identify such polyadenylation regions. Conclusions We present a new classification of polyadenylation sites into three classes and a novel computer program, POLYAR, for prediction of poly(A) sites/regions of each class. In tests, POLYAR shows high prediction accuracy for PAS-strong poly(A) sites, while its efficiency in searching for PAS-weak and PAS-less poly(A) sites is not very high but is comparable to other available programs. These findings suggest that additional characteristics of such poly(A) sites remain to be elucidated. The POLYAR program, with a stand-alone version for download, is available at http://cub.comsats.edu.pk/polyapredict.htm.
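
    Purely as an illustration of what a polyadenylation-signal search involves (and not POLYAR's actual classification algorithm), the toy sketch below scans a DNA sequence for the canonical AATAAA hexamer and its common ATTAAA variant.

```python
# Toy PAS motif scan -- not POLYAR's algorithm, just an illustration of locating
# candidate polyadenylation-signal hexamers in a DNA sequence.
PAS_HEXAMERS = ("AATAAA", "ATTAAA")  # canonical signal and its most common variant

def find_pas_sites(seq: str) -> list[tuple[int, str]]:
    """Return (position, hexamer) pairs for every PAS hexamer occurrence in seq."""
    seq = seq.upper()
    hits = []
    for i in range(len(seq) - 6 + 1):
        hexamer = seq[i:i + 6]
        if hexamer in PAS_HEXAMERS:
            hits.append((i, hexamer))
    return hits

if __name__ == "__main__":
    demo = "CCGTAATAAACTGTTTGTTCATTAAAGGCATGCA"   # made-up example sequence
    for pos, hexamer in find_pas_sites(demo):
        print(f"{hexamer} at position {pos}")
```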

  14. Predicting Flow Reversals in a Computational Fluid Dynamics Simulated Thermosyphon Using Data Assimilation.

    Science.gov (United States)

    Reagan, Andrew J; Dubief, Yves; Dodds, Peter Sheridan; Danforth, Christopher M

    2016-01-01

    A thermal convection loop is an annular chamber filled with water, heated on the bottom half and cooled on the top half. With sufficiently large forcing of heat, the direction of fluid flow in the loop oscillates chaotically, dynamics analogous to the Earth's weather. As is the case for state-of-the-art weather models, we only observe the statistics over a small region of state space, making prediction difficult. To overcome this challenge, data assimilation (DA) methods, and specifically ensemble methods, use the computational model itself to estimate the uncertainty of the model to optimally combine these observations into an initial condition for predicting the future state. Here, we build and verify four distinct DA methods, and then, we perform a twin model experiment with the computational fluid dynamics simulation of the loop using the Ensemble Transform Kalman Filter (ETKF) to assimilate observations and predict flow reversals. We show that using adaptively shaped localized covariance outperforms static localized covariance with the ETKF, and allows for the use of fewer observations in predicting flow reversals. We also show that a Dynamic Mode Decomposition (DMD) of the temperature and velocity fields recovers the low dimensional system underlying reversals, finding specific modes which together are predictive of reversal direction.
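
    As an aside, the dynamic mode decomposition mentioned above reduces to a short linear-algebra routine; the snippet below is a generic exact-DMD sketch applied to synthetic snapshots, not the authors' thermosyphon fields or assimilation code.

```python
# Generic exact DMD sketch (not the paper's code): given snapshot matrices
# X  = [x_0 ... x_{m-1}] and X' = [x_1 ... x_m], DMD approximates the linear
# operator A with X' ~ A X and extracts its leading eigenvalues and modes.
import numpy as np

def dmd(X, Xp, rank):
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    Ur, Sr, Vr = U[:, :rank], np.diag(s[:rank]), Vh[:rank].conj().T
    A_tilde = Ur.conj().T @ Xp @ Vr @ np.linalg.inv(Sr)   # reduced operator
    eigvals, W = np.linalg.eig(A_tilde)
    modes = Xp @ Vr @ np.linalg.inv(Sr) @ W               # exact DMD modes
    return eigvals, modes

# Synthetic demo: snapshots of a decaying travelling wave sampled on 64 points.
t = np.linspace(0, 10, 201)
x = np.linspace(0, 1, 64)[:, None]
data = np.exp(-0.1 * t) * np.sin(2 * np.pi * (x - 0.3 * t))
eigvals, modes = dmd(data[:, :-1], data[:, 1:], rank=4)
print("leading DMD eigenvalues:", np.round(eigvals, 3))
```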

  15. Predicting Flow Reversals in a Computational Fluid Dynamics Simulated Thermosyphon Using Data Assimilation.

    Directory of Open Access Journals (Sweden)

    Andrew J Reagan

    Full Text Available A thermal convection loop is an annular chamber filled with water, heated on the bottom half and cooled on the top half. With sufficiently large forcing of heat, the direction of fluid flow in the loop oscillates chaotically, dynamics analogous to the Earth's weather. As is the case for state-of-the-art weather models, we only observe the statistics over a small region of state space, making prediction difficult. To overcome this challenge, data assimilation (DA) methods, and specifically ensemble methods, use the computational model itself to estimate the uncertainty of the model to optimally combine these observations into an initial condition for predicting the future state. Here, we build and verify four distinct DA methods, and then, we perform a twin model experiment with the computational fluid dynamics simulation of the loop using the Ensemble Transform Kalman Filter (ETKF) to assimilate observations and predict flow reversals. We show that using adaptively shaped localized covariance outperforms static localized covariance with the ETKF, and allows for the use of fewer observations in predicting flow reversals. We also show that a Dynamic Mode Decomposition (DMD) of the temperature and velocity fields recovers the low dimensional system underlying reversals, finding specific modes which together are predictive of reversal direction.

  16. A Review of Computational Methods to Predict the Risk of Rupture of Abdominal Aortic Aneurysms

    Directory of Open Access Journals (Sweden)

    Tejas Canchi

    2015-01-01

    Full Text Available Computational methods have played an important role in health care in recent years, as determining parameters that affect a certain medical condition is not possible in experimental conditions in many cases. Computational fluid dynamics (CFD) methods have been used to accurately determine the nature of blood flow in the cardiovascular and nervous systems and air flow in the respiratory system, thereby giving the surgeon a diagnostic tool to plan treatment accordingly. Machine learning or data mining (MLD) methods are currently used to develop models that learn from retrospective data to make a prediction regarding factors affecting the progression of a disease. These models have also been successful in incorporating factors such as patient history and occupation. MLD models can be used as a predictive tool to determine rupture potential in patients with abdominal aortic aneurysms (AAA), along with CFD-based prediction of parameters like wall shear stress and pressure distributions. A combination of these computer methods can be pivotal in bridging the gap between translational and outcomes research in medicine. This paper reviews the use of computational methods in the diagnosis and treatment of AAA.

  17. in silico Whole Genome Sequencer & Analyzer (iWGS): a computational pipeline to guide the design and analysis of de novo genome sequencing studies

    Science.gov (United States)

    The availability of genomes across the tree of life is highly biased toward vertebrates, pathogens, human disease models, and organisms with relatively small and simple genomes. Recent progress in genomics has enabled the de novo decoding of the genome of virtually any organism, greatly expanding it...

  18. Computational Protein Design

    DEFF Research Database (Denmark)

    Johansson, Kristoffer Enøe

    Proteins are the major functional group of molecules in biology. The impact of protein science on medicine and chemical production is rapidly increasing. However, the greatest potential remains to be realized. The field of protein design has advanced computational modeling from a tool of support...... to a central method that enables new developments. For example, novel enzymes with functions not found in natural proteins have been de novo designed to give enough activity for experimental optimization. This thesis presents the current state-of-the-art within computational design methods together...... with a novel method based on probability theory. With the aim of assembling a complete pipeline for protein design, this work touches upon several aspects of protein design. The presented work is the computational half of a design project where the other half is dedicated to the experimental part...

  19. Modular Engineering Concept at Novo Nordisk Engineering

    DEFF Research Database (Denmark)

    Moelgaard, Gert; Miller, Thomas Dedenroth

    1997-01-01

    This report describes the concept of a new engineering method at Novo Nordisk Engineering: Modular Engineering (ME). Three tools are designed to support project phases with different levels of detailing and abstraction. ME supports a standard, cross-functional breakdown of projects that facilitates...

  20. Computationally efficient model predictive control algorithms a neural network approach

    CERN Document Server

    Ławryńczuk, Maciej

    2014-01-01

    This book thoroughly discusses computationally efficient (suboptimal) Model Predictive Control (MPC) techniques based on neural models. The subjects treated include: · a few types of suboptimal MPC algorithms in which a linear approximation of the model or of the predicted trajectory is successively calculated on-line and used for prediction; · implementation details of the MPC algorithms for feedforward perceptron neural models, neural Hammerstein models, neural Wiener models and state-space neural models; · the MPC algorithms based on neural multi-models (inspired by the idea of predictive control); · the MPC algorithms with neural approximation with no on-line linearization; · the MPC algorithms with guaranteed stability and robustness; · cooperation between the MPC algorithms and set-point optimization. Thanks to linearization (or neural approximation), the presented suboptimal algorithms do not require d...

  1. De novo synthesis of adenine nucleotides in different skeletal muscle fiber types

    International Nuclear Information System (INIS)

    Tullson, P.C.; John-Alder, H.B.; Hood, D.A.; Terjung, R.L.

    1988-01-01

    Management of adenine nucleotide catabolism differs among skeletal muscle fiber types. This study evaluated whether there are corresponding differences in the rates of de novo synthesis of adenine nucleotide among fiber type sections of skeletal muscle using an isolated perfused rat hindquarter preparation. Label incorporation into adenine nucleotides from the [1-14C]glycine precursor was determined and used to calculate synthesis rates based on the intracellular glycine specific radioactivity. Results show that intracellular glycine is closely related to the direct precursor pool. Rates of de novo synthesis were highest in fast-twitch red muscle (57.0 +/- 4.0, 58.2 +/- 4.4 nmol.h-1.g-1; deep red gastrocnemius and vastus lateralis), relatively high in slow-twitch red muscle (47.0 +/- 3.1; soleus), and low in fast-twitch white muscle (26.1 +/- 2.0 and 21.6 +/- 2.3; superficial white gastrocnemius and vastus lateralis). Rates for four mixed muscles were intermediate, ranging between 32.3 and 37.3. Specific de novo synthesis rates exhibited a strong correlation (r = 0.986) with muscle section citrate synthase activity. Turnover rates (de novo synthesis rate/adenine nucleotide pool size) were highest in high oxidative muscle (0.82-1.06%/h), lowest in low oxidative muscle (0.30-0.35%/h), and intermediate in mixed muscle (0.44-0.55%/h). Our results demonstrate that differences in adenine nucleotide management among fiber types extends to the process of de novo adenine nucleotide synthesis

  2. De novo transcriptome assembly of Sorghum bicolor variety Taejin

    Directory of Open Access Journals (Sweden)

    Yeonhwa Jo

    2016-06-01

    Full Text Available Sorghum (Sorghum bicolor), also known as great millet, is one of the most popular cultivated grass species in the world. Sorghum is frequently consumed as food for humans and animals as well as used for ethanol production. In this study, we conducted de novo transcriptome assembly for sorghum variety Taejin by next-generation sequencing, obtaining 8.748 GB of raw data. The raw data from this study are available in the NCBI SRA database under accession number SRX1715644. Using the Trinity program, we identified 222,161 transcripts from sorghum variety Taejin. We further predicted coding regions within the assembled transcripts by the TransDecoder program, resulting in a total of 148,531 proteins. We carried out BLASTP against the Swiss-Prot protein sequence database to annotate the functions of the identified proteins. To our knowledge, this is the first transcriptome dataset for a sorghum variety from Korea, and it can be usefully applied to the generation of genetic markers.

  3. De novo assembly of a haplotype-resolved human genome.

    Science.gov (United States)

    Cao, Hongzhi; Wu, Honglong; Luo, Ruibang; Huang, Shujia; Sun, Yuhui; Tong, Xin; Xie, Yinlong; Liu, Binghang; Yang, Hailong; Zheng, Hancheng; Li, Jian; Li, Bo; Wang, Yu; Yang, Fang; Sun, Peng; Liu, Siyang; Gao, Peng; Huang, Haodong; Sun, Jing; Chen, Dan; He, Guangzhu; Huang, Weihua; Huang, Zheng; Li, Yue; Tellier, Laurent C A M; Liu, Xiao; Feng, Qiang; Xu, Xun; Zhang, Xiuqing; Bolund, Lars; Krogh, Anders; Kristiansen, Karsten; Drmanac, Radoje; Drmanac, Snezana; Nielsen, Rasmus; Li, Songgang; Wang, Jian; Yang, Huanming; Li, Yingrui; Wong, Gane Ka-Shu; Wang, Jun

    2015-06-01

    The human genome is diploid, and knowledge of the variants on each chromosome is important for the interpretation of genomic information. Here we report the assembly of a haplotype-resolved diploid genome without using a reference genome. Our pipeline relies on fosmid pooling together with whole-genome shotgun strategies, based solely on next-generation sequencing and hierarchical assembly methods. We applied our sequencing method to the genome of an Asian individual and generated a 5.15-Gb assembled genome with a haplotype N50 of 484 kb. Our analysis identified previously undetected indels and 7.49 Mb of novel coding sequences that could not be aligned to the human reference genome, which include at least six predicted genes. This haplotype-resolved genome represents the most complete de novo human genome assembly to date. Application of our approach to identify individual haplotype differences should aid in translating genotypes to phenotypes for the development of personalized medicine.

  4. Wegener's granulomatosis occurring de novo during pregnancy.

    Science.gov (United States)

    Alfhaily, F; Watts, R; Leather, A

    2009-01-01

    Wegener's granulomatosis (WG) is rarely diagnosed during the reproductive years and uncommonly manifests for the first time during pregnancy. We report a case of de novo WG presenting at 30 weeks gestation with classical symptoms of WG (ENT, pulmonary). The diagnosis was confirmed by radiological, laboratory, and histological investigations. With a multidisciplinary approach, she had a successful vaginal delivery of a healthy baby. She was treated successfully by a combination of steroids, azathioprine and intravenous immunoglobulin in the active phase of disease for induction of remission and by azathioprine and steroids for maintenance of remission. The significant improvement in her symptoms allowed us to continue her pregnancy to 37 weeks when delivery was electively induced. Transplacental transmission of PR3-ANCA occurred but the neonate remained well. This case of de novo WG during pregnancy highlights the seriousness of this disease and the challenge in management of such patients.

  5. Mathematical modeling and computational prediction of cancer drug resistance.

    Science.gov (United States)

    Sun, Xiaoqiang; Hu, Bin

    2017-06-23

    Diverse forms of resistance to anticancer drugs can lead to the failure of chemotherapy. Drug resistance is one of the most intractable issues for successfully treating cancer in current clinical practice. Effective clinical approaches that could counter drug resistance by restoring the sensitivity of tumors to the targeted agents are urgently needed. As numerous experimental results on resistance mechanisms have been obtained and a mass of high-throughput data has been accumulated, mathematical modeling and computational predictions using systematic and quantitative approaches have become increasingly important, as they can potentially provide deeper insights into resistance mechanisms, generate novel hypotheses or suggest promising treatment strategies for future testing. In this review, we first briefly summarize the current progress of experimentally revealed resistance mechanisms of targeted therapy, including genetic mechanisms, epigenetic mechanisms, posttranslational mechanisms, cellular mechanisms, microenvironmental mechanisms and pharmacokinetic mechanisms. Subsequently, we list several currently available databases and Web-based tools related to drug sensitivity and resistance. Then, we focus primarily on introducing some state-of-the-art computational methods used in drug resistance studies, including mechanism-based mathematical modeling approaches (e.g. molecular dynamics simulation, kinetic model of molecular networks, ordinary differential equation model of cellular dynamics, stochastic model, partial differential equation model, agent-based model, pharmacokinetic-pharmacodynamic model, etc.) and data-driven prediction methods (e.g. omics data-based conventional screening approach for node biomarkers, static network approach for edge biomarkers and module biomarkers, dynamic network approach for dynamic network biomarkers and dynamic module network biomarkers, etc.). Finally, we discuss several further questions and future directions for the use of
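
    To give one concrete flavour of the mechanism-based ODE models listed above, here is a deliberately simple, hypothetical two-population model of drug-sensitive and drug-resistant tumour cells under constant treatment; the equations and parameter values are illustrative only and are not drawn from the review.

```python
# Hedged sketch: toy ODE model of drug resistance dynamics (illustrative only).
# S = drug-sensitive cells, R = resistant cells; both grow logistically toward a
# shared carrying capacity K, the drug kills S, and S mutates to R at rate mu.
import numpy as np
from scipy.integrate import solve_ivp

r_s, r_r = 0.30, 0.25    # growth rates (1/day)
K = 1e9                  # carrying capacity (cells)
d_drug = 0.35            # drug-induced death rate of sensitive cells (1/day)
mu = 1e-6                # mutation rate S -> R (1/day)

def rhs(t, y):
    S, R = y
    crowding = 1.0 - (S + R) / K
    dS = r_s * S * crowding - d_drug * S - mu * S
    dR = r_r * R * crowding + mu * S
    return [dS, dR]

sol = solve_ivp(rhs, (0, 200), [1e8, 1e2], t_eval=np.linspace(0, 200, 201))
S, R = sol.y
print(f"day 200: sensitive={S[-1]:.3g}, resistant={R[-1]:.3g}")
```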

  6. Identification of optimum sequencing depth especially for de novo genome assembly of small genomes using next generation sequencing data.

    Science.gov (United States)

    Desai, Aarti; Marwah, Veer Singh; Yadav, Akshay; Jha, Vineet; Dhaygude, Kishor; Bangar, Ujwala; Kulkarni, Vivek; Jere, Abhay

    2013-01-01

    Next Generation Sequencing (NGS) is a disruptive technology that has found widespread acceptance in the life sciences research community. The high throughput and low cost of sequencing has encouraged researchers to undertake ambitious genomic projects, especially in de novo genome sequencing. Currently, NGS systems generate sequence data as short reads and de novo genome assembly using these short reads is computationally very intensive. Due to lower cost of sequencing and higher throughput, NGS systems now provide the ability to sequence genomes at high depth. However, currently no report is available highlighting the impact of high sequence depth on genome assembly using real data sets and multiple assembly algorithms. Recently, some studies have evaluated the impact of sequence coverage, error rate and average read length on genome assembly using multiple assembly algorithms, however, these evaluations were performed using simulated datasets. One limitation of using simulated datasets is that variables such as error rates, read length and coverage which are known to impact genome assembly are carefully controlled. Hence, this study was undertaken to identify the minimum depth of sequencing required for de novo assembly for different sized genomes using graph based assembly algorithms and real datasets. Illumina reads for E. coli (4.6 Mb), S. kudriavzevii (11.18 Mb) and C. elegans (100 Mb) were assembled using SOAPdenovo, Velvet, ABySS, Meraculous and IDBA-UD. Our analysis shows that 50X is the optimum read depth for assembling these genomes using all assemblers except Meraculous which requires 100X read depth. Moreover, our analysis shows that de novo assembly from 50X read data requires only 6-40 GB RAM depending on the genome size and assembly algorithm used. We believe that this information can be extremely valuable for researchers in designing experiments and multiplexing which will enable optimum utilization of sequencing as well as analysis resources.
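
    For orientation, sequencing depth follows the simple relation depth = (number of reads × read length) / genome size; the snippet below inverts it to estimate how many read pairs a 50X target would need, using example numbers rather than the datasets of this study.

```python
# Coverage arithmetic sketch: depth = (number of reads * read length) / genome size.
# The genome size and read length below are illustrative, not this study's data.
def reads_for_target_depth(genome_size_bp: int, read_length_bp: int,
                           target_depth: float, paired_end: bool = True) -> int:
    bases_needed = genome_size_bp * target_depth
    bases_per_fragment = read_length_bp * (2 if paired_end else 1)
    return int(round(bases_needed / bases_per_fragment))

genome_size = 4_600_000      # ~4.6 Mb, roughly an E. coli-sized genome
read_length = 100            # bp per read
pairs = reads_for_target_depth(genome_size, read_length, target_depth=50)
print(f"~{pairs:,} read pairs give ~50X over a {genome_size / 1e6:.1f} Mb genome")
```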

  7. Applying Machine Learning and High Performance Computing to Water Quality Assessment and Prediction

    Directory of Open Access Journals (Sweden)

    Ruijian Zhang

    2017-12-01

    Full Text Available Water quality assessment and prediction is an increasingly important issue. Traditional approaches either take a lot of time or can only perform assessments. In this research, by applying a machine learning algorithm to water attribute data collected over a long period, we can generate a decision tree that predicts a future day's water quality in an easy and efficient way. The idea is to combine traditional approaches with computer algorithms. Using machine learning algorithms, the assessment of water quality becomes far more efficient, and by generating the decision tree, the prediction becomes quite accurate. The drawback of machine learning modeling is that execution takes a long time, especially when a more accurate but more time-consuming clustering algorithm is employed. Therefore, we applied a high performance computing (HPC) system to deal with this problem. So far, the pilot experiments have achieved very promising preliminary results. The visualized water quality assessment and prediction obtained from this project will be published on an interactive website so that the public and environmental managers can use the information in their decision making.
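
    A minimal sketch of the decision-tree step described above might look as follows; the feature names and the synthetic table are placeholders, not the project's monitoring data.

```python
# Hedged sketch: decision-tree classification of water quality classes.
# Feature names and the synthetic data are placeholders, not the project's dataset.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.normal(7.2, 0.6, n),    # pH
    rng.normal(6.0, 2.0, n),    # dissolved oxygen (mg/L)
    rng.normal(25.0, 10.0, n),  # turbidity (NTU)
])
# Synthetic label: "good" (1) when dissolved oxygen is high and turbidity is low.
y = ((X[:, 1] > 5.0) & (X[:, 2] < 30.0)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)

print("held-out accuracy:", round(tree.score(X_te, y_te), 3))
print(export_text(tree, feature_names=["pH", "DO", "turbidity"]))
```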

  8. De novo ORFs in Drosophila are important to organismal fitness and evolved rapidly from previously non-coding sequences.

    Directory of Open Access Journals (Sweden)

    Josephine A Reinhardt

    Full Text Available How non-coding DNA gives rise to new protein-coding genes (de novo genes) is not well understood. Recent work has revealed the origins and functions of a few de novo genes, but common principles governing the evolution or biological roles of these genes are unknown. To better define these principles, we performed a parallel analysis of the evolution and function of six putatively protein-coding de novo genes described in Drosophila melanogaster. Reconstruction of the transcriptional history of de novo genes shows that two de novo genes emerged from novel long non-coding RNAs that arose at least 5 MY prior to the evolution of an open reading frame. In contrast, four other de novo genes evolved a translated open reading frame and transcription within the same evolutionary interval, suggesting that nascent open reading frames (proto-ORFs), while not required, can contribute to the emergence of a new de novo gene. However, none of the genes arose from proto-ORFs that existed long before expression evolved. Sequence and structural evolution of de novo genes was rapid compared to nearby genes, and the structural complexity of de novo genes steadily increases over evolutionary time. Despite the fact that these genes are transcribed at a higher level in males than females, and are most strongly expressed in testes, RNAi experiments show that most of these genes are essential in both sexes during metamorphosis. This lethality suggests that protein-coding de novo genes in Drosophila quickly become functionally important.

  9. Modeling ERBB receptor-regulated G1/S transition to find novel targets for de novo trastuzumab resistance

    Directory of Open Access Journals (Sweden)

    Thieffry Denis

    2009-01-01

    Full Text Available Abstract Background In breast cancer, overexpression of the transmembrane tyrosine kinase ERBB2 is an adverse prognostic marker, and occurs in almost 30% of the patients. For therapeutic intervention, ERBB2 is targeted by the monoclonal antibody trastuzumab in adjuvant settings; however, de novo resistance to this antibody is still a serious issue, requiring the identification of additional targets to overcome resistance. In this study, we have combined computational simulations, experimental testing of simulation results, and finally reverse engineering of a protein interaction network to define potential therapeutic strategies for de novo trastuzumab-resistant breast cancer. Results First, we employed Boolean logic to model regulatory interactions and simulated single and multiple protein loss-of-functions. Then, our simulation results were tested experimentally by producing single and double knockdowns of the network components and measuring their effects on G1/S transition during cell cycle progression. Combinatorial targeting of ERBB2 and EGFR did not affect the response to trastuzumab in de novo resistant cells, which might be due to decoupling of receptor activation and cell cycle progression. Furthermore, examination of c-MYC in resistant as well as in sensitive cell lines, using a specific chemical inhibitor of c-MYC (alone or in combination with trastuzumab), demonstrated that both trastuzumab-sensitive and -resistant cells responded to c-MYC perturbation. Conclusion In this study, we connected ERBB signaling with G1/S transition of the cell cycle via two major cell signaling pathways and two key transcription factors, to model an interaction network that allows for the identification of novel targets in the treatment of trastuzumab-resistant breast cancer. Applying this new strategy, we found that, in contrast to trastuzumab sensitive breast cancer cells, combinatorial targeting of ERBB receptors or of key signaling intermediates does not
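
    For readers unfamiliar with the formalism, the sketch below shows a generic synchronous Boolean network update and an in-silico loss-of-function obtained by clamping a node off; the toy rules are invented for illustration and are not the ERBB/G1-S network of this study.

```python
# Hedged sketch: synchronous Boolean network simulation with a node knockdown.
# The toy rules below are invented; they are not the paper's ERBB/G1-S network.
from typing import Callable, Dict

State = Dict[str, bool]

RULES: Dict[str, Callable[[State], bool]] = {
    "Receptor": lambda s: s["Receptor"],                  # treated as an input
    "Signal":   lambda s: s["Receptor"],                  # activated by the receptor
    "CyclinD1": lambda s: s["Signal"] and not s["Inhib"],
    "Inhib":    lambda s: not s["Signal"],
}

def step(state: State, knockdown=frozenset()) -> State:
    new = {node: rule(state) for node, rule in RULES.items()}
    for node in knockdown:
        new[node] = False          # loss-of-function: clamp the node off
    return new

def run(state: State, steps: int = 10, knockdown=frozenset()) -> State:
    for _ in range(steps):
        state = step(state, knockdown)
    return state

start = {"Receptor": True, "Signal": False, "CyclinD1": False, "Inhib": True}
print("wild type attractor:", run(dict(start)))
print("receptor knockdown: ", run(dict(start), knockdown={"Receptor"}))
```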

  10. Persistent hyperthyroidism and de novo Graves' ophthalmopathy after total thyroidectomy.

    Science.gov (United States)

    Tay, Wei Lin; Loh, Wann Jia; Lee, Lianne Ai Ling; Chng, Chiaw Ling

    2017-01-01

    We report a patient with Graves' disease who remained persistently hyperthyroid after a total thyroidectomy and also developed de novo Graves' ophthalmopathy 5 months after surgery. She was subsequently found to have a mature cystic teratoma containing struma ovarii after undergoing a total hysterectomy and salpingo-oophorectomy for an incidental ovarian lesion. It is important to investigate for other causes of primary hyperthyroidism when thyrotoxicosis persists after total thyroidectomy. TSH receptor antibody may persist after total thyroidectomy and may potentially contribute to the development of de novo Graves' ophthalmopathy.

  11. Selecting Superior De Novo Transcriptome Assemblies: Lessons Learned by Leveraging the Best Plant Genome.

    Directory of Open Access Journals (Sweden)

    Loren A Honaas

    Full Text Available Whereas de novo assemblies of RNA-Seq data are being published for a growing number of species across the tree of life, there are currently no broadly accepted methods for evaluating such assemblies. Here we present a detailed comparison of 99 transcriptome assemblies, generated with 6 de novo assemblers including CLC, Trinity, SOAP, Oases, ABySS and NextGENe. Controlled analyses of de novo assemblies for Arabidopsis thaliana and Oryza sativa transcriptomes provide new insights into the strengths and limitations of transcriptome assembly strategies. We find that the leading assemblers generate reassuringly accurate assemblies for the majority of transcripts. At the same time, we find a propensity for assemblers to fail to fully assemble highly expressed genes. Surprisingly, the incidence of true chimeric assemblies is very low for all assemblers. Normalized libraries are reduced in highly abundant transcripts, but they also lack thousands of low-abundance transcripts. We conclude that the quality of de novo transcriptome assemblies is best assessed through consideration of a combination of metrics: (1) the proportion of reads mapping to an assembly, (2) recovery of conserved, widely expressed genes, (3) N50 length statistics, and (4) the total number of unigenes. We provide benchmark Illumina transcriptome data and introduce SCERNA, a broadly applicable modular protocol for de novo assembly improvement. Finally, our de novo assembly of the Arabidopsis leaf transcriptome revealed ~20 putative Arabidopsis genes missing from the current annotation.
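
    Since N50 is one of the metrics recommended above, here is a small self-contained function for it: N50 is the contig length at which the cumulative length of contigs, taken longest first, reaches half of the total assembly length. The example lengths are arbitrary.

```python
# N50 sketch: the contig length at which the running total (longest first)
# first reaches half of the total assembled length. Example lengths are arbitrary.
def n50(contig_lengths: list[int]) -> int:
    lengths = sorted(contig_lengths, reverse=True)
    half_total = sum(lengths) / 2.0
    running = 0
    for length in lengths:
        running += length
        if running >= half_total:
            return length
    return 0

contigs = [5000, 4000, 3000, 2000, 1000, 500, 500]
print("N50 =", n50(contigs))   # 4000 for this toy assembly
```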

  12. Role of de novo biosynthesis in ecosystem scale monoterpene emissions from a boreal Scots pine forest

    Directory of Open Access Journals (Sweden)

    R. Taipale

    2011-08-01

    Full Text Available Monoterpene emissions from Scots pine have traditionally been assumed to originate as evaporation from specialized storage pools. More recently, the significance of de novo emissions, originating directly from monoterpene biosynthesis, has been recognized. To study the role of biosynthesis at the ecosystem scale, we measured monoterpene emissions from a Scots pine dominated forest in southern Finland using the disjunct eddy covariance method combined with proton transfer reaction mass spectrometry. The interpretation of the measurements was based on a correlation analysis and a hybrid emission algorithm describing both de novo and pool emissions. During the measurement period May–August 2007, the monthly medians of daytime emissions were 200, 290, 180, and 200 μg m−2 h−1. The emissions were partly light dependent, probably due to de novo biosynthesis. The emission potential for both de novo and pool emissions exhibited a decreasing summertime trend. The ratio of the de novo emission potential to the total emission potential varied between 30 % and 46 %. Although the monthly changes were not significant, the ratio always differed statistically from zero, suggesting that the role of de novo biosynthesis was observable. Given the uncertainties in this study, we conclude that more accurate estimates of the contribution of de novo emissions are required for improving monoterpene emission algorithms for Scots pine dominated forests.
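
    For context, hybrid algorithms of this kind typically write the flux as a temperature-driven pool term plus a light- and temperature-driven synthesis term; the sketch below uses commonly cited Guenther-style activity factors with textbook default constants and is only a generic illustration, not the exact algorithm or the coefficients fitted in this study.

```python
# Hedged sketch of a hybrid monoterpene emission algorithm: pool evaporation term
# plus a light- and temperature-dependent de novo synthesis term. Parameter values
# are common literature defaults, not the coefficients fitted in this study.
import numpy as np

BETA = 0.09                           # K^-1, pool term temperature sensitivity
T_S = 303.15                          # K, standard temperature
ALPHA, C_L1 = 0.0027, 1.066           # light response constants
C_T1, C_T2, T_M = 95000.0, 230000.0, 314.0   # J/mol, J/mol, K
R = 8.314                             # J/(mol K)

def gamma_light(ppfd):
    return ALPHA * C_L1 * ppfd / np.sqrt(1.0 + ALPHA**2 * ppfd**2)

def gamma_temp(T):
    num = np.exp(C_T1 * (T - T_S) / (R * T_S * T))
    den = 1.0 + np.exp(C_T2 * (T - T_M) / (R * T_S * T))
    return num / den

def hybrid_emission(T, ppfd, E0_pool, E0_synth):
    """Flux in the same units as the emission potentials (e.g. ug m-2 h-1)."""
    pool = E0_pool * np.exp(BETA * (T - T_S))
    synth = E0_synth * gamma_light(ppfd) * gamma_temp(T)
    return pool + synth

print(hybrid_emission(T=298.15, ppfd=1000.0, E0_pool=140.0, E0_synth=60.0))
```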

  13. Computational neurorehabilitation: modeling plasticity and learning to predict recovery.

    Science.gov (United States)

    Reinkensmeyer, David J; Burdet, Etienne; Casadio, Maura; Krakauer, John W; Kwakkel, Gert; Lang, Catherine E; Swinnen, Stephan P; Ward, Nick S; Schweighofer, Nicolas

    2016-04-30

    Despite progress in using computational approaches to inform medicine and neuroscience in the last 30 years, there have been few attempts to model the mechanisms underlying sensorimotor rehabilitation. We argue that a fundamental understanding of neurologic recovery, and as a result accurate predictions at the individual level, will be facilitated by developing computational models of the salient neural processes, including plasticity and learning systems of the brain, and integrating them into a context specific to rehabilitation. Here, we therefore discuss Computational Neurorehabilitation, a newly emerging field aimed at modeling plasticity and motor learning to understand and improve movement recovery of individuals with neurologic impairment. We first explain how the emergence of robotics and wearable sensors for rehabilitation is providing data that make development and testing of such models increasingly feasible. We then review key aspects of plasticity and motor learning that such models will incorporate. We proceed by discussing how computational neurorehabilitation models relate to the current benchmark in rehabilitation modeling - regression-based, prognostic modeling. We then critically discuss the first computational neurorehabilitation models, which have primarily focused on modeling rehabilitation of the upper extremity after stroke, and show how even simple models have produced novel ideas for future investigation. Finally, we conclude with key directions for future research, anticipating that soon we will see the emergence of mechanistic models of motor recovery that are informed by clinical imaging results and driven by the actual movement content of rehabilitation therapy as well as wearable sensor-based records of daily activity.

  14. Applying a computer-aided scheme to detect a new radiographic image marker for prediction of chemotherapy outcome

    International Nuclear Information System (INIS)

    Wang, Yunzhi; Qiu, Yuchen; Thai, Theresa; Moore, Kathleen; Liu, Hong; Zheng, Bin

    2016-01-01

    To investigate the feasibility of automated segmentation of visceral and subcutaneous fat areas from computed tomography (CT) images of ovarian cancer patients and of applying the computed adiposity-related image features to predict chemotherapy outcome. A computerized image processing scheme was developed to segment visceral and subcutaneous fat areas, and compute adiposity-related image features. Then, logistic regression models were applied to analyze the association between the scheme-generated assessment scores and progression-free survival (PFS) of patients using a leave-one-case-out cross-validation method and a dataset involving 32 patients. The correlation coefficients between automated and radiologist’s manual segmentation of visceral and subcutaneous fat areas were 0.76 and 0.89, respectively. The scheme-generated prediction scores using adiposity-related radiographic image features were significantly associated with patients’ PFS (p < 0.01). Using a computerized scheme enables more efficient and robust segmentation of visceral and subcutaneous fat areas. The computed adiposity-related image features also have the potential to improve accuracy in predicting chemotherapy outcome.
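
    The leave-one-case-out evaluation described above can be written in a few lines; this is a generic sketch with synthetic features standing in for the adiposity-related measurements, not the authors' scheme or data.

```python
# Hedged sketch: leave-one-out cross-validated logistic regression, with synthetic
# features standing in for adiposity-related image features (not the study's data).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 32                                     # cohort size similar to the study
X = rng.normal(size=(n, 3))                # e.g. visceral fat, subcutaneous fat, ratio
y = (X[:, 0] + 0.5 * rng.normal(size=n) > 0).astype(int)   # synthetic outcome label

scores = np.empty(n)
for train_idx, test_idx in LeaveOneOut().split(X):
    model = LogisticRegression().fit(X[train_idx], y[train_idx])
    scores[test_idx] = model.predict_proba(X[test_idx])[:, 1]

print("leave-one-out AUC:", round(roc_auc_score(y, scores), 3))
```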

  15. An integrated computational validation approach for potential novel miRNA prediction

    Directory of Open Access Journals (Sweden)

    Pooja Viswam

    2017-12-01

    Full Text Available MicroRNAs (miRNAs) are short, non-coding RNAs of 17-24 bp in length that regulate gene expression by targeting mRNA molecules. The regulatory functions of miRNAs are known to be strongly associated with disease phenotypes such as cancer, as well as with cell signaling, cell division, growth and other metabolic processes. Novel miRNAs are defined as sequences that have no similarity to existing known sequences and lack any experimental evidence. In recent decades, the advent of next-generation sequencing has allowed us to capture small RNA molecules from cells and to develop methods to estimate their expression levels. Several computational algorithms are available to predict novel miRNAs from deep sequencing data. In this work, we integrated three novel miRNA prediction programs, miRDeep, miRanalyzer and miRPRo, to compare and validate their prediction efficiency. Dicer cleavage sites, alignment density, seed conservation, minimum free energy, AU-GC percentage, secondary loop scores, false discovery rates and confidence scores were considered for comparison and evaluation. The ability to identify isomiRs and base pair mismatches in a strand-specific manner was also considered in the computational validation. Further, the criteria and parameters for identifying the best possible novel miRNAs with minimal false positive rates were deduced.

  16. Motivation and emotion predict medical students' attention to computer-based feedback.

    Science.gov (United States)

    Naismith, Laura M; Lajoie, Susanne P

    2017-12-14

    Students cannot learn from feedback unless they pay attention to it. This study investigated relationships between the personal factors of achievement goal orientations, achievement emotions, and attention to feedback in BioWorld, a computer environment for learning clinical reasoning. Novice medical students (N = 28) completed questionnaires to measure their achievement goal orientations and then thought aloud while solving three endocrinology patient cases and reviewing corresponding expert solutions. Questionnaires administered after each case measured participants' experiences of five feedback emotions: pride, relief, joy, shame, and anger. Attention to individual text segments of the expert solutions was modelled using logistic regression and the method of generalized estimating equations. Participants did not attend to all of the feedback that was available to them. Performance-avoidance goals and shame positively predicted attention to feedback, and performance-approach goals and relief negatively predicted attention to feedback. Aspects of how the feedback was displayed also influenced participants' attention. Findings are discussed in terms of their implications for educational theory as well as the design and use of computer learning environments in medical education.

  17. De novo nonsense mutations in ASXL1 cause Bohring-Opitz syndrome

    DEFF Research Database (Denmark)

    Hoischen, Alexander; van Bon, Bregje W M; Rodríguez-Santiago, Benjamín

    2011-01-01

    Bohring-Opitz syndrome is characterized by severe intellectual disability, distinctive facial features and multiple congenital malformations. We sequenced the exomes of three individuals with Bohring-Opitz syndrome and in each identified heterozygous de novo nonsense mutations in ASXL1, which...... is required for maintenance of both activation and silencing of Hox genes. In total, 7 out of 13 subjects with a Bohring-Opitz phenotype had de novo ASXL1 mutations, suggesting that the syndrome is genetically heterogeneous....

  18. De novo triiodothyronine formation from thyrocytes activated by thyroid-stimulating hormone.

    Science.gov (United States)

    Citterio, Cintia E; Veluswamy, Balaji; Morgan, Sarah J; Galton, Valerie A; Banga, J Paul; Atkins, Stephen; Morishita, Yoshiaki; Neumann, Susanne; Latif, Rauf; Gershengorn, Marvin C; Smith, Terry J; Arvan, Peter

    2017-09-15

    The thyroid gland secretes primarily tetraiodothyronine (T4), and some triiodothyronine (T3). Under normal physiological circumstances, only one-fifth of circulating T3 is directly released by the thyroid, but in states of hyperactivation of thyroid-stimulating hormone receptors (TSHRs), patients develop a syndrome of relative T3 toxicosis. Thyroidal T4 production results from iodination of thyroglobulin (TG) at residues Tyr5 and Tyr130, whereas thyroidal T3 production may originate in several different ways. In this study, the data demonstrate that within the carboxyl-terminal portion of mouse TG, T3 is formed de novo independently of deiodination from T4. We found that upon iodination in vitro, de novo T3 formation in TG was decreased in mice lacking TSHRs. Conversely, de novo T3 that can be formed upon iodination of TG secreted from PCCL3 (rat thyrocyte) cells was augmented from cells previously exposed to increased TSH, a TSHR agonist, a cAMP analog, or a TSHR-stimulating antibody. We present data suggesting that TSH-stimulated TG phosphorylation contributes to enhanced de novo T3 formation. These effects were reversed within a few days after removal of the hyperstimulating conditions. Indeed, direct exposure of PCCL3 cells to human serum from two patients with Graves' disease, but not control sera, led to secretion of TG with an increased intrinsic ability to form T3 upon in vitro iodination. Furthermore, TG secreted from human thyrocyte cultures hyperstimulated with TSH also showed an increased intrinsic ability to form T3. Our data support the hypothesis that TG processing in the secretory pathway of TSHR-hyperstimulated thyrocytes alters the structure of the iodination substrate in a way that enhances de novo T3 formation, contributing to the relative T3 toxicosis of Graves' disease.

  19. On the Bayesian calibration of computer model mixtures through experimental data, and the design of predictive models

    Science.gov (United States)

    Karagiannis, Georgios; Lin, Guang

    2017-08-01

    For many real systems, several computer models may exist with different physics and predictive abilities. To achieve more accurate simulations/predictions, it is desirable for these models to be properly combined and calibrated. We propose the Bayesian calibration of computer model mixture method which relies on the idea of representing the real system output as a mixture of the available computer model outputs with unknown input dependent weight functions. The method builds a fully Bayesian predictive model as an emulator for the real system output by combining, weighting, and calibrating the available models in the Bayesian framework. Moreover, it fits a mixture of calibrated computer models that can be used by the domain scientist as a means to combine the available computer models, in a flexible and principled manner, and perform reliable simulations. It can address realistic cases where one model may be more accurate than the others at different input values because the mixture weights, indicating the contribution of each model, are functions of the input. Inference on the calibration parameters can consider multiple computer models associated with different physics. The method does not require knowledge of the fidelity order of the models. We provide a technique able to mitigate the computational overhead due to the consideration of multiple computer models, which is suited to the mixture model framework. We implement the proposed method in a real-world application involving the Weather Research and Forecasting large-scale climate model.
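
    As a bare-bones illustration of the underlying idea (not the authors' Bayesian machinery), the real output can be represented as an input-dependent convex combination of two model outputs plus a calibration offset and fitted to observations; everything below, including the two "models", is synthetic.

```python
# Hedged sketch: fit an input-dependent weight w(x) that mixes two computer model
# outputs to match observations. All models and data here are synthetic toys; the
# paper's method is fully Bayesian, this is only a least-squares illustration.
import numpy as np
from scipy.optimize import curve_fit

def model1(x):         # synthetic computer model 1 (better at small x)
    return np.sin(x)

def model2(x):         # synthetic computer model 2 (better at large x)
    return 0.9 * np.sin(x) + 0.3

def mixture(x, a, b, delta):
    """w(x) = sigmoid(a + b*x); prediction = w*model1 + (1-w)*model2 + delta."""
    w = 1.0 / (1.0 + np.exp(-(a + b * x)))
    return w * model1(x) + (1.0 - w) * model2(x) + delta

rng = np.random.default_rng(2)
x_obs = np.linspace(0, 6, 40)
truth = np.where(x_obs < 3, model1(x_obs), model2(x_obs))      # regime change
y_obs = truth + 0.05 * rng.normal(size=x_obs.size)

params, _ = curve_fit(mixture, x_obs, y_obs, p0=[0.0, 0.0, 0.0])
print("fitted (a, b, delta):", np.round(params, 3))
```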

  20. De Novo Design of Bioactive Small Molecules by Artificial Intelligence.

    Science.gov (United States)

    Merk, Daniel; Friedrich, Lukas; Grisoni, Francesca; Schneider, Gisbert

    2018-01-01

    Generative artificial intelligence offers a fresh view on molecular design. We present the first-time prospective application of a deep learning model for designing new druglike compounds with desired activities. For this purpose, we trained a recurrent neural network to capture the constitution of a large set of known bioactive compounds represented as SMILES strings. By transfer learning, this general model was fine-tuned on recognizing retinoid X and peroxisome proliferator-activated receptor agonists. We synthesized five top-ranking compounds designed by the generative model. Four of the compounds revealed nanomolar to low-micromolar receptor modulatory activity in cell-based assays. Apparently, the computational model intrinsically captured relevant chemical and biological knowledge without the need for explicit rules. The results of this study advocate generative artificial intelligence for prospective de novo molecular design, and demonstrate the potential of these methods for future medicinal chemistry. © 2018 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.
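
    To sketch the flavour of such a generative model, without any claim to reproduce the authors' network, training corpus or transfer-learning procedure, a character-level recurrent network over SMILES strings can be set up roughly as follows; the vocabulary and the three-molecule "corpus" are placeholders.

```python
# Hedged sketch: character-level LSTM over SMILES strings for generative modelling.
# Architecture, vocabulary and the tiny "corpus" are illustrative placeholders only.
import torch
import torch.nn as nn

corpus = ["CCO", "c1ccccc1", "CC(=O)O"]            # placeholder SMILES
chars = sorted({ch for smi in corpus for ch in smi}) + ["^", "$"]  # start/end tokens
stoi = {ch: i for i, ch in enumerate(chars)}

class SmilesLSTM(nn.Module):
    def __init__(self, vocab, emb=32, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab, emb)
        self.lstm = nn.LSTM(emb, hidden, batch_first=True)
        self.head = nn.Linear(hidden, vocab)

    def forward(self, x, state=None):
        h, state = self.lstm(self.embed(x), state)
        return self.head(h), state

model = SmilesLSTM(len(chars))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(50):                            # toy training loop
    for smi in corpus:
        seq = torch.tensor([[stoi[c] for c in "^" + smi + "$"]])
        logits, _ = model(seq[:, :-1])              # predict the next character
        loss = loss_fn(logits.reshape(-1, len(chars)), seq[:, 1:].reshape(-1))
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

print("final toy loss:", float(loss))
```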

  1. The impact of employee satisfaction on productivity in Tiskarna Novo mesto, Ltd.

    Directory of Open Access Journals (Sweden)

    Simona Cimperman

    2016-06-01

    Full Text Available Research Question: Does employee satisfaction affect productivity, and how are these two variables associated? What is the level of job satisfaction in Tiskarna Novo mesto, Ltd., and what needs to be done to make employees more satisfied at work and, consequently, more productive? Purpose: The purpose of the study is to determine which factors influence employee satisfaction in Tiskarna Novo mesto, Ltd. and to examine the connection between job satisfaction and employee productivity. The aim of the research is to assess the level of job satisfaction of employees in Tiskarna Novo mesto, Ltd. and to identify the reasons and factors that prevent employees from being satisfied in the workplace. Method: We used a descriptive method combined with a review of domestic and foreign literature, and surveyed the employees of Tiskarna Novo mesto, Ltd. Results: The survey among employees of Tiskarna Novo mesto, Ltd. showed that employees are moderately satisfied; the average job satisfaction score was 3.1 (on a 5-point Likert scale). The lowest-rated satisfaction factors were opportunities for advancement and for education. Receiving praise and awards as well as good interpersonal relations were found to increase job satisfaction, whereas conflict reduces it. No link between job satisfaction and productivity was found (r = -0.061). Organization: It is important for the organization and its managers to know which factors make employees satisfied or dissatisfied; the results of the research give managers a clear picture of the satisfaction/dissatisfaction factors and of opinions on productivity. Society: Job satisfaction means a lot to employees and, consequently, makes them more productive. Originality: The

  2. Interplay between De Novo Biosynthesis and Sequestration of Cyanogenic Glucosides in Arthropods

    DEFF Research Database (Denmark)

    Fürstenberg-Hägg, Joel

    (Zygaenidae, Lepidoptera) both sequester (take up and accumulate) the CNglcs linamarin and lotaustralin from their food plants (Fabacea) and biosynthesize them de novo from valine and isoleucine. The presented research demonstrates that de novo biosynthesis of CNglcs in Z. filipendulae is dependent...

  3. Prediction of velocity and attitude of a yacht sailing upwind by computational fluid dynamics

    Directory of Open Access Journals (Sweden)

    Heebum Lee

    2016-01-01

    Full Text Available One of the most important factors in sailing yacht design is accurate velocity prediction. Velocity prediction programs (VPPs) are widely used to predict the velocity of sailing yachts. VPPs, which are primarily based on experimental data and long years of experience, however, suffer limitations when applied in realistic conditions. Thus, in the present study, a high-fidelity velocity prediction method using computational fluid dynamics (CFD) was proposed. Using the developed method, the velocity and attitude of a 30-foot sloop yacht, which was developed by the Korea Research Institute of Ship and Ocean (KRISO) and termed KORDY30, were predicted in upwind sailing conditions.

  4. New Journalism: literary-factual boundaries in In Cold Blood and in Radical Chic

    Directory of Open Access Journals (Sweden)

    Francisco Aquinei Timóteo Queirós

    2012-12-01

    Full Text Available The research seeks to analyze how fact and fiction intertwine in the "movement" of New Journalism, based on the works In Cold Blood and Radical Chic and the New Journalism, by Truman Capote and Tom Wolfe, respectively. Through an investigation of the corpus under study, it aims to reveal the aspects that bring journalistic fact, news and reportage close to the literary techniques of the novel, the short story and the chronicle. The study examines New Journalism through the lens of central texts from literary theory and journalism studies, drawing on authors such as Mikhail Bakhtin, Hayden White, Paul Ricoeur and Muniz Sodré, and also referencing other writers who, like Tom Wolfe and Truman Capote, were part of a great renewal movement in literary journalism in the 1950s, 1960s and 1970s, generically called New Journalism.

  5. Demand for the main metals and new materials: a trend analysis

    OpenAIRE

    Wilson Trigueiro de Sousa

    1990-01-01

    Abstract: This work analyzes some trends in the area of new materials in an attempt to gain a better understanding of the repercussions of current technological innovations for the mineral sector. It first reviews the main studies on the changes that occurred around 1972/74 in the demand behaviour of the most important metals. Among the possible causes is technical progress, which made possible the emergence of new materials and the improvement of others in use...

  6. The immobilization of heavy metals in soil by bioaugmentation of a UV-mutant Bacillus subtilis 38 assisted by NovoGro biostimulation and changes of soil microbial community

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Ting [MOE Key Laboratory of Pollution Processes and Environmental Criteria, College of Environmental Science and Engineering, Nankai University, Tianjin 300071 (China); Urban Transport Emission Control Research Centre, College of Environmental Science and Engineering, Nankai University, Tianjin 300071 (China); Sun, Hongwen, E-mail: sunhongwen@nankai.edu.cn [MOE Key Laboratory of Pollution Processes and Environmental Criteria, College of Environmental Science and Engineering, Nankai University, Tianjin 300071 (China); Mao, Hongjun [Urban Transport Emission Control Research Centre, College of Environmental Science and Engineering, Nankai University, Tianjin 300071 (China); Zhang, Yanfeng; Wang, Cuiping; Zhang, Zhiyuan; Wang, Baolin; Sun, Lei [MOE Key Laboratory of Pollution Processes and Environmental Criteria, College of Environmental Science and Engineering, Nankai University, Tianjin 300071 (China)

    2014-08-15

    Highlights: • A UV-mutated species, Bacillus subtilis 38, is a good sorbent for multi-metals (Cd, Cr, Hg and Pb). • B38 mixed with NovoGro exhibited a synergetic effect on the immobilization of heavy metals in soil. • DTPA, M3 and BCR were suitable for predicting metal bioavailability for specific classes of plant. • The NovoGro could enhance the proliferation of both exotic B38 and native microbes. • It's a practical strategy for the remediation of actual farmland polluted by multi-heavy metals. - Abstract: Bacillus subtilis 38 (B38) is a mutant species of Bacillus subtilis acquired by UV irradiation with high cadmium tolerance. This study revealed that B38 was a good biosorbent for the adsorption of multiple heavy metals (cadmium, chromium, mercury, and lead). Simultaneous application of B38 and NovoGro (SNB) exhibited a synergetic effect on the immobilization of heavy metals in soil. The heavy metal concentrations in the edible part of the tested plants (lettuce, radish, and soybean) under SNB treatment decreased by 55.4–97.9% compared to the control. Three single extraction methods, diethylenetriaminepentaacetic acid (DTPA), Mehlich 3 (M3), and the first step of the Community Bureau of Reference method (BCR1), showed good predictive capacities for metal bioavailability to leafy, rhizome, and leguminous plant, respectively. The polymerase chain reaction–denaturing gradient gel electrophoresis (PCR–DGGE) profiles revealed that NovoGro could enhance the proliferation of both exotic B38 and native microbes. Finally, the technology was checked in the field, the reduction in heavy metal concentrations in the edible part of radish was in the range between 30.8% and 96.0% after bioremediation by SNB treatment. This study provides a practical strategy for the remediation of farmland contaminated by multiple heavy metals.

  7. The immobilization of heavy metals in soil by bioaugmentation of a UV-mutant Bacillus subtilis 38 assisted by NovoGro biostimulation and changes of soil microbial community

    International Nuclear Information System (INIS)

    Wang, Ting; Sun, Hongwen; Mao, Hongjun; Zhang, Yanfeng; Wang, Cuiping; Zhang, Zhiyuan; Wang, Baolin; Sun, Lei

    2014-01-01

    Highlights: • A UV-mutated species, Bacillus subtilis 38, is a good sorbent for multi-metals (Cd, Cr, Hg and Pb). • B38 mixed with NovoGro exhibited a synergetic effect on the immobilization of heavy metals in soil. • DTPA, M3 and BCR were suitable for predicting metal bioavailability for specific classes of plant. • The NovoGro could enhance the proliferation of both exotic B38 and native microbes. • It's a practical strategy for the remediation of actual farmland polluted by multi-heavy metals. - Abstract: Bacillus subtilis 38 (B38) is a mutant species of Bacillus subtilis acquired by UV irradiation with high cadmium tolerance. This study revealed that B38 was a good biosorbent for the adsorption of multiple heavy metals (cadmium, chromium, mercury, and lead). Simultaneous application of B38 and NovoGro (SNB) exhibited a synergetic effect on the immobilization of heavy metals in soil. The heavy metal concentrations in the edible part of the tested plants (lettuce, radish, and soybean) under SNB treatment decreased by 55.4–97.9% compared to the control. Three single extraction methods, diethylenetriaminepentaacetic acid (DTPA), Mehlich 3 (M3), and the first step of the Community Bureau of Reference method (BCR1), showed good predictive capacities for metal bioavailability to leafy, rhizome, and leguminous plant, respectively. The polymerase chain reaction–denaturing gradient gel electrophoresis (PCR–DGGE) profiles revealed that NovoGro could enhance the proliferation of both exotic B38 and native microbes. Finally, the technology was checked in the field, the reduction in heavy metal concentrations in the edible part of radish was in the range between 30.8% and 96.0% after bioremediation by SNB treatment. This study provides a practical strategy for the remediation of farmland contaminated by multiple heavy metals

  8. Impact of implementation choices on quantitative predictions of cell-based computational models

    Science.gov (United States)

    Kursawe, Jochen; Baker, Ruth E.; Fletcher, Alexander G.

    2017-09-01

    'Cell-based' models provide a powerful computational tool for studying the mechanisms underlying the growth and dynamics of biological tissues in health and disease. An increasing amount of quantitative data with cellular resolution has paved the way for the quantitative parameterisation and validation of such models. However, the numerical implementation of cell-based models remains challenging, and little work has been done to understand to what extent implementation choices may influence model predictions. Here, we consider the numerical implementation of a popular class of cell-based models called vertex models, which are often used to study epithelial tissues. In two-dimensional vertex models, a tissue is approximated as a tessellation of polygons and the vertices of these polygons move due to mechanical forces originating from the cells. Such models have been used extensively to study the mechanical regulation of tissue topology in the literature. Here, we analyse how the model predictions may be affected by numerical parameters, such as the size of the time step, and non-physical model parameters, such as length thresholds for cell rearrangement. We find that vertex positions and summary statistics are sensitive to several of these implementation parameters. For example, the predicted tissue size decreases with decreasing cell cycle durations, and cell rearrangement may be suppressed by large time steps. These findings are counter-intuitive and illustrate that model predictions need to be thoroughly analysed and implementation details carefully considered when applying cell-based computational models in a quantitative setting.
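
    The time-step sensitivity described above can be illustrated with a toy, overdamped relaxation problem. The sketch below is not the authors' vertex-model code; it is a minimal forward-Euler example (all parameter values are assumptions) showing how the integration step, a purely numerical choice, changes the state reached after a fixed simulated time.

        # Toy illustration (not the authors' vertex-model implementation): forward-Euler
        # relaxation of a pinned chain of "vertices" joined by springs, run to the same
        # simulated time with two different step sizes.
        import numpy as np

        def relax_chain(dt, total_time=5.0, n_vertices=10, rest_length=1.0,
                        stiffness=1.0, drag=1.0):
            """Overdamped relaxation dx/dt = F/drag integrated with forward Euler."""
            # Start from a uniformly compressed configuration.
            x = np.linspace(0.0, 0.5 * rest_length * (n_vertices - 1), n_vertices)
            for _ in range(int(total_time / dt)):
                force = np.zeros_like(x)
                for i in range(n_vertices - 1):          # spring force on edge (i, i+1)
                    f = stiffness * ((x[i + 1] - x[i]) - rest_length)
                    force[i] += f
                    force[i + 1] -= f
                x += dt * force / drag
                x[0] = 0.0                               # keep the first vertex pinned
            return x

        print("end-vertex position, dt = 0.01:", round(relax_chain(0.01)[-1], 3))
        print("end-vertex position, dt = 0.49:", round(relax_chain(0.49)[-1], 3))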

  9. Sequencing and de novo assembly of 150 genomes from Denmark as a population reference

    DEFF Research Database (Denmark)

    Maretty, Lasse; Jensen, Jacob Malte; Petersen, Bent

    2017-01-01

    ...or by performing local assembly. However, these approaches are biased against discovery of structural variants and variation in the more complex parts of the genome. Hence, large-scale de novo assembly is needed. Here we show that it is possible to construct excellent de novo assemblies from high-coverage sequencing with mate-pair libraries extending up to 20 kilobases. We report de novo assemblies of 150 individuals (50 trios) from the GenomeDenmark project. The quality of these assemblies is similar to those obtained using the more expensive long-read technology. We use the assemblies to identify a rich set...

  10. Computational modeling for prediction of the shear stress of three-dimensional isotropic and aligned fiber networks.

    Science.gov (United States)

    Park, Seungman

    2017-09-01

    Interstitial flow (IF) is a creeping flow through the interstitial space of the extracellular matrix (ECM). IF plays a key role in diverse biological functions, such as tissue homeostasis, cell function and behavior. Currently, most studies that have characterized IF have focused on the permeability of ECM or shear stress distribution on the cells, but less is known about the prediction of shear stress on the individual fibers or fiber networks despite its significance in the alignment of matrix fibers and cells observed in fibrotic or wound tissues. In this study, I developed a computational model to predict shear stress for different structured fibrous networks. To generate isotropic models, a random growth algorithm and a second-order orientation tensor were employed. Then, a three-dimensional (3D) solid model was created using computer-aided design (CAD) software for the aligned models (i.e., parallel, perpendicular and cubic models). Subsequently, a tetrahedral unstructured mesh was generated and flow solutions were calculated by solving equations for mass and momentum conservation for all models. Through the flow solutions, I estimated permeability using Darcy's law. Average shear stress (ASS) on the fibers was calculated by averaging the wall shear stress of the fibers. By using nonlinear surface fitting of permeability, viscosity, velocity, porosity and ASS, I devised new computational models. Overall, the developed models showed that higher porosity induced higher permeability, as previous empirical and theoretical models have shown. For comparison of the permeability, the present computational models matched well with previous models, which justifies our computational approach. ASS tended to increase linearly with respect to inlet velocity and dynamic viscosity, whereas permeability was almost the same. Finally, the developed model nicely predicted the ASS values that had been directly estimated from computational fluid dynamics (CFD). The present ...
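
    The Darcy's-law step mentioned above (estimating permeability from the computed flow solution) can be rearranged as k = Q·μ·L / (A·ΔP). A minimal sketch, with placeholder values rather than data from the study:

        # Minimal sketch of the Darcy's-law permeability estimate described above.
        # All numbers are illustrative placeholders, not values from the study.
        def darcy_permeability(flow_rate, viscosity, length, area, pressure_drop):
            """k = Q * mu * L / (A * dP), returned in m^2."""
            return flow_rate * viscosity * length / (area * pressure_drop)

        k = darcy_permeability(
            flow_rate=1.0e-9,      # m^3/s, volumetric flow through the fibre network
            viscosity=1.0e-3,      # Pa*s, dynamic viscosity of a water-like fluid
            length=100e-6,         # m, domain length in the flow direction
            area=(100e-6) ** 2,    # m^2, inlet cross-sectional area
            pressure_drop=10.0,    # Pa, pressure difference across the domain
        )
        print(f"estimated permeability: {k:.3e} m^2")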

  11. Hidden Hearing Loss and Computational Models of the Auditory Pathway: Predicting Speech Intelligibility Decline

    Science.gov (United States)

    2016-11-28

    Christopher J. Smalt. ...representation of speech intelligibility in noise. The auditory-periphery model of Zilany et al. (JASA 2009, 2014) is used to make predictions of auditory nerve (AN) responses to speech stimuli under a variety of difficult listening conditions. The resulting cochlear neurogram, a spectrogram ...

  12. Chapter 6 – Computer-Aided Molecular Design and Property Prediction

    DEFF Research Database (Denmark)

    Gani, Rafiqul; Zhang, L.; Kalakul, Sawitree

    2017-01-01

    Today's society needs many chemical-based products for its survival, nutrition, health, transportation, agriculture, and the functioning of processes. Chemical-based products have to be designed/developed in order to meet these needs, while at the same time, they must be innovative and sustainable ... for the initial stages of the design/development process. Therefore, computer-aided molecular design and property prediction techniques are two topics that play important roles in chemical product design, analysis, and application. In this chapter, an overview of the concepts, methods, and tools related to these two topics is given. In addition, a generic computer-aided framework for the design of molecules, mixtures, and blends is presented. The application of the framework is highlighted for molecular products through two case studies involving the design of refrigerants and surfactants.

  13. On the performance of de novo pathway enrichment

    DEFF Research Database (Denmark)

    Batra, Richa; Alcaraz, Nicolas; Gitzhofer, Kevin

    2017-01-01

    De novo pathway enrichment is a powerful approach to discover previously uncharacterized molecular mechanisms in addition to already known pathways. To achieve this, condition-specific functional modules are extracted from large interaction networks. Here, we give an overview of the state...

  14. Transferência do fator caturra para o cultivar Mundo Novo de Coffea arabica Transfer of the CT gene to Mundo Novo cultivar

    Directory of Open Access Journals (Sweden)

    A. Carvalho

    1972-01-01

    Full Text Available This paper reports studies aimed at introducing the Ct (caturra) gene, which reduces plant height, into the 'Mundo Novo' cultivar of Coffea arabica. The F1, F2, F3 and F4 populations were studied in yield trials. In these populations, and especially among the descendants of coffee plants H 2077-2-5 and H 2077-2-12, plants homozygous for the Ct alleles and for the alleles responsible for fruit color, xc or Xc, were selected. These combinations were named 'Catuaí Amarelo' and 'Catuaí Vermelho', respectively, and their characteristics are presented. The new cultivars have proved to be of economic interest for the coffee-growing regions not only for their small stature, but also for their productivity, vegetative vigor and earliness. The successful transfer of the Ct gene for short internode to the tall cultivar of Coffea arabica 'Mundo Novo' is reported. Individual selections were carried out in the F1, F2, F3 and F4 generations. It was found that early selection in the F2 generation was quite effective. A remarkably good correlation was found between the productivity of F2 plants and the yield of the F3 and F4 generations. Plants of the F4 generation have shown reasonable uniformity and high yield in several trials. The new selections proved to be early producers. Two new cultivars were released, namely 'Catuaí Amarelo' and 'Catuaí Vermelho'. The former has yellow fruits whereas the latter has red fruits. The plants are much shorter than those of 'Mundo Novo'. The new cultivars have very strong secondary and tertiary branching. Because of these characteristics, 'Catuaí Amarelo' and 'Catuaí Vermelho' are being planted on a large scale, replacing the tall cultivars.

  15. When less is more: 'slicing' sequencing data improves read decoding accuracy and de novo assembly quality.

    Science.gov (United States)

    Lonardi, Stefano; Mirebrahim, Hamid; Wanamaker, Steve; Alpert, Matthew; Ciardo, Gianfranco; Duma, Denisa; Close, Timothy J

    2015-09-15

    Since the invention of DNA sequencing in the 1970s, computational biologists have had to deal with the problem of de novo genome assembly with limited (or insufficient) depth of sequencing. In this work, we investigate the opposite problem, that is, the challenge of dealing with excessive depth of sequencing. We explore the effect of ultra-deep sequencing data in two domains: (i) the problem of decoding reads to bacterial artificial chromosome (BAC) clones (in the context of the combinatorial pooling design we have recently proposed), and (ii) the problem of de novo assembly of BAC clones. Using real ultra-deep sequencing data, we show that when the depth of sequencing increases over a certain threshold, sequencing errors make these two problems harder and harder (instead of easier, as one would expect with error-free data), and as a consequence the quality of the solution degrades with more and more data. For the first problem, we propose an effective solution based on 'divide and conquer': we 'slice' a large dataset into smaller samples of optimal size, decode each slice independently, and then merge the results. Experimental results on over 15 000 barley BACs and over 4000 cowpea BACs demonstrate a significant improvement in the quality of the decoding and the final assembly. For the second problem, we show for the first time that modern de novo assemblers cannot take advantage of ultra-deep sequencing data. Python scripts to process slices and resolve decoding conflicts are available from http://goo.gl/YXgdHT; the software Hashfilter can be downloaded from http://goo.gl/MIyZHs. Contact: stelo@cs.ucr.edu or timothy.close@ucr.edu. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  16. Computer program for prediction of the deposition of material released from fixed and rotary wing aircraft

    Science.gov (United States)

    Teske, M. E.

    1984-01-01

    This is a user manual for the computer code "AGDISP" (AGricultural DISPersal) which has been developed to predict the deposition of material released from fixed and rotary wing aircraft in a single-pass, computationally efficient manner. The formulation of the code is novel in that the mean particle trajectory and the variance about the mean resulting from turbulent fluid fluctuations are simultaneously predicted. The code presently includes the capability of assessing the influence of neutral atmospheric conditions, inviscid wake vortices, particle evaporation, plant canopy and terrain on the deposition pattern.

  17. Binding Mode and Induced Fit Predictions for Prospective Computational Drug Design.

    Science.gov (United States)

    Grebner, Christoph; Iegre, Jessica; Ulander, Johan; Edman, Karl; Hogner, Anders; Tyrchan, Christian

    2016-04-25

    Computer-aided drug design plays an important role in medicinal chemistry to obtain insights into molecular mechanisms and to prioritize design strategies. Although significant improvement has been made in structure based design, it still remains a key challenge to accurately model and predict induced fit mechanisms. Most of the currently available techniques either do not provide sufficient protein conformational sampling or are too computationally demanding to fit an industrial setting. The current study presents a systematic and exhaustive investigation of predicting binding modes for a range of systems using PELE (Protein Energy Landscape Exploration), an efficient and fast protein-ligand sampling algorithm. The systems analyzed (cytochrome P450, kinase, protease, and nuclear hormone receptor) exhibit different complexities of ligand induced fit mechanisms and protein dynamics. The results are compared with results from classical molecular dynamics simulations and (induced fit) docking. This study shows that ligand induced side chain rearrangements and smaller to medium backbone movements are captured well in PELE. Large secondary structure rearrangements, however, remain challenging for all employed techniques. Relevant binding modes (in terms of ligand heavy atom RMSD) could be reproduced by the PELE method within a few hours of simulation, positioning PELE as a tool applicable for rapid drug design cycles.

  18. Novo Jornalismo: fronteiras litero-factuais em A sangue Frio e em Radical Chique

    Directory of Open Access Journals (Sweden)

    Francisco Aquinei Timóteo Queirós

    2012-03-01

    Full Text Available http://dx.doi.org/10.5007/1984-784X.2012v12n18p130 The research analyzes how fact and fiction intertwine in the "movement" of New Journalism, based on the works A sangue Frio (In Cold Blood) and Radical Chique e o Novo Jornalismo (Radical Chic and the New Journalism), by Truman Capote and Tom Wolfe, respectively. From the investigation of the corpus under study, it aims to reveal the aspects that bring the journalistic fact, the news item and the feature report close to the literary techniques of the novel, the short story and the chronicle. The study investigates New Journalism through central texts from the fields of literary theory and journalism studies, drawing on authors such as Mikhail Bakhtin, Hayden White, Paul Ricoeur and Muniz Sodré, and also refers to other writers who, like Tom Wolfe and Truman Capote, were part of a great movement renewing literary journalism in the 1950s, 1960s and 1970s, known generically as New Journalism.

  19. Facebook - Um novo espaço autobiográfico?

    Directory of Open Access Journals (Sweden)

    Maria Tereza Lima

    2015-07-01

    Full Text Available The article "Facebook - Um novo espaço autobiográfico?" has as its central objective to investigate how the autobiographical and biographical perspective takes shape on a social network. Considering this new space for the externalization of memory, we analyze the choices a person makes when posting the most diverse textual genres on Facebook and examine to what extent such textual fragments narrate an individual's story. Which texts are posted? What was chosen and what was excluded from this profile? Does the author establish a reading pact with the reader? If we take into account that the texts posted on this social network are produced both by the author of the profile and by various other authors, how should we characterize these virtual spaces: as autobiographical or biographical? Is the virtual page written by the author of the profile or by multiple authors? With social networks, does a new model of autobiography and of the biographer emerge? These and many other questions guided our investigation and allowed us to learn a little more about the autobiographical strategies of contemporary virtual authors.

  20. De Novo Heart Failure After Kidney Transplantation: Trends in Incidence and Outcomes.

    Science.gov (United States)

    Lenihan, Colin R; Liu, Sai; Deswal, Anita; Montez-Rath, Maria E; Winkelmayer, Wolfgang C

    2018-03-29

    Heart failure is an important cause of morbidity and mortality following kidney transplantation. Some studies in the general population have shown that the incidence of heart failure has decreased during the past 20 years. However, it is not currently known whether such a trend exists in the kidney transplantation population. Retrospective observational cohort study. Adult patients included in the US Renal Data System who underwent their first kidney transplantation in the United States between 1998 and 2010 with at least 6 months of continuous Medicare parts A and B coverage before transplantation and no prior evidence for a diagnosis of heart failure before kidney transplantation. Calendar year of transplantation and calendar year of posttransplantation heart failure diagnosis. De novo posttransplantation heart failure defined using International Classification of Diseases, Ninth Revision diagnosis codes and mortality following de novo posttransplantation heart failure diagnosis. Secular trends in de novo post-kidney transplantation heart failure were examined using Cox proportional hazards analysis. Within a study cohort of 48,771 patients, 7,269 developed de novo heart failure within 3 years of kidney transplantation, with a median time to heart failure of 0.76 years. The adjusted HR for heart failure with death as competing risk comparing patients who underwent transplantation in 2010 with those who underwent transplantation in 1998 was 0.69 (95% CI, 0.60-0.79). No temporal trend in mortality following a diagnosis of post-kidney transplantation heart failure was observed. Potential residual confounding from either incorrectly ascertained or unavailable confounders. The cohort was limited to Medicare beneficiaries. Adjusted for demographic and clinical characteristics, the risk for developing de novo post-kidney transplantation heart failure has declined significantly between 1998 and 2010, with no apparent change in subsequent mortality. Copyright © 2018

  1. Clinical Prediction Models for Cardiovascular Disease: Tufts Predictive Analytics and Comparative Effectiveness Clinical Prediction Model Database.

    Science.gov (United States)

    Wessler, Benjamin S; Lai Yh, Lana; Kramer, Whitney; Cangelosi, Michael; Raman, Gowri; Lutz, Jennifer S; Kent, David M

    2015-07-01

    Clinical prediction models (CPMs) estimate the probability of clinical outcomes and hold the potential to improve decision making and individualize care. For patients with cardiovascular disease, there are numerous CPMs available although the extent of this literature is not well described. We conducted a systematic review for articles containing CPMs for cardiovascular disease published between January 1990 and May 2012. Cardiovascular disease includes coronary heart disease, heart failure, arrhythmias, stroke, venous thromboembolism, and peripheral vascular disease. We created a novel database and characterized CPMs based on the stage of development, population under study, performance, covariates, and predicted outcomes. There are 796 models included in this database. The number of CPMs published each year is increasing steadily over time. Seven hundred seventeen (90%) are de novo CPMs, 21 (3%) are CPM recalibrations, and 58 (7%) are CPM adaptations. This database contains CPMs for 31 index conditions, including 215 CPMs for patients with coronary artery disease, 168 CPMs for population samples, and 79 models for patients with heart failure. There are 77 distinct index/outcome pairings. Of the de novo models in this database, 450 (63%) report a c-statistic and 259 (36%) report some information on calibration. There is an abundance of CPMs available for a wide assortment of cardiovascular disease conditions, with substantial redundancy in the literature. The comparative performance of these models, the consistency of effects and risk estimates across models and the actual and potential clinical impact of this body of literature is poorly understood. © 2015 American Heart Association, Inc.

  2. Evaluation of computer imaging technique for predicting the SPAD readings in potato leaves

    Directory of Open Access Journals (Sweden)

    M.S. Borhan

    2017-12-01

    Full Text Available Facilitating non-contact measurement, a computer-imaging system was devised and evaluated to predict the chlorophyll content in potato leaves. A charge-coupled device (CCD) camera paired with two optical filters and a light chamber was used to acquire green-band (550 ± 40 nm) and red-band (700 ± 40 nm) images from the same leaf. Potato leaves from 15 plants differing in coloration (green to yellow) and age were selected for this study. Histogram-based image features, such as the means and variances of the green- and red-band images, were extracted from the histogram. Regression analyses demonstrated that the variations in SPAD meter reading could be explained by the mean gray and variances of gray scale values. The fitted least square models based on the mean gray scale levels were inversely related to the chlorophyll content of the potato leaf, with an R2 of 0.87 using a green-band image and an R2 of 0.79 using a red-band image. With the extracted four image features, the developed multiple linear regression model predicted the chlorophyll content with a high R2 of 0.88. The multiple regression model (using all features) provided an average prediction accuracy of 85.08% and a maximum accuracy of 99.8%. The prediction model using only the mean gray value of the red band showed an average accuracy of 81.6% with a maximum accuracy of 99.14%. Keywords: Computer imaging, Chlorophyll, SPAD meter, Regression, Prediction accuracy
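
    A minimal sketch of the kind of pipeline the abstract describes: extract the mean and variance of gray levels from the green- and red-band images and fit a multiple linear regression against SPAD readings. The arrays and coefficients below are synthetic placeholders, not the study's data.

        # Sketch of the image-feature/regression workflow (synthetic data, not the
        # study's leaf images or fitted coefficients).
        import numpy as np
        from numpy.linalg import lstsq

        def band_features(image):
            """Histogram-based features used in the abstract: mean and variance of gray levels."""
            return float(image.mean()), float(image.var())

        rng = np.random.default_rng(0)
        X, y = [], []
        for _ in range(15):                                 # a few synthetic "leaves"
            green = rng.integers(60, 200, size=(64, 64))    # stand-in for a 550 nm band image
            red = rng.integers(40, 180, size=(64, 64))      # stand-in for a 700 nm band image
            gm, gv = band_features(green)
            rm, rv = band_features(red)
            X.append([1.0, gm, gv, rm, rv])                 # intercept + four image features
            y.append(60.0 - 0.15 * gm - 0.05 * rm)          # fake SPAD value, inversely related to gray level

        coef, *_ = lstsq(np.asarray(X), np.asarray(y), rcond=None)
        predictions = np.asarray(X) @ coef
        print("fitted coefficients:", np.round(coef, 3))
        print("first leaf, predicted vs 'true' SPAD:", round(predictions[0], 2), round(y[0], 2))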

  3. A computational method to predict fluid-structure interaction of pressure relief valves

    Energy Technology Data Exchange (ETDEWEB)

    Kang, S. K.; Lee, D. H.; Park, S. K.; Hong, S. R. [Korea Electric Power Research Institute, Taejon (Korea, Republic of)

    2004-07-01

    An effective CFD (computational fluid dynamics) method to predict important performance parameters, such as blowdown and chattering, for pressure relief valves in NPPs is provided in the present study. To calculate the valve motion, a 6DOF (six-degree-of-freedom) model is used. A chimera overset grid method is utilized in this study to eliminate the grid remeshing problem when the disk moves. Further, CFD-Fastran, which is developed by CFD-RC for compressible flow analysis, is applied to a 1' safety valve. The prediction results ensure the applicability of the method presented in this study.

  4. Photoreactivation of conversion and de novo suppressor mutation in Escherichia coli

    Energy Technology Data Exchange (ETDEWEB)

    Bockrath, R C; Plamer, J E [Indiana Univ., Indianapolis (USA). Dept. of Microbiology

    1977-04-01

    Studies of mutagenesis and photoreactivation in various E. coli strains have shown that conversion mutation of a mutant containing an amber suppressor to one containing an ochre suppressor is sensitive to photoreactivation. Direct photoreactivation by photoreactivating light (PRL) after UV mutagenesis reduced mutation frequencies by a factor of about 2 for each minute of exposure during the first 5 to 8 min, for cells with normal repair capacity. Conversion and potential de novo suppressor mutations were about equally sensitive. For conversion, the sensitivities to PRL were identical in the repair-normal and excision-repair-deficient strains. For de novo suppressor mutation, the rate of mutation frequency reduction by PRL in the repair-deficient strain was about one-half that in the other strains. The results suggest that ultraviolet radiation produces both de novo suppressor mutation and conversion at the sup(E,B) locus by photoreversible pyrimidine dimers in the DNA. The causative dimers could be Thy-Cyt dimers in the transcribed strand or the non-transcribed strand, respectively.

  5. Prediction of sentinel lymph node status using single-photon emission computed tomography (SPECT)/computed tomography (CT) imaging of breast cancer.

    Science.gov (United States)

    Tomiguchi, Mai; Yamamoto-Ibusuki, Mutsuko; Yamamoto, Yutaka; Fujisue, Mamiko; Shiraishi, Shinya; Inao, Touko; Murakami, Kei-ichi; Honda, Yumi; Yamashita, Yasuyuki; Iyama, Ken-ichi; Iwase, Hirotaka

    2016-02-01

    Single-photon emission computed tomography (SPECT)/computed tomography (CT) improves the anatomical identification of sentinel lymph nodes (SNs). We aimed to evaluate the possibility of predicting the SN status using SPECT/CT. SN mapping using a SPECT/CT system was performed in 381 cases of clinically node-negative, operable invasive breast cancer. We evaluated and compared the values of SN mapping on SPECT/CT, the findings of other modalities and clinicopathological factors in predicting the SN status. Patients with SNs located in the Level I area were evaluated. Of the 355 lesions (94.8 %) assessed, six cases (1.6 %) were not detected using any imaging method. According to the final histological diagnosis, 298 lesions (78.2 %) were node negative and 83 lesions (21.7 %) were node positive. The univariate analysis showed that SN status was significantly correlated with the number of SNs detected on SPECT/CT in the Level I area (P = 0.0048), total number of SNs detected on SPECT/CT (P = 0.011), findings of planar lymphoscintigraphy (P = 0.011) and findings of a handheld gamma probe during surgery (P = 0.012). According to the multivariate analysis, the detection of multiple SNs on SPECT/CT imaging helped to predict SN metastasis. The number of SNs located in the Level I area detected using the SPECT/CT system may be a predictive factor for SN metastasis.

  6. The Key Drivers behind Novo Nordisk’s Growth in the Diabetes Market in China

    Directory of Open Access Journals (Sweden)

    Hind Louiza CHITOUR

    2013-12-01

    Full Text Available To enter the Chinese pharmaceutical market, "Big Pharma" has adopted different strategies to tackle the challenges specific to the country in terms of size, demographics, specific sales channels and logistics adjustments. While the majority of global pharmaceutical players have opted for an aggressive M&A approach to penetrate the Chinese market and gain local insight, the Danish Novo Nordisk has instead chosen a strategy focusing on innovation and on developing its R&D structure to capitalize on the local talent pool. To illustrate Novo Nordisk's growth strategy in the Mainland, we analyzed its competitiveness in the diabetes market by demonstrating the key drivers behind this success. We applied a varied set of tools for this research: interviews with Novo Nordisk and Dong Bao Pharmaceutical executives and personal observations account for the primary data; we also reviewed secondary data to perform a PEST analysis, in addition to Porter's competitive advantage model, in order to extract the reasons behind Novo Nordisk's ongoing success in the Mainland.

  7. User's Self-Prediction of Performance in Motor Imagery Brain-Computer Interface.

    Science.gov (United States)

    Ahn, Minkyu; Cho, Hohyun; Ahn, Sangtae; Jun, Sung C

    2018-01-01

    Performance variation is a critical issue in motor imagery brain-computer interface (MI-BCI), and various neurophysiological, psychological, and anatomical correlates have been reported in the literature. Although the main aim of such studies is to predict MI-BCI performance for the prescreening of poor performers, studies which focus on the user's sense of the motor imagery process and directly estimate MI-BCI performance through the user's self-prediction are lacking. In this study, we first test the user self-prediction idea on motor imagery experimental datasets. Fifty-two subjects participated in a classical, two-class motor imagery experiment and were asked to evaluate their easiness with motor imagery and to predict their own MI-BCI performance. During the motor imagery experiment, an electroencephalogram (EEG) was recorded; however, no feedback on motor imagery was given to subjects. From the EEG recordings, the offline classification accuracy was estimated and compared with several questionnaire scores of subjects, as well as with each subject's self-prediction of MI-BCI performance. The subjects' performance predictions during the motor imagery task showed a high positive correlation (r = 0.64) with actual MI-BCI performance, even without feedback information. This implies that the human brain is an active learning system and, by self-experiencing the endogenous motor imagery process, it can sense and adopt the quality of the process. Thus, it is believed that users may be able to predict MI-BCI performance, and the results may contribute to a better understanding of low performance and to advancing BCI.

  8. [Computational prediction of human immunodeficiency resistance to reverse transcriptase inhibitors].

    Science.gov (United States)

    Tarasova, O A; Filimonov, D A; Poroikov, V V

    2017-10-01

    Human immunodeficiency virus (HIV) causes acquired immunodeficiency syndrome (AIDS) and leads to over one million deaths annually. Highly active antiretroviral treatment (HAART) is a gold standard in HIV/AIDS therapy. Nucleoside and non-nucleoside inhibitors of HIV reverse transcriptase (RT) are an important component of HAART, but their effect depends on HIV susceptibility/resistance. HIV resistance mainly occurs due to mutations leading to conformational changes in the three-dimensional structure of HIV RT. The aim of our work was to develop and test a computational method for prediction of HIV resistance associated with mutations in HIV RT. Earlier we developed a method for prediction of HIV type 1 (HIV-1) resistance; it is based on the usage of position-specific descriptors. These descriptors are generated using the particular amino acid residue and its position; the position of a certain residue is determined in a multiple alignment. The training set consisted of more than 1900 sequences of HIV RT from the Stanford HIV Drug Resistance Database; for these HIV RT variants, experimental data on their resistance to ten inhibitors are presented. Balanced accuracy of prediction varies from 80% to 99% depending on the method of classification (support vector machine, Naive Bayes, random forest, convolutional neural networks) and the drug whose resistance is predicted. Maximal balanced accuracy was obtained for prediction of resistance to zidovudine, stavudine, didanosine and efavirenz by the random forest classifier. Average accuracy of prediction is 89%.
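
    A sketch of the position-specific-descriptor idea described above: one-hot encode the residue observed at each aligned RT position and train a random forest to separate resistant from susceptible variants. The sequences and labels below are toy placeholders, not data from the Stanford HIV Drug Resistance Database.

        # Toy sketch of position-specific descriptors + a random forest classifier
        # (synthetic aligned fragments, not Stanford HIVDB sequences).
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.preprocessing import OneHotEncoder

        # Aligned RT fragments of equal length; label 1 = resistant, 0 = susceptible.
        sequences = ["MKLVT", "MKLIT", "MRLVT", "MKLVA", "MRLIA", "MKLIA"]
        labels = [0, 0, 1, 0, 1, 1]

        # Position-specific descriptors: the amino acid at each alignment column.
        residues = np.array([list(s) for s in sequences])
        encoder = OneHotEncoder(handle_unknown="ignore")
        X = encoder.fit_transform(residues)

        clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, labels)
        query = encoder.transform(np.array([list("MRLVA")]))
        print("predicted resistance probability:", clf.predict_proba(query)[0, 1])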

  9. A sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, J. D. (Prostat, Mesa, AZ); Oberkampf, William Louis; Helton, Jon Craig (Arizona State University, Tempe, AZ); Storlie, Curtis B. (North Carolina State University, Raleigh, NC)

    2006-10-01

    Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.
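
    To make the sampling-based propagation concrete, the sketch below pushes interval-valued inputs (focal elements with basic probability assignments) through a toy model and accumulates belief and plausibility for an output threshold. The model, intervals, and masses are illustrative assumptions, not the strategy or application described in the report.

        # Illustrative sketch of sampling-based propagation of an evidence-theory
        # input specification through a cheap stand-in model.
        import itertools
        import numpy as np

        def model(x1, x2):
            return x1 ** 2 + x2                      # stand-in for an expensive simulation

        # Each input: list of (interval, mass) pairs; masses sum to 1 per input.
        x1_focal = [((0.0, 1.0), 0.6), ((0.5, 2.0), 0.4)]
        x2_focal = [((1.0, 2.0), 0.7), ((0.0, 3.0), 0.3)]
        threshold = 3.0                              # event of interest: output <= threshold

        rng = np.random.default_rng(1)
        belief = plausibility = 0.0
        for (iv1, m1), (iv2, m2) in itertools.product(x1_focal, x2_focal):
            # Sample within the joint focal element to bound the model output over it.
            out = model(rng.uniform(*iv1, size=2000), rng.uniform(*iv2, size=2000))
            mass = m1 * m2
            if out.max() <= threshold:               # focal element entirely inside the event
                belief += mass
            if out.min() <= threshold:               # focal element intersects the event
                plausibility += mass

        print(f"Bel(output <= {threshold}) ~ {belief:.2f}, Pl ~ {plausibility:.2f}")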

  10. Predicting the Pullout Capacity of Small Ground Anchors Using Nonlinear Integrated Computing Techniques

    Directory of Open Access Journals (Sweden)

    Mosbeh R. Kaloop

    2017-01-01

    Full Text Available This study investigates predicting the pullout capacity of small ground anchors using nonlinear computing techniques. Input-output prediction models based on the nonlinear Hammerstein-Wiener (NHW) approach and on delay inputs for the adaptive neurofuzzy inference system (DANFIS) are developed and utilized to predict the pullout capacity. The results of the developed models are compared with previous studies that used artificial neural networks and least square support vector machine techniques for the same case study. The in situ data collection and statistical performances are used to evaluate the models' performance. Results show that the developed models enhance the precision of predicting the pullout capacity when compared with previous studies. Also, the DANFIS model performance is proven to be better than that of the other models used to predict the pullout capacity of ground anchors.

  11. Identifying wrong assemblies in de novo short read primary ...

    Indian Academy of Sciences (India)

    2016-08-05

    Most of these assemblies are done using some de novo short read assemblers and other related approaches. ... benchmarking projects like Assemblathon 1, Assemblathon ... from a large insert library (at least 1000 bases).

  12. Analysis of 60 706 Exomes Questions the Role of De Novo Variants Previously Implicated in Cardiac Disease

    DEFF Research Database (Denmark)

    Paludan-Müller, Christian; Ahlberg, Gustav; Ghouse, Jonas

    2017-01-01

    BACKGROUND: De novo variants in the exome occur at a rate of 1 per individual per generation, and because of the low reproductive fitness for de novo variants causing severe disease, the likelihood of finding these as standing variations in the general population is low. Therefore, this study sought to evaluate the pathogenicity of de novo variants previously associated with cardiac disease based on a large population-representative exome database. METHODS AND RESULTS: We performed a literature search for previous publications on de novo variants associated with severe arrhythmias ... trio studies (>1000 subjects). Of the monogenic variants, 11% (23/211) were present in ExAC, whereas 26% (802/3050) of variants believed to increase susceptibility to disease were identified in ExAC. Monogenic de novo variants in ExAC had a total allele count of 109, with ≈844 expected cases in Ex...

  13. Similar prognosis of transformed and de novo diffuse large B-cell lymphomas in patients treated with immunochemotherapy.

    Science.gov (United States)

    Sorigue, Marc; Garcia, Olga; Baptista, Maria Joao; Sancho, Juan-Manuel; Tapia, Gustavo; Mate, José Luis; Feliu, Evarist; Navarro, José-Tomás; Ribera, Josep-Maria

    2017-03-22

    The prognosis of diffuse large B-cell lymphomas (DLBCL) transformed from indolent lymphoma (TL) has been considered poorer than that of de novo DLBCL. However, it seems to have improved since the introduction of rituximab. We compared the characteristics (including the cell-of-origin), and the prognosis of 29 patients with TL and 101 with de novo DLBCL treated with immunochemotherapy. Patients with TL and de novo DLBCL had similar characteristics. All TL cases evolving from follicular lymphoma were germinal-center B-cell-like, while those TL from marginal zone lymphoma or chronic lymphocytic leukemia were non-germinal-center B-cell-like. The complete response rate was similar in TL and de novo DLBCL (62 vs. 66%, P=.825). The 5-year overall and progression-free survival probabilities (95% CI) were 59% (40-78) and 41% (22-60) for TL and 63% (53-73) and 60% (50-70) for de novo DLBCL, respectively (P=.732 for overall survival and P=.169 for progression-free survival). In this study, the prognosis of TL and de novo DLBCL treated with immunochemotherapy was similar. The role of intensification with stem cell transplantation in the management of TL may be questionable in the rituximab era. Copyright © 2016 Elsevier España, S.L.U. All rights reserved.

  14. A Bayesian approach for parameter estimation and prediction using a computationally intensive model

    International Nuclear Information System (INIS)

    Higdon, Dave; McDonnell, Jordan D; Schunck, Nicolas; Sarich, Jason; Wild, Stefan M

    2015-01-01

    Bayesian methods have been successful in quantifying uncertainty in physics-based problems in parameter estimation and prediction. In these cases, physical measurements y are modeled as the best fit of a physics-based model η(θ), where θ denotes the uncertain, best input setting. Hence the statistical model is of the form y = η(θ) + ϵ, where ϵ accounts for measurement, and possibly other, error sources. When nonlinearity is present in η(⋅), the resulting posterior distribution for the unknown parameters in the Bayesian formulation is typically complex and nonstandard, requiring computationally demanding approaches such as Markov chain Monte Carlo (MCMC) to produce multivariate draws from the posterior. Although generally applicable, MCMC requires thousands (or even millions) of evaluations of the physics model η(⋅). This requirement is problematic if the model takes hours or days to evaluate. To overcome this computational bottleneck, we present an approach adapted from Bayesian model calibration. This approach combines output from an ensemble of computational model runs with physical measurements, within a statistical formulation, to carry out inference. A key component of this approach is a statistical response surface, or emulator, estimated from the ensemble of model runs. We demonstrate this approach with a case study in estimating parameters for a density functional theory model, using experimental mass/binding energy measurements from a collection of atomic nuclei. We also demonstrate how this approach produces uncertainties in predictions for recent mass measurements obtained at Argonne National Laboratory. (paper)
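
    The emulator-based approach described above can be sketched in a few lines: fit a Gaussian-process surrogate to a small ensemble of model runs, then evaluate the posterior over the input on a grid instead of calling the expensive model inside MCMC. Everything below (the toy model, noise level, flat prior) is an illustrative assumption, not the density functional theory application of the paper.

        # Minimal sketch of emulator-based Bayesian calibration on a toy model.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        def expensive_model(theta):
            return theta ** 2 + 0.5 * theta          # stand-in for eta(theta)

        # Ensemble of model runs at a few design points (the costly part, done once).
        design = np.linspace(0.0, 2.0, 8).reshape(-1, 1)
        runs = expensive_model(design).ravel()
        emulator = GaussianProcessRegressor(kernel=RBF(length_scale=0.5)).fit(design, runs)

        # Physical measurement y = eta(theta_true) + measurement error.
        theta_true, sigma = 1.3, 0.05
        y_obs = expensive_model(theta_true) + np.random.default_rng(0).normal(0.0, sigma)

        # Posterior over theta on a grid, using the emulator instead of the model.
        grid = np.linspace(0.0, 2.0, 400).reshape(-1, 1)
        mu, std = emulator.predict(grid, return_std=True)
        var = sigma ** 2 + std ** 2                  # measurement plus emulator uncertainty
        log_post = -0.5 * (y_obs - mu) ** 2 / var - 0.5 * np.log(var)   # flat prior on theta
        post = np.exp(log_post - log_post.max())
        post /= post.sum()
        print("posterior mean of theta:", float((grid.ravel() * post).sum()))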

  15. Applying Machine Learning and High Performance Computing to Water Quality Assessment and Prediction

    OpenAIRE

    Ruijian Zhang; Deren Li

    2017-01-01

    Water quality assessment and prediction is an increasingly important issue. Traditional approaches either take a lot of time or can only perform assessment. In this research, by applying a machine learning algorithm to a long period of water attribute data, we can generate a decision tree that predicts the next day's water quality in an easy and efficient way. The idea is to combine the traditional approaches and computer algorithms. Using machine learning algorithms, the assessment ...
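
    A minimal sketch of the decision-tree idea described above, learning from past days' water attributes to predict the next day's quality class. The attribute names and records are invented for illustration; they are not the study's dataset.

        # Sketch of the decision-tree approach (synthetic records, made-up attributes).
        from sklearn.tree import DecisionTreeClassifier

        # Each record: [pH, dissolved oxygen (mg/L), turbidity (NTU)]; label 1 = acceptable.
        history = [
            [7.1, 8.2, 3.0], [6.8, 7.9, 4.1], [7.4, 8.5, 2.2], [6.2, 5.1, 9.8],
            [6.0, 4.8, 11.3], [7.2, 8.0, 3.5], [5.9, 4.5, 12.0], [7.3, 8.4, 2.8],
        ]
        quality = [1, 1, 1, 0, 0, 1, 0, 1]

        tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(history, quality)
        tomorrow = [[6.5, 5.5, 8.0]]     # attributes used to predict the next day's class
        print("predicted quality class:", tree.predict(tomorrow)[0])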

  16. Morphometric analysis - Cone beam computed tomography to predict bone quality and quantity.

    Science.gov (United States)

    Hohlweg-Majert, B; Metzger, M C; Kummer, T; Schulze, D

    2011-07-01

    Modified quantitative computed tomography is a method used to predict bone quality and quantify the bone mass of the jaw. The aim of this study was to determine whether bone quantity or quality was detected by cone beam computed tomography (CBCT) combined with image analysis. MATERIALS AND PROCEDURES: Different measurements recorded on two phantoms (Siemens phantom, Comac phantom) were evaluated on images taken with the Somatom VolumeZoom (Siemens Medical Solutions, Erlangen, Germany) and the NewTom 9000 (NIM s.r.l., Verona, Italy) in order to calculate a calibration curve. The spatial relationships of six sample cylinders and the repositioning from four pig skull halves relative to adjacent defined anatomical structures were assessed by means of three-dimensional visualization software. The calibration curves for computed tomography (CT) and cone beam computed tomography (CBCT) using the Siemens phantom showed a linear correlation in both modalities between the Hounsfield units (HU) and bone morphology. A correction factor for CBCT was calculated. Exact information about the micromorphology of the bone cylinders was only available using micro-computed tomography. Cone beam computed tomography is a suitable choice for analysing bone mass, but it does not give any information about bone quality. 2010 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  17. A Pareto Algorithm for Efficient De Novo Design of Multi-functional Molecules.

    Science.gov (United States)

    Daeyaert, Frits; Deem, Micheal W

    2017-01-01

    We have introduced a Pareto sorting algorithm into Synopsis, a de novo design program that generates synthesizable molecules with desirable properties. We give a detailed description of the algorithm and illustrate its working in 2 different de novo design settings: the design of putative dual and selective FGFR and VEGFR inhibitors, and the successful design of organic structure determining agents (OSDAs) for the synthesis of zeolites. We show that the introduction of Pareto sorting not only enables the simultaneous optimization of multiple properties but also greatly improves the performance of the algorithm to generate molecules with hard-to-meet constraints. This in turn allows us to suggest approaches to address the problem of false positive hits in de novo structure based drug design by introducing structural and physicochemical constraints in the designed molecules, and by forcing essential interactions between these molecules and their target receptor. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
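
    The Pareto-sorting step itself is easy to sketch: keep only the candidate molecules that are not dominated on any design objective. The score tuples below are invented examples (say, predicted activity against two targets); this is not the Synopsis code.

        # Minimal sketch of Pareto (non-dominated) sorting over two objectives;
        # larger scores are better.  Molecule names and scores are invented.
        def dominates(a, b):
            """a dominates b if it is at least as good everywhere and strictly better somewhere."""
            return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

        def pareto_front(candidates):
            """Return the non-dominated subset of (name, scores) candidates."""
            return [
                (name, scores) for name, scores in candidates
                if not any(dominates(other, scores) for _, other in candidates if other != scores)
            ]

        molecules = [("mol_A", (0.9, 0.2)), ("mol_B", (0.6, 0.7)),
                     ("mol_C", (0.5, 0.5)), ("mol_D", (0.3, 0.9))]
        print(pareto_front(molecules))   # mol_C is dominated by mol_B and drops out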

  18. The predictive value of single-photon emission computed tomography/computed tomography for sentinel lymph node localization in head and neck cutaneous malignancy.

    Science.gov (United States)

    Remenschneider, Aaron K; Dilger, Amanda E; Wang, Yingbing; Palmer, Edwin L; Scott, James A; Emerick, Kevin S

    2015-04-01

    Preoperative localization of sentinel lymph nodes in head and neck cutaneous malignancies can be aided by single-photon emission computed tomography/computed tomography (SPECT/CT); however, its true predictive value for identifying lymph nodes intraoperatively remains unquantified. This study aims to understand the sensitivity, specificity, and positive and negative predictive values of SPECT/CT in sentinel lymph node biopsy for cutaneous malignancies of the head and neck. Blinded retrospective imaging review with comparison to intraoperative gamma probe confirmed sentinel lymph nodes. A consecutive series of patients with a head and neck cutaneous malignancy underwent preoperative SPECT/CT followed by sentinel lymph node biopsy with a gamma probe. Two nuclear medicine physicians, blinded to clinical data, independently reviewed each SPECT/CT. Activity within radiographically defined nodal basins was recorded and compared to intraoperative gamma probe findings. Sensitivity, specificity, and negative and positive predictive values were calculated with subgroup stratification by primary tumor site. Ninety-two imaging reads were performed on 47 patients with cutaneous malignancy who underwent SPECT/CT followed by sentinel lymph node biopsy. Overall sensitivity was 73%, specificity 92%, positive predictive value 54%, and negative predictive value 96%. The predictive ability of SPECT/CT to identify the basin or an adjacent basin containing the single hottest node was 92%. SPECT/CT overestimated uptake by an average of one nodal basin. In the head and neck, SPECT/CT has higher reliability for primary lesions of the eyelid, scalp, and cheek. SPECT/CT has high sensitivity, specificity, and negative predictive value, but may overestimate relevant nodal basins in sentinel lymph node biopsy. © 2014 The American Laryngological, Rhinological and Otological Society, Inc.
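
    For readers less familiar with the reported metrics, the sketch below shows how sensitivity, specificity, and the predictive values follow from a basin-level confusion matrix. The counts are placeholders chosen only to illustrate the arithmetic, not the study's data.

        # How the reported metrics relate to a confusion matrix of nodal basins
        # (placeholder counts, not the study's data).
        def diagnostic_metrics(tp, fp, fn, tn):
            return {
                "sensitivity": tp / (tp + fn),   # share of truly positive basins flagged on SPECT/CT
                "specificity": tn / (tn + fp),   # share of truly negative basins correctly excluded
                "ppv": tp / (tp + fp),           # positive predictive value
                "npv": tn / (tn + fn),           # negative predictive value
            }

        print(diagnostic_metrics(tp=40, fp=34, fn=15, tn=390))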

  19. Rapid centriole assembly in Naegleria reveals conserved roles for both de novo and mentored assembly.

    Science.gov (United States)

    Fritz-Laylin, Lillian K; Levy, Yaron Y; Levitan, Edward; Chen, Sean; Cande, W Zacheus; Lai, Elaine Y; Fulton, Chandler

    2016-03-01

    Centrioles are eukaryotic organelles whose number and position are critical for cilia formation and mitosis. Many cell types assemble new centrioles next to existing ones ("templated" or mentored assembly). Under certain conditions, centrioles also form without pre-existing centrioles (de novo). The synchronous differentiation of Naegleria amoebae to flagellates represents a unique opportunity to study centriole assembly, as nearly 100% of the population transitions from having no centrioles to having two within minutes. Here, we find that Naegleria forms its first centriole de novo, immediately followed by mentored assembly of the second. We also find both de novo and mentored assembly distributed among all major eukaryote lineages. We therefore propose that both modes are ancestral and have been conserved because they serve complementary roles, with de novo assembly as the default when no pre-existing centriole is available, and mentored assembly allowing precise regulation of number, timing, and location of centriole assembly. © 2016 Wiley Periodicals, Inc.

  20. A human-specific de novo protein-coding gene associated with human brain functions.

    Directory of Open Access Journals (Sweden)

    Chuan-Yun Li

    2010-03-01

    Full Text Available To understand whether any human-specific new genes may be associated with human brain functions, we computationally screened the genetic vulnerability factors identified through Genome-Wide Association Studies and linkage analyses of nicotine addiction and found one human-specific de novo protein-coding gene, FLJ33706 (alternative gene symbol C20orf203). Cross-species analysis revealed interesting evolutionary paths of how this gene had originated from noncoding DNA sequences: insertion of repeat elements, especially Alu, contributed to the formation of the first coding exon and six standard splice junctions on the branch leading to humans and chimpanzees, and two subsequent substitutions in the human lineage escaped two stop codons and created an open reading frame of 194 amino acids. We experimentally verified FLJ33706's mRNA and protein expression in the brain. Real-Time PCR in multiple tissues demonstrated that FLJ33706 was most abundantly expressed in brain. Human polymorphism data suggested that FLJ33706 encodes a protein under purifying selection. A specifically designed antibody detected its protein expression across human cortex, cerebellum and midbrain. Immunohistochemistry study in normal human brain cortex revealed the localization of FLJ33706 protein in neurons. Elevated expression of FLJ33706 was detected in Alzheimer's brain samples, suggesting a role of this novel gene in the human-specific pathogenesis of Alzheimer's disease. FLJ33706 provided the strongest evidence so far that human-specific de novo genes can have protein-coding potential and differential protein expression, and be involved in human brain functions.

  1. Predicting knee replacement damage in a simulator machine using a computational model with a consistent wear factor.

    Science.gov (United States)

    Zhao, Dong; Sakoda, Hideyuki; Sawyer, W Gregory; Banks, Scott A; Fregly, Benjamin J

    2008-02-01

    Wear of ultrahigh molecular weight polyethylene remains a primary factor limiting the longevity of total knee replacements (TKRs). However, wear testing on a simulator machine is time consuming and expensive, making it impractical for iterative design purposes. The objectives of this paper were first, to evaluate whether a computational model using a wear factor consistent with the TKR material pair can predict accurate TKR damage measured in a simulator machine, and second, to investigate how choice of surface evolution method (fixed or variable step) and material model (linear or nonlinear) affect the prediction. An iterative computational damage model was constructed for a commercial knee implant in an AMTI simulator machine. The damage model combined a dynamic contact model with a surface evolution model to predict how wear plus creep progressively alter tibial insert geometry over multiple simulations. The computational framework was validated by predicting wear in a cylinder-on-plate system for which an analytical solution was derived. The implant damage model was evaluated for 5 million cycles of simulated gait using damage measurements made on the same implant in an AMTI machine. Using a pin-on-plate wear factor for the same material pair as the implant, the model predicted tibial insert wear volume to within 2% error and damage depths and areas to within 18% and 10% error, respectively. Choice of material model had little influence, while inclusion of surface evolution affected damage depth and area but not wear volume predictions. Surface evolution method was important only during the initial cycles, where variable step was needed to capture rapid geometry changes due to the creep. Overall, our results indicate that accurate TKR damage predictions can be made with a computational model using a constant wear factor obtained from pin-on-plate tests for the same material pair, and furthermore, that surface evolution method matters only during the initial
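
    The iterative damage loop described above can be sketched with an Archard-type law: the depth removed over a block of cycles is the wear factor times the contact pressure times the sliding distance, after which the surface (and hence the pressure) is updated. The numbers and the crude pressure update below are assumptions for illustration, not the authors' contact model.

        # Illustrative Archard-type wear loop with a crude surface/pressure update
        # between blocks of gait cycles (placeholder values, not the paper's model).
        wear_factor = 1.0e-7      # mm^3/(N*m), pin-on-plate value for the material pair (assumed)
        pressure = 10.0           # N/mm^2, initial average contact pressure (assumed)
        sliding_per_cycle = 0.02  # m of sliding per gait cycle (assumed)
        cycles_per_block = 5.0e5  # update the surface every half-million cycles
        n_blocks = 10             # 5 million cycles in total, as in the simulator test

        depth = 0.0
        for _ in range(n_blocks):
            depth += wear_factor * pressure * sliding_per_cycle * cycles_per_block
            # Surface evolution stand-in: as the insert wears, conformity increases and
            # the average contact pressure drops slightly before the next block.
            pressure *= 0.97

        print(f"predicted linear wear depth after 5 million cycles: {depth:.3f} mm")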

  2. De novo assembly of the perennial ryegrass transcriptome using an RNA-seq strategy

    DEFF Research Database (Denmark)

    Farrell, Jacqueline Danielle; Byrne, Stephen; Paina, Cristiana

    2014-01-01

    ...a homozygous perennial ryegrass genotype can circumvent the challenge of heterozygosity. The goals of this study were to perform RNA-sequencing on multiple tissues from a highly inbred genotype to develop a reference transcriptome. This was complemented with RNA-sequencing of a highly heterozygous genotype for SNP calling. Results: De novo transcriptome assembly of the inbred genotype created 185,833 transcripts with an average length of 830 base pairs. Within the inbred reference transcriptome, 78,560 predicted open reading frames were found, of which 24,434 were predicted as complete. Functional annotation ... multiple orthologs. Using the longest unique open reading frames as the reference sequence, 64,242 single nucleotide polymorphisms were found. One thousand and sixty-one open reading frames from the inbred genotype contained heterozygous sites, confirming the high degree of homozygosity. Conclusion: Our study ...

  3. Cloud computing approaches for prediction of ligand binding poses and pathways.

    Science.gov (United States)

    Lawrenz, Morgan; Shukla, Diwakar; Pande, Vijay S

    2015-01-22

    We describe an innovative protocol for ab initio prediction of ligand crystallographic binding poses and highly effective analysis of large datasets generated for protein-ligand dynamics. We include a procedure for setup and performance of distributed molecular dynamics simulations on cloud computing architectures, a model for efficient analysis of simulation data, and a metric for evaluation of model convergence. We give accurate binding pose predictions for five ligands ranging in affinity from 7 nM to > 200 μM for the immunophilin protein FKBP12, for expedited results in cases where experimental structures are difficult to produce. Our approach goes beyond single, low energy ligand poses to give quantitative kinetic information that can inform protein engineering and ligand design.

  4. Hominoid-specific de novo protein-coding genes originating from long non-coding RNAs.

    Directory of Open Access Journals (Sweden)

    Chen Xie

    2012-09-01

    Full Text Available Tinkering with pre-existing genes has long been known as a major way to create new genes. Recently, however, motherless protein-coding genes have been found to have emerged de novo from ancestral non-coding DNAs. How these genes originated is not well addressed to date. Here we identified 24 hominoid-specific de novo protein-coding genes with precise origination timing in vertebrate phylogeny. Strand-specific RNA-Seq analyses were performed in five rhesus macaque tissues (liver, prefrontal cortex, skeletal muscle, adipose, and testis, which were then integrated with public transcriptome data from human, chimpanzee, and rhesus macaque. On the basis of comparing the RNA expression profiles in the three species, we found that most of the hominoid-specific de novo protein-coding genes encoded polyadenylated non-coding RNAs in rhesus macaque or chimpanzee with a similar transcript structure and correlated tissue expression profile. According to the rule of parsimony, the majority of these hominoid-specific de novo protein-coding genes appear to have acquired a regulated transcript structure and expression profile before acquiring coding potential. Interestingly, although the expression profile was largely correlated, the coding genes in human often showed higher transcriptional abundance than their non-coding counterparts in rhesus macaque. The major findings we report in this manuscript are robust and insensitive to the parameters used in the identification and analysis of de novo genes. Our results suggest that at least a portion of long non-coding RNAs, especially those with active and regulated transcription, may serve as a birth pool for protein-coding genes, which are then further optimized at the transcriptional level.

  5. De novo mutation in the dopamine transporter gene associates dopamine dysfunction with autism spectrum disorder

    DEFF Research Database (Denmark)

    Hamilton, P J; Campbell, N G; Sharma, S

    2013-01-01

    De novo genetic variation is an important class of risk factors for autism spectrum disorder (ASD). Recently, whole-exome sequencing of ASD families has identified a novel de novo missense mutation in the human dopamine (DA) transporter (hDAT) gene, which results in a Thr to Met substitution...

  6. Novel de novo BRCA2 mutation in a patient with a family history of breast cancer

    DEFF Research Database (Denmark)

    Hansen, Thomas V O; Bisgaard, Marie Luise; Jønson, Lars

    2008-01-01

    BACKGROUND: BRCA2 germ-line mutations predispose to breast and ovarian cancer. Mutations are widespread and unclassified splice variants are frequently encountered. We describe the parental origin and functional characterization of a novel de novo BRCA2 splice site mutation found in a patient ... whole blood. The paternity was determined by single nucleotide polymorphism (SNP) microarray analysis. Parental origin of the de novo mutation was determined by establishing mutation-SNP haplotypes by variant-specific PCR, while de novo and mosaic status was investigated by sequencing of DNA from ... and synthesis of a truncated BRCA2 protein. The aberrant splicing was verified by RT-PCR analysis on RNA isolated from whole blood of the affected patient. The mutation was not found in any of the patient's parents or in the mother's carcinoma, showing it is a de novo mutation. Variant-specific PCR indicates ...

  7. De novo assembly of highly diverse viral populations

    Directory of Open Access Journals (Sweden)

    Yang Xiao

    2012-09-01

    Full Text Available Background: Extensive genetic diversity in viral populations within infected hosts and the divergence of variants from existing reference genomes impede the analysis of deep viral sequencing data. A de novo population consensus assembly is valuable both as a single linear representation of the population and as a backbone on which intra-host variants can be accurately mapped. The availability of consensus assemblies and robustly mapped variants are crucial to the genetic study of viral disease progression, transmission dynamics, and viral evolution. Existing de novo assembly techniques fail to robustly assemble ultra-deep sequence data from genetically heterogeneous populations such as viruses into full-length genomes due to the presence of extensive genetic variability, contaminants, and variable sequence coverage. Results: We present VICUNA, a de novo assembly algorithm suitable for generating consensus assemblies from genetically heterogeneous populations. We demonstrate its effectiveness on Dengue, Human Immunodeficiency and West Nile viral populations, representing a range of intra-host diversity. Compared to state-of-the-art assemblers designed for haploid or diploid systems, VICUNA recovers full-length consensus and captures insertion/deletion polymorphisms in diverse samples. Final assemblies maintain a high base calling accuracy. The VICUNA program is publicly available at: http://www.broadinstitute.org/scientific-community/science/projects/viral-genomics/viral-genomics-analysis-software. Conclusions: We developed VICUNA, a publicly available software tool that enables consensus assembly of ultra-deep sequence data derived from diverse viral populations. While VICUNA was developed for the analysis of viral populations, its application to other heterogeneous sequence data sets, such as metagenomic or tumor cell population samples, may prove beneficial in these fields of research.

  8. Application of Soft Computing Techniques and Multiple Regression Models for CBR prediction of Soils

    Directory of Open Access Journals (Sweden)

    Fatimah Khaleel Ibrahim

    2017-08-01

    Full Text Available Soft computing techniques such as Artificial Neural Networks (ANN) have improved predictive capability and have found application in geotechnical engineering. The aim of this research is to utilize soft computing techniques and Multiple Regression Models (MLR) for forecasting the California bearing ratio (CBR) of soil from its index properties. The CBR of a soil can be predicted from various soil-characterizing parameters with the assistance of MLR and ANN methods. The database was collected in the laboratory by conducting tests on 86 soil samples gathered from different projects in Basrah districts. Data gained from the experimental results were used in the regression models and in soft computing techniques using artificial neural networks. The liquid limit, plasticity index, modified compaction test and the CBR test were determined. In this work, different ANN and MLR models were formulated with different collections of inputs in order to recognize their significance in the prediction of CBR. The strengths of the developed models were examined in terms of the regression coefficient (R2), relative error (RE%) and mean square error (MSE) values. From the results of this paper, it was noticed that all the proposed ANN models perform better than the MLR model, and the ANN model with all input parameters reveals better outcomes than the other ANN models.
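
    As a hedged illustration of the comparison described above, the sketch below fits a multiple linear regression and a small neural network to synthetic stand-in soil data (the four index properties and the coefficients are invented, not the 86 Basrah samples) and scores both with R2 and MSE; scikit-learn is assumed to be available.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(0)
# Synthetic stand-in data: [liquid limit, plasticity index, max dry density,
# optimum moisture content] -> CBR (%). Not the real 86-sample data set.
X = rng.uniform([20.0, 5.0, 1.6, 8.0], [60.0, 30.0, 2.1, 20.0], size=(86, 4))
y = 40 - 0.3 * X[:, 0] - 0.4 * X[:, 1] + 15 * X[:, 2] - 0.5 * X[:, 3]
y = y + rng.normal(0.0, 1.0, size=86)                  # measurement noise

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)
models = {
    "MLR": LinearRegression(),
    "ANN": make_pipeline(StandardScaler(),
                         MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                                      random_state=1)),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    print(f"{name}: R2 = {r2_score(y_te, pred):.3f}, "
          f"MSE = {mean_squared_error(y_te, pred):.3f}")
```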

  9. Spaced Seed Data Structures for De Novo Assembly

    Directory of Open Access Journals (Sweden)

    Inanç Birol

    2015-01-01

    Full Text Available De novo assembly of the genome of a species is essential in the absence of a reference genome sequence. Many scalable assembly algorithms use the de Bruijn graph (DBG) paradigm to reconstruct genomes, where a table of subsequences of a certain length is derived from the reads, and their overlaps are analyzed to assemble sequences. Although longer subsequences unlock longer genomic features for assembly, the associated increase in compute resources limits the practicability of DBG over other assembly archetypes already designed for longer reads. Here, we revisit the DBG paradigm to adapt it to the changing sequencing technology landscape and introduce three data structure designs for spaced seeds in the form of paired subsequences. These data structures address memory and run time constraints imposed by longer reads. We observe that when a fixed distance separates seed pairs, it provides increased sequence specificity with increased gap length. Further, we note that Bloom filters would be suitable to implicitly store spaced seeds and be tolerant to sequencing errors. Building on this concept, we describe a data structure for tracking the frequencies of observed spaced seeds. These data structure designs will have applications in genome, transcriptome and metagenome assemblies, and read error correction.
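
    A minimal sketch of the central data-structure idea, extracting paired subsequences separated by a fixed gap from each read and tracking their frequencies, is shown below. The seed and gap lengths and the plain dictionary counter are illustrative stand-ins; the designs described in the paper use Bloom filters and related probabilistic structures to keep memory bounded and tolerate sequencing errors.

```python
from collections import Counter

def spaced_seeds(read, k=4, gap=3):
    """Yield paired k-mers ('spaced seeds') separated by a fixed gap."""
    span = 2 * k + gap
    for i in range(len(read) - span + 1):
        left = read[i:i + k]
        right = read[i + k + gap:i + span]
        yield left + right            # the gap bases are deliberately ignored

def count_spaced_seeds(reads, k=4, gap=3):
    """Frequency table of spaced seeds; a Bloom filter or counting Bloom
    filter could replace this Counter to bound memory."""
    counts = Counter()
    for read in reads:
        counts.update(spaced_seeds(read, k, gap))
    return counts

reads = ["ACGTACGTTGCAACGT", "CGTACGTTGCAACGTA"]
for seed, n in count_spaced_seeds(reads).most_common(3):
    print(seed, n)
```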

  10. Computer-Aided Drug Discovery in Plant Pathology.

    Science.gov (United States)

    Shanmugam, Gnanendra; Jeon, Junhyun

    2017-12-01

    Control of plant diseases is largely dependent on the use of agrochemicals. However, there are widening gaps between our knowledge of plant diseases gained from genetic/mechanistic studies and rapid translation of that knowledge into target-oriented development of effective agrochemicals. Here we propose that the time is ripe for computer-aided drug discovery/design (CADD) in molecular plant pathology. CADD has played a pivotal role in the development of medically important molecules over the last three decades. Now, the explosive increase in information on genome sequences and three-dimensional structures of biological molecules, in combination with advances in computational and informational technologies, opens up exciting possibilities for application of CADD in the discovery and development of agrochemicals. In this review, we outline two categories of drug discovery strategies, structure- and ligand-based CADD, and the relevant computational approaches that are being employed in modern drug discovery. To help readers dive into CADD, we explain the concepts of homology modelling, molecular docking, virtual screening, and de novo ligand design in structure-based CADD, and pharmacophore modelling, ligand-based virtual screening, quantitative structure-activity relationship modelling and de novo ligand design for ligand-based CADD. We also provide the important resources available to carry out CADD. Finally, we present a case study showing how a CADD approach can be implemented in practice for the identification of potent chemical compounds against the important plant pathogens Pseudomonas syringae and Colletotrichum gloeosporioides.

  11. The neural correlates of problem states: testing FMRI predictions of a computational model of multitasking.

    Directory of Open Access Journals (Sweden)

    Jelmer P Borst

    Full Text Available BACKGROUND: It has been shown that people can only maintain one problem state, or intermediate mental representation, at a time. When more than one problem state is required, for example in multitasking, performance decreases considerably. This effect has been explained in terms of a problem state bottleneck. METHODOLOGY: In the current study we use the complementary methodologies of computational cognitive modeling and neuroimaging to investigate the neural correlates of this problem state bottleneck. In particular, an existing computational cognitive model was used to generate a priori fMRI predictions for a multitasking experiment in which the problem state bottleneck plays a major role. Hemodynamic responses were predicted for five brain regions, corresponding to five cognitive resources in the model. Most importantly, we predicted the intraparietal sulcus to show a strong effect of the problem state manipulations. CONCLUSIONS: Some of the predictions were confirmed by a subsequent fMRI experiment, while others were not matched by the data. The experiment supported the hypothesis that the problem state bottleneck is a plausible cause of the interference in the experiment and that it could be located in the intraparietal sulcus.

  12. Multiple-Swarm Ensembles: Improving the Predictive Power and Robustness of Predictive Models and Its Use in Computational Biology.

    Science.gov (United States)

    Alves, Pedro; Liu, Shuang; Wang, Daifeng; Gerstein, Mark

    2018-01-01

    Machine learning is an integral part of computational biology, and has already shown its use in various applications, such as prognostic tests. In the last few years in the non-biological machine learning community, ensembling techniques have shown their power in data mining competitions such as the Netflix challenge; however, such methods have not found wide use in computational biology. In this work, we endeavor to show how ensembling techniques can be applied to practical problems, including problems in the field of bioinformatics, and how they often outperform other machine learning techniques in both predictive power and robustness. Furthermore, we develop a methodology of ensembling, Multi-Swarm Ensemble (MSWE) by using multiple particle swarm optimizations and demonstrate its ability to further enhance the performance of ensembles.

  13. Computational intelligence models to predict porosity of tablets using minimum features

    Directory of Open Access Journals (Sweden)

    Khalid MH

    2017-01-01

    Full Text Available Mohammad Hassan Khalid,1 Pezhman Kazemi,1 Lucia Perez-Gandarillas,2 Abderrahim Michrafy,2 Jakub Szlęk,1 Renata Jachowicz,1 Aleksander Mendyk1 1Department of Pharmaceutical Technology and Biopharmaceutics, Faculty of Pharmacy, Jagiellonian University Medical College, Krakow, Poland; 2Centre National de la Recherche Scientifique, Centre RAPSODEE, Mines Albi, Université de Toulouse, Albi, France Abstract: The effects of different formulations and manufacturing process conditions on the physical properties of a solid dosage form are of importance to the pharmaceutical industry. It is vital to have in-depth understanding of the material properties and governing parameters of its processes in response to different formulations. Understanding the mentioned aspects will allow tighter control of the process, leading to implementation of quality-by-design (QbD) practices. Computational intelligence (CI) offers an opportunity to create empirical models that can be used to describe the system and predict future outcomes in silico. CI models can help explore the behavior of input parameters, unlocking deeper understanding of the system. This research endeavor presents CI models to predict the porosity of tablets created by roll-compacted binary mixtures, which were milled and compacted under systematically varying conditions. CI models were created using tree-based methods, artificial neural networks (ANNs), and symbolic regression trained on an experimental data set and screened using root-mean-square error (RMSE) scores. The experimental data were composed of proportion of microcrystalline cellulose (MCC) (in percentage), granule size fraction (in micrometers), and die compaction force (in kilonewtons) as inputs and porosity as an output. The resulting models show impressive generalization ability, with ANNs (normalized root-mean-square error [NRMSE] =1%) and symbolic regression (NRMSE =4%) as the best-performing methods, also exhibiting reliable predictive
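
    To illustrate how such minimum-feature porosity models can be screened by normalized RMSE, the sketch below trains a tree-based regressor on synthetic stand-in data with the three inputs named in the abstract (MCC proportion, granule size fraction, compaction force); the functional form and all numbers are invented for demonstration, and scikit-learn is assumed.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(42)
# Inputs: MCC proportion (%), granule size fraction (um), compaction force (kN).
X = rng.uniform([0.0, 100.0, 5.0], [100.0, 800.0, 25.0], size=(200, 3))
# Invented relationship: porosity falls with compaction force, rises with MCC.
porosity = 0.35 + 0.0008 * X[:, 0] - 0.006 * X[:, 2] + rng.normal(0.0, 0.01, 200)

X_tr, X_te, y_tr, y_te = train_test_split(X, porosity, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
pred = model.predict(X_te)

rmse = mean_squared_error(y_te, pred) ** 0.5
nrmse = rmse / (y_te.max() - y_te.min())   # normalised RMSE used for screening
print(f"RMSE = {rmse:.4f}, NRMSE = {nrmse:.1%}")
```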

  14. Automated interpretable computational biology in the clinic: a framework to predict disease severity and stratify patients from clinical data

    Directory of Open Access Journals (Sweden)

    Soumya Banerjee

    2017-10-01

    Full Text Available We outline an automated computational and machine learning framework that predicts disease severity and stratifies patients. We apply our framework to available clinical data. Our algorithm automatically generates insights and predicts disease severity with minimal operator intervention. The computational framework presented here can be used to stratify patients, predict disease severity and propose novel biomarkers for disease. Insights from machine learning algorithms coupled with clinical data may help guide therapy, personalize treatment and help clinicians understand the change in disease over time. Computational techniques like these can be used in translational medicine in close collaboration with clinicians and healthcare providers. Our models are also interpretable, allowing clinicians with minimal machine learning experience to engage in model building. This work is a step towards automated machine learning in the clinic.

  15. A glance at quality score: implication for de novo transcriptome reconstruction of Illumina reads

    Directory of Open Access Journals (Sweden)

    Stanley Kimbung Mbandi

    2014-02-01

    Full Text Available Downstream analyses of short-reads from next-generation sequencing platforms are often preceded by a pre-processing step that removes uncalled and wrongly called bases. Standard approaches rely on their associated base quality scores to retain the read or a portion of it when the score is above a predefined threshold. It is difficult to differentiate sequencing error from biological variation without a reference using quality scores. The effects of quality score-based trimming have not been systematically studied in de novo transcriptome assembly. Using RNA-Seq data produced from Illumina, we teased out the effects of quality score-based filtering or trimming on de novo transcriptome reconstruction. We showed that assemblies produced from reads subjected to different quality score thresholds contain truncated and missing transfrags when compared to those from untrimmed reads. Our data support the fact that de novo assembling of untrimmed data is challenging for de Bruijn graph assemblers. However, our results indicate that comparing the assemblies from untrimmed and trimmed read subsets can suggest appropriate filtering parameters and enable selection of the optimum de novo transcriptome assembly in non-model organisms.
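
    The pre-processing step under study, trimming a read at the first base whose Phred quality falls below a chosen threshold, can be sketched as follows; the threshold and the minimal FASTQ handling are illustrative, and the paper's point is precisely that different thresholds materially change the resulting de novo assemblies.

```python
def phred_scores(quality_string, offset=33):
    """Convert a FASTQ quality string (Phred+33 by default) to integer scores."""
    return [ord(c) - offset for c in quality_string]

def trim_read(seq, qual, threshold=20):
    """Trim the read at the first base whose quality drops below the threshold."""
    for i, q in enumerate(phred_scores(qual)):
        if q < threshold:
            return seq[:i], qual[:i]
    return seq, qual

seq  = "ACGTACGTGGCA"
qual = "IIIIIIII##II"      # 'I' encodes quality 40, '#' encodes quality 2
print(trim_read(seq, qual, threshold=20))   # -> ('ACGTACGT', 'IIIIIIII')
```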

  16. Accurate prediction of the toxicity of benzoic acid compounds in mice via oral without using any computer codes

    International Nuclear Information System (INIS)

    Keshavarz, Mohammad Hossein; Gharagheizi, Farhad; Shokrolahi, Arash; Zakinejad, Sajjad

    2012-01-01

    Highlights: ► A novel method is introduced for desk calculation of toxicity of benzoic acid derivatives. ► There is no need to use QSAR and QSTR methods, which are based on computer codes. ► The predicted results of 58 compounds are more reliable than those predicted by the QSTR method. ► The present method gives good predictions for further 324 benzoic acid compounds. - Abstract: Most benzoic acid derivatives are toxic, which may cause serious public health and environmental problems. Two novel, simple and reliable models are introduced for desk calculation of the toxicity of benzoic acid compounds in mice via oral LD50, with as much reliance on their answers as one could attach to more complex outputs. They require only elemental composition and molecular fragments without using any computer codes. The first model is based on only the number of carbon and hydrogen atoms, which can be improved by several molecular fragments in the second model. For 57 benzoic compounds, where the computed results of quantitative structure–toxicity relationship (QSTR) were recently reported, the predicted results of the two simple models of the present method are more reliable than the QSTR computations. The present simple method is also tested with a further 324 benzoic acid compounds including complex molecular structures, which confirms the good forecasting ability of the second model.

  17. Efficient assembly of de novo human artificial chromosomes from large genomic loci

    Directory of Open Access Journals (Sweden)

    Stromberg Gregory

    2005-07-01

    Full Text Available Abstract Background Human Artificial Chromosomes (HACs) are potentially useful vectors for gene transfer studies and for functional annotation of the genome because of their suitability for cloning, manipulating and transferring large segments of the genome. However, development of HACs for the transfer of large genomic loci into mammalian cells has been limited by difficulties in manipulating high-molecular weight DNA, as well as by the low overall frequencies of de novo HAC formation. Indeed, to date, only a small number of large (>100 kb) genomic loci have been reported to be successfully packaged into de novo HACs. Results We have developed novel methodologies to enable efficient assembly of HAC vectors containing any genomic locus of interest. We report here the creation of a novel, bimolecular system based on bacterial artificial chromosomes (BACs) for the construction of HACs incorporating any defined genomic region. We have utilized this vector system to rapidly design, construct and validate multiple de novo HACs containing large (100–200 kb) genomic loci including therapeutically significant genes for human growth hormone (HGH), polycystic kidney disease (PKD1) and ß-globin. We report significant differences in the ability of different genomic loci to support de novo HAC formation, suggesting possible effects of cis-acting genomic elements. Finally, as a proof of principle, we have observed sustained ß-globin gene expression from HACs incorporating the entire 200 kb ß-globin genomic locus for over 90 days in the absence of selection. Conclusion Taken together, these results are significant for the development of HAC vector technology, as they enable high-throughput assembly and functional validation of HACs containing any large genomic locus. We have evaluated the impact of different genomic loci on the frequency of HAC formation and identified segments of genomic DNA that appear to facilitate de novo HAC formation. These genomic loci...

  18. From structure prediction to genomic screens for novel non-coding RNAs.

    Science.gov (United States)

    Gorodkin, Jan; Hofacker, Ivo L

    2011-08-01

    Non-coding RNAs (ncRNAs) are receiving more and more attention not only as an abundant class of genes, but also as regulatory structural elements (some located in mRNAs). A key feature of RNA function is its structure. Computational methods were developed early for folding and prediction of RNA structure with the aim of assisting in functional analysis. With the discovery of more and more ncRNAs, it has become clear that a large fraction of these are highly structured. Interestingly, a large part of the structure is comprised of regular Watson-Crick and GU wobble base pairs. This and the increased amount of available genomes have made it possible to employ structure-based methods for genomic screens. The field has moved from folding prediction of single sequences to computational screens for ncRNAs in genomic sequence using the RNA structure as the main characteristic feature. Whereas early methods focused on energy-directed folding of single sequences, comparative analysis based on structure preserving changes of base pairs has been efficient in improving accuracy, and today this constitutes a key component in genomic screens. Here, we cover the basic principles of RNA folding and touch upon some of the concepts in current methods that have been applied in genomic screens for de novo RNA structures in searches for novel ncRNA genes and regulatory RNA structure on mRNAs. We discuss the strengths and weaknesses of the different strategies and how they can complement each other.
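
    The energy-directed folding of single sequences mentioned above is often introduced through the simpler Nussinov base-pair maximization recursion; the sketch below is that textbook simplification (maximizing nested Watson-Crick and GU pairs by dynamic programming), not the thermodynamic or comparative methods used in actual genomic screens.

```python
def nussinov_max_pairs(seq, min_loop=3):
    """Maximum number of nested base pairs (Watson-Crick plus GU wobble)."""
    pairs = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"), ("G", "U"), ("U", "G")}
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for span in range(min_loop + 1, n):          # fill shorter spans first
        for i in range(n - span):
            j = i + span
            best = dp[i + 1][j]                  # base i left unpaired
            if (seq[i], seq[j]) in pairs:
                best = max(best, dp[i + 1][j - 1] + 1)   # i pairs with j
            for k in range(i + 1, j):            # bifurcation into two sub-structures
                best = max(best, dp[i][k] + dp[k + 1][j])
            dp[i][j] = best
    return dp[0][n - 1]

print(nussinov_max_pairs("GGGAAAUCC"))   # small hairpin-like toy sequence
```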

  19. De novo synthesis of milk triglycerides in humans

    Science.gov (United States)

    Mammary gland (MG) de novo lipogenesis contributes significantly to milk fat in animals but little is known in humans. Objective: To test the hypothesis that the incorporation of 13C carbons from [U-13C]glucose into fatty acids (FA) and glycerol in triglycerides (TG) will be greater: 1) in milk tha...

  20. 76 FR 68767 - Draft Guidance for Industry and Food and Drug Administration Staff; De Novo Classification...

    Science.gov (United States)

    2011-11-07

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2011-D-0689] Draft Guidance for Industry and Food and Drug Administration Staff; De Novo Classification Process... for Industry and Food and Drug Administration Staff; De Novo Classification Process (Evaluation of...

  1. Drug repurposing: translational pharmacology, chemistry, computers and the clinic.

    Science.gov (United States)

    Issa, Naiem T; Byers, Stephen W; Dakshanamurthy, Sivanesan

    2013-01-01

    The process of discovering a pharmacological compound that elicits a desired clinical effect with minimal side effects is a challenge. Prior to the advent of high-performance computing and large-scale screening technologies, drug discovery was largely a serendipitous endeavor, as in the case of thalidomide for erythema nodosum leprosum or cancer drugs in general derived from flora located in far-reaching geographic locations. More recently, de novo drug discovery has become a more rationalized process where drug-target-effect hypotheses are formulated on the basis of already known compounds/protein targets and their structures. Although this approach is hypothesis-driven, the actual success has been very low, contributing to the soaring costs of research and development as well as the diminished pharmaceutical pipeline in the United States. In this review, we discuss the evolution in computational pharmacology as the next generation of successful drug discovery and implementation in the clinic where high-performance computing (HPC) is used to generate and validate drug-target-effect hypotheses completely in silico. The use of HPC would decrease development time and errors while increasing productivity prior to in vitro, animal and human testing. We highlight approaches in chemoinformatics, bioinformatics as well as network biopharmacology to illustrate potential avenues from which to design clinically efficacious drugs. We further discuss the implications of combining these approaches into an integrative methodology for high-accuracy computational predictions within the context of drug repositioning for the efficient streamlining of currently approved drugs back into clinical trials for possible new indications.

  2. De novo insertions and deletions of predominantly paternal origin are associated with autism spectrum disorder

    Science.gov (United States)

    Dong, Shan; Walker, Michael F.; Carriero, Nicholas J.; DiCola, Michael; Willsey, A. Jeremy; Ye, Adam Y.; Waqar, Zainulabedin; Gonzalez, Luis E.; Overton, John D.; Frahm, Stephanie; Keaney, John F.; Teran, Nicole A.; Dea, Jeanselle; Mandell, Jeffrey D.; Bal, Vanessa Hus; Sullivan, Catherine A.; DiLullo, Nicholas M.; Khalil, Rehab O.; Gockley, Jake; Yuksel, Zafer; Sertel, Sinem M.; Ercan-Sencicek, A. Gulhan; Gupta, Abha R.; Mane, Shrikant M.; Sheldon, Michael; Brooks, Andrew I.; Roeder, Kathryn; Devlin, Bernie; State, Matthew W.; Wei, Liping; Sanders, Stephan J.

    2014-01-01

    SUMMARY Whole-exome sequencing (WES) studies have demonstrated the contribution of de novo loss-of-function single nucleotide variants to autism spectrum disorders (ASD). However, challenges in the reliable detection of de novo insertions and deletions (indels) have limited inclusion of these variants in prior analyses. Through the application of a robust indel detection method to WES data from 787 ASD families (2,963 individuals), we demonstrate that de novo frameshift indels contribute to ASD risk (OR=1.6; 95%CI=1.0-2.7; p=0.03), are more common in female probands (p=0.02), are enriched among genes encoding FMRP targets (p=6×10−9), and arise predominantly on the paternal chromosome (p<0.001). Based on mutation rates in probands versus unaffected siblings, de novo frameshift indels contribute to risk in approximately 3.0% of individuals with ASD. Finally, through observing clustering of mutations in unrelated probands, we report two novel ASD-associated genes: KMT2E (MLL5), a chromatin regulator, and RIMS1, a regulator of synaptic vesicle release. PMID:25284784

  3. Enhancing performance of next generation FSO communication systems using soft computing-based predictions.

    Science.gov (United States)

    Kazaura, Kamugisha; Omae, Kazunori; Suzuki, Toshiji; Matsumoto, Mitsuji; Mutafungwa, Edward; Korhonen, Timo O; Murakami, Tadaaki; Takahashi, Koichi; Matsumoto, Hideki; Wakamori, Kazuhiko; Arimoto, Yoshinori

    2006-06-12

    The deterioration and deformation of a free-space optical beam wave-front as it propagates through the atmosphere can reduce the link availability and may introduce burst errors thus degrading the performance of the system. We investigate the suitability of utilizing soft-computing (SC) based tools for improving performance of free-space optical (FSO) communications systems. The SC based tools are used for the prediction of key parameters of a FSO communications system. Measured data collected from an experimental FSO communication system is used as training and testing data for a proposed multi-layer neural network predictor (MNNP) used to predict future parameter values. The predicted parameters are essential for reducing transmission errors by improving the antenna's accuracy of tracking data beams. This is particularly essential for periods considered to be of strong atmospheric turbulence. The parameter values predicted using the proposed tool show acceptable conformity with original measurements.

  4. Prediction of scour caused by 2D horizontal jets using soft computing techniques

    Directory of Open Access Journals (Sweden)

    Masoud Karbasi

    2017-12-01

    Full Text Available This paper presents the application of five soft-computing techniques, artificial neural networks, support vector regression, gene expression programming, grouping method of data handling (GMDH) neural network and adaptive-network-based fuzzy inference system, to predict maximum scour hole depth downstream of a sluice gate. The input parameters affecting the scour depth are the sediment size and its gradation, apron length, sluice gate opening, jet Froude number and the tail water depth. Six non-dimensional parameters were derived to define a functional relationship between the input and output variables. Published data from experimental studies were used. The results of the soft-computing techniques were compared with empirical and regression-based equations. The results obtained from the soft-computing techniques are superior to those of empirical and regression-based equations. Comparison of the soft-computing techniques showed that the accuracy of the ANN model is higher than that of the other models (RMSE = 0.869). A new GEP-based equation was proposed.

  5. How Adverse Outcome Pathways Can Aid the Development and Use of Computational Prediction Models for Regulatory Toxicology.

    Science.gov (United States)

    Wittwehr, Clemens; Aladjov, Hristo; Ankley, Gerald; Byrne, Hugh J; de Knecht, Joop; Heinzle, Elmar; Klambauer, Günter; Landesmann, Brigitte; Luijten, Mirjam; MacKay, Cameron; Maxwell, Gavin; Meek, M E Bette; Paini, Alicia; Perkins, Edward; Sobanski, Tomasz; Villeneuve, Dan; Waters, Katrina M; Whelan, Maurice

    2017-02-01

    Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science based on direct observation of apical toxicity outcomes in whole organism toxicity tests to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework provides a systematic approach for organizing knowledge that may support such inference. Likewise, computational models of biological systems at various scales provide another means and platform to integrate current biological understanding to facilitate inference and extrapolation. We argue that the systematic organization of knowledge into AOP frameworks can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. This concept was explored as part of a workshop on AOP-Informed Predictive Modeling Approaches for Regulatory Toxicology held September 24-25, 2015. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment, but also as guide for both AOP and complementary model development is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of computational prediction models needed to support the next generation of chemical safety assessment. © The Author 2016. Published by Oxford University Press on behalf of the Society of Toxicology.

  6. TRANSPORT OF PATIENTS FOR PRIMARY PTCA FROM GENERAL HOSPITAL NOVO MESTO TO LJUBLJANA IN 2002

    Directory of Open Access Journals (Sweden)

    Renata Okrajšek

    2004-12-01

    Full Text Available Background. The treatment of acute coronary syndrome (ACS) with ST-segment elevation by primary percutaneous transluminal coronary angioplasty (PTCA) is the best way to treat these patients. Primary PTCA is also practicable for patients who are admitted to an institution without a catheter laboratory. The transport of patients to the tertiary institution is safe, but it is important to keep the time of ischemia as short as possible and to reach the door-to-balloon time interval recommended by the guidelines. Since October 2001, ACS patients with ST-segment elevation who were directed to General Hospital Novo mesto after examination at the internistic emergency department have been redirected to KC Ljubljana for PTCA. Methods. A prospective analysis of patients with ACS with ST-segment elevation, who had been transferred from General Hospital Novo mesto to KC Ljubljana in the period from January 1, 2002 to December 31, 2002 to have a primary PTCA, was performed. The analysis comprised the following: the time interval of handling the patients at the internistic department of General Hospital Novo mesto, the time of transport of patients to Ljubljana, and the total time interval from the arrival of patients at General Hospital Novo mesto to the first inflation of the balloon in Ljubljana. We monitored the complications that occurred during the treatment of the patients. Results. In the above-mentioned period 29 patients (24 males and 5 females) were transported from General Hospital Novo mesto to KC Ljubljana to have a primary PTCA performed. The total time interval measured between the patients' arrival at General Hospital Novo mesto and the first inflation of the balloon in Ljubljana in the year 2002 was 145 minutes, which is 17 minutes better than in the previous period. The time interval recommended by the guidelines was achieved with four patients. Conclusions. By recognizing the problems that had been encountered with directing the

  7. Illumina-based de novo transcriptome sequencing and analysis

    Indian Academy of Sciences (India)

    In the present study, we used Illumina HiSeq technology to perform de novo assembly of heart and musk gland transcriptomes from the Chinese forest musk deer. A total of 239,383 transcripts and 176,450 unigenes were obtained, of which 37,329 unigenes were matched to known sequences in the NCBI nonredundant ...

  8. The benefit of non contrast-enhanced magnetic resonance angiography for predicting vascular access surgery outcome: a computer model perspective.

    Directory of Open Access Journals (Sweden)

    Maarten A G Merkx

    Full Text Available INTRODUCTION: Vascular access (VA) surgery, a prerequisite for hemodialysis treatment of end-stage renal-disease (ESRD) patients, is hampered by complication rates, which are frequently related to flow enhancement. To assist in VA surgery planning, a patient-specific computer model for postoperative flow enhancement was developed. The purpose of this study is to assess the benefit of non contrast-enhanced magnetic resonance angiography (NCE-MRA) data as patient-specific geometrical input for the model-based prediction of surgery outcome. METHODS: 25 ESRD patients were included in this study. All patients received an NCE-MRA examination of the upper extremity blood vessels in addition to routine ultrasound (US). Local arterial radii were assessed from NCE-MRA and converted to model input using a linear fit per artery. Venous radii were determined with US. The effect of radius measurement uncertainty on model predictions was accounted for by performing Monte-Carlo simulations. The resulting flow prediction interval of the computer model was compared with the postoperative flow obtained from US. Patients with no overlap between model-based prediction and postoperative measurement were further analyzed to determine whether an increase in geometrical detail improved computer model prediction. RESULTS: Overlap between postoperative flows and model-based predictions was obtained for 71% of patients. Detailed inspection of non-overlapping cases revealed that the geometrical details that could be assessed from NCE-MRA explained most of the differences, and moreover, upon addition of these details in the computer model the flow predictions improved. CONCLUSIONS: The results demonstrate clearly that NCE-MRA does provide valuable geometrical information for VA surgery planning. Therefore, it is recommended to use this modality, at least for patients at risk for local or global narrowing of the blood vessels as well as for patients for whom an US-based model
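
    The way radius-measurement uncertainty is turned into a flow prediction interval can be sketched with a generic Monte Carlo loop around a simple flow law; the Poiseuille-type relation, pressure drop and radius uncertainty below are stand-ins for the actual patient-specific computer model, chosen only so the resulting numbers are plausible.

```python
import numpy as np

def poiseuille_flow(radius_m, length_m=0.05, viscosity=3.5e-3, delta_p=465.0):
    """Volumetric flow (m^3/s) through a straight tube (Poiseuille's law).
    delta_p of 465 Pa is roughly 3.5 mmHg, an illustrative pressure drop."""
    return np.pi * radius_m**4 * delta_p / (8.0 * viscosity * length_m)

rng = np.random.default_rng(0)
n_samples = 10_000
# Measured arterial radius 2.0 mm with 0.2 mm (1 SD) measurement uncertainty.
radii = rng.normal(2.0e-3, 0.2e-3, n_samples)
flows_ml_min = poiseuille_flow(radii) * 1e6 * 60.0     # m^3/s -> mL/min

low, high = np.percentile(flows_ml_min, [2.5, 97.5])
print(f"Predicted flow interval: {low:.0f} to {high:.0f} mL/min")
# A postoperative ultrasound flow inside this interval would count as
# 'overlap' in the sense used by the study.
```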

  9. De novo biosynthesis of anthocyanins in Saccharomyces cerevisiae.

    Science.gov (United States)

    Eichenberger, Michael; Hansson, Anders; Fischer, David; Dürr, Lara; Naesby, Michael

    2018-06-01

    Anthocyanins (ACNs) are plant secondary metabolites responsible for most of the red, purple and blue colors of flowers, fruits and vegetables. They are increasingly used in the food and beverage industry as natural alternative to artificial colorants. Production of these compounds by fermentation of microorganisms would provide an attractive alternative. In this study, Saccharomyces cerevisiae was engineered for de novo production of the three basic anthocyanins, as well as the three main trans-flavan-3-ols. Enzymes from different plant sources were screened and efficient variants found for most steps of the biosynthetic pathway. However, the anthocyanidin synthase was identified as a major obstacle to efficient production. In yeast, this enzyme converts the majority of its natural substrates leucoanthocyanidins into the off-pathway flavonols. Nonetheless, de novo biosynthesis of ACNs was shown for the first time in yeast and for the first time in a single microorganism. It provides a framework for optimizing the activity of anthocyanidin synthase and represents an important step towards sustainable industrial production of these highly relevant molecules in yeast.

  10. Computational Prediction of Blood-Brain Barrier Permeability Using Decision Tree Induction

    Directory of Open Access Journals (Sweden)

    Jörg Huwyler

    2012-08-01

    Full Text Available Predicting blood-brain barrier (BBB) permeability is essential to drug development, as a molecule cannot exhibit pharmacological activity within the brain parenchyma without first transiting this barrier. Understanding the process of permeation, however, is complicated by a combination of both limited passive diffusion and active transport. Our aim here was to establish predictive models for BBB drug permeation that include both active and passive transport. A database of 153 compounds was compiled using in vivo surface permeability product (logPS) values in rats as a quantitative parameter for BBB permeability. The open source Chemical Development Kit (CDK) was used to calculate physico-chemical properties and descriptors. Predictive computational models were implemented by machine learning paradigms (decision tree induction) on both descriptor sets. Models with a corrected classification rate (CCR) of 90% were established. Mechanistic insight into BBB transport was provided by an Ant Colony Optimization (ACO)-based binary classifier analysis to identify the most predictive chemical substructures. Decision trees revealed descriptors of lipophilicity (aLogP) and charge (polar surface area), which were also previously described in models of passive diffusion. However, measures of molecular geometry and connectivity were found to be related to an active drug transport component.
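
    A hedged sketch of the modelling step, a decision tree trained on lipophilicity and polar-surface-area descriptors and scored by a corrected classification rate (taken here as class-balanced accuracy), is given below; the descriptor values and labelling rule are random stand-ins rather than the 153-compound logPS data set, and scikit-learn is assumed.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import balanced_accuracy_score

rng = np.random.default_rng(1)
n = 153
alogp = rng.normal(2.0, 1.5, n)          # lipophilicity descriptor (synthetic)
tpsa = rng.uniform(20.0, 140.0, n)       # polar surface area (synthetic)
X = np.column_stack([alogp, tpsa])
# Invented labelling rule: lipophilic, low-PSA compounds tend to permeate.
y = ((alogp > 1.0) & (tpsa < 90.0)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=1)
tree = DecisionTreeClassifier(max_depth=3, random_state=1).fit(X_tr, y_tr)

# 'Corrected classification rate' is approximated by class-balanced accuracy.
ccr = balanced_accuracy_score(y_te, tree.predict(X_te))
print(f"CCR = {ccr:.2f}")
```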

  11. Neural Network Optimization of Ligament Stiffnesses for the Enhanced Predictive Ability of a Patient-Specific, Computational Foot/Ankle Model.

    Science.gov (United States)

    Chande, Ruchi D; Wayne, Jennifer S

    2017-09-01

    Computational models of diarthrodial joints serve to inform the biomechanical function of these structures, and as such, must be supplied appropriate inputs for performance that is representative of actual joint function. Inputs for these models are sourced from both imaging modalities as well as literature. The latter is often the source of mechanical properties for soft tissues, like ligament stiffnesses; however, such data are not always available for all the soft tissues nor is it known for patient-specific work. In the current research, a method to improve the ligament stiffness definition for a computational foot/ankle model was sought with the greater goal of improving the predictive ability of the computational model. Specifically, the stiffness values were optimized using artificial neural networks (ANNs); both feedforward and radial basis function networks (RBFNs) were considered. Optimal networks of each type were determined and subsequently used to predict stiffnesses for the foot/ankle model. Ultimately, the predicted stiffnesses were considered reasonable and resulted in enhanced performance of the computational model, suggesting that artificial neural networks can be used to optimize stiffness inputs.

  12. Uridine monophosphate synthetase enables eukaryotic de novo NAD+ biosynthesis from quinolinic acid.

    Science.gov (United States)

    McReynolds, Melanie R; Wang, Wenqing; Holleran, Lauren M; Hanna-Rose, Wendy

    2017-07-07

    NAD+ biosynthesis is an attractive and promising therapeutic target for influencing health span and obesity-related phenotypes as well as tumor growth. Full and effective use of this target for therapeutic benefit requires a complete understanding of NAD+ biosynthetic pathways. Here, we report a previously unrecognized role for a conserved phosphoribosyltransferase in NAD+ biosynthesis. Because a required quinolinic acid phosphoribosyltransferase (QPRTase) is not encoded in its genome, Caenorhabditis elegans are reported to lack a de novo NAD+ biosynthetic pathway. However, all the genes of the kynurenine pathway required for quinolinic acid (QA) production from tryptophan are present. Thus, we investigated the presence of de novo NAD+ biosynthesis in this organism. By combining isotope-tracing and genetic experiments, we have demonstrated the presence of an intact de novo biosynthesis pathway for NAD+ from tryptophan via QA, highlighting the functional conservation of this important biosynthetic activity. Supplementation with kynurenine pathway intermediates also boosted NAD+ levels and partially reversed NAD+-dependent phenotypes caused by mutation of pnc-1, which encodes a nicotinamidase required for NAD+ salvage biosynthesis, demonstrating contribution of de novo synthesis to NAD+ homeostasis. By investigating candidate phosphoribosyltransferase genes in the genome, we determined that the conserved uridine monophosphate phosphoribosyltransferase (UMPS), which acts in pyrimidine biosynthesis, is required for NAD+ biosynthesis in place of the missing QPRTase. We suggest that similar underground metabolic activity of UMPS may function in other organisms. This mechanism for NAD+ biosynthesis creates novel possibilities for manipulating NAD+ biosynthetic pathways, which is key for the future of therapeutics. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.

  13. Complex data modeling and computationally intensive methods for estimation and prediction

    CERN Document Server

    Secchi, Piercesare; Advances in Complex Data Modeling and Computational Methods in Statistics

    2015-01-01

    The book is addressed to statisticians working at the forefront of the statistical analysis of complex and high dimensional data and offers a wide variety of statistical models, computer intensive methods and applications: network inference from the analysis of high dimensional data; new developments for bootstrapping complex data; regression analysis for measuring the downsize reputational risk; statistical methods for research on the human genome dynamics; inference in non-euclidean settings and for shape data; Bayesian methods for reliability and the analysis of complex data; methodological issues in using administrative data for clinical and epidemiological research; regression models with differential regularization; geostatistical methods for mobility analysis through mobile phone data exploration. This volume is the result of a careful selection among the contributions presented at the conference "S.Co.2013: Complex data modeling and computationally intensive methods for estimation and prediction" held...

  14. Predictive Capability Maturity Model for computational modeling and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2007-10-01

    The Predictive Capability Maturity Model (PCMM) is a new model that can be used to assess the level of maturity of computational modeling and simulation (M&S) efforts. The development of the model is based on both the authors' experience and their analysis of similar investigations in the past. The perspective taken in this report is one of judging the usefulness of a predictive capability that relies on the numerical solution to partial differential equations to better inform and improve decision making. The review of past investigations, such as the Software Engineering Institute's Capability Maturity Model Integration and the National Aeronautics and Space Administration and Department of Defense Technology Readiness Levels, indicates that a more restricted, more interpretable method is needed to assess the maturity of an M&S effort. The PCMM addresses six contributing elements to M&S: (1) representation and geometric fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) model validation, and (6) uncertainty quantification and sensitivity analysis. For each of these elements, attributes are identified that characterize four increasing levels of maturity. Importantly, the PCMM is a structured method for assessing the maturity of an M&S effort that is directed toward an engineering application of interest. The PCMM does not assess whether the M&S effort, the accuracy of the predictions, or the performance of the engineering system satisfies or does not satisfy specified application requirements.

  15. An experimental study on prediction of gallstone composition by ultrasonography and computed tomography

    International Nuclear Information System (INIS)

    Lee, Jong Beum; Chung, Sae Yul; Kim, Kun Sang; Lee, Yong Chul; Han, Man Chung; Kim, Jin Kyu

    1992-01-01

    Prediction of the chemical composition of gallstones is a prerequisite in contemplating the chemical dissolution or extracorporeal shock wave lithotripsy of gallstones. The authors retrospectively analysed the correlation between the quantitative chemical composition of gallstones and their ultrasonographic and computed tomographic findings. Ultrasonography (US) and computed tomography (CT) of 100 consecutive stones obtained from 100 patients were performed under in vitro conditions. Their US and CT findings were grouped by pattern and each group was compared with the chemical composition of the stones. Stones with an entirely discernible circumference and homogeneous internal echo on US had high bilirubin and low cholesterol content. Acoustic shadows were frequently absent with those stones. Stones with variable internal echo on US had relatively high cholesterol content, but their distribution range was wide. There was no correlation between the cholesterol content and the CT number of the gallstones. There was a positive correlation between the calcium content and the CT number of gallstones. The near totally calcified gallstones had very low cholesterol and high residue content. There was no relationship between the calcification type and the ultrasonographic pattern. In conclusion, stones with an entirely discernible circumference and homogeneous internal echo on US were pigment stones. On the contrary, stones with variable internal echo had relatively high cholesterol content. CT could predict the calcium content from the CT number, but could not predict the cholesterol content

  16. Prediction of infarction development after endovascular stroke therapy with dual-energy computed tomography.

    Science.gov (United States)

    Djurdjevic, Tanja; Rehwald, Rafael; Knoflach, Michael; Matosevic, Benjamin; Kiechl, Stefan; Gizewski, Elke Ruth; Glodny, Bernhard; Grams, Astrid Ellen

    2017-03-01

    After intraarterial recanalisation (IAR), haemorrhage and blood-brain barrier (BBB) disruption can be distinguished using dual-energy computed tomography (DECT). The aim of the present study was to investigate whether future infarction development can be predicted from DECT. DECT scans of 20 patients showing 45 BBB-disrupted areas after IAR were assessed and compared with follow-up examinations. Receiver operator characteristic (ROC) analyses using densities from the iodine map (IM) and virtual non-contrast (VNC) series were performed. Future infarction areas are denser than future non-infarction areas on the IM series (23.44 ± 24.86 vs. 5.77 ± 2.77; p < …) and less dense on the VNC series (29.71 ± 3.33 vs. 35.33 ± 3.50; p < …). ROC analysis yielded an IM density cut-off of 17.13 HU (p < …), and the VNC series allowed prediction of infarction volume. Future infarction development after IAR can be reliably predicted with the IM series. The prediction of haemorrhages and of infarction size is less reliable. • The IM series (DECT) can predict future infarction development after IAR. • Later haemorrhages can be predicted using the IM and the BW series. • The volume of definable hypodense areas in VNC correlates with infarction volume.

  17. Associations between Familial Rates of Psychiatric Disorders and De Novo Genetic Mutations in Autism

    Directory of Open Access Journals (Sweden)

    Kyleen Luhrs

    2017-01-01

    Full Text Available The purpose of this study was to examine the confluence of genetic and familial risk factors in children with Autism Spectrum Disorder (ASD) with distinct de novo genetic events. We hypothesized that gene-disrupting mutations would be associated with reduced rates of familial psychiatric disorders relative to structural mutations. Participants included families of children with ASD in four groups: de novo duplication copy number variations (DUP, n=62), de novo deletion copy number variations (DEL, n=74), de novo likely gene-disrupting mutations (LGDM, n=267), and children without a known genetic etiology (NON, n=2111). Familial rates of psychiatric disorders were calculated from semistructured interviews. Results indicated overall increased rates of psychiatric disorders in DUP families compared to DEL and LGDM families, specific to paternal psychiatric histories, and particularly evident for depressive disorders. Higher rates of depressive disorders in maternal psychiatric histories were observed overall compared to paternal histories, and higher rates of anxiety disorders were observed in paternal histories for LGDM families compared to DUP families. These findings support the notion that genetic etiology and familial factors make additive contributions to ASD risk and highlight the critical need for continued work targeting these relationships.

  18. PREDICTING ATTENUATION OF VIRUSES DURING PERCOLATION IN SOILS: 2. USER'S GUIDE TO THE VIRULO 1.0 COMPUTER MODEL

    Science.gov (United States)

    In the EPA document Predicting Attenuation of Viruses During Percolation in Soils 1. Probabilistic Model, the conceptual, theoretical, and mathematical foundations for a predictive screening model were presented. In this current volume we present a User's Guide for the computer mo...

  19. The nature and use of prediction skills in a biological computer simulation

    Science.gov (United States)

    Lavoie, Derrick R.; Good, Ron

    The primary goal of this study was to examine the science process skill of prediction using qualitative research methodology. The think-aloud interview, modeled after Ericsson and Simon (1984), led to the identification of 63 program exploration and prediction behaviors. The performance of seven formal and seven concrete operational high-school biology students was videotaped during a three-phase learning sequence on water pollution. Subjects explored the effects of five independent variables on two dependent variables over time using a computer-simulation program. Predictions were made concerning the effect of the independent variables upon dependent variables through time. Subjects were identified according to initial knowledge of the subject matter and success at solving three selected prediction problems. Successful predictors generally had high initial knowledge of the subject matter and were formal operational. Unsuccessful predictors generally had low initial knowledge and were concrete operational. High initial knowledge seemed to be more important to predictive success than stage of Piagetian cognitive development. Successful prediction behaviors involved systematic manipulation of the independent variables, note taking, identification and use of appropriate independent-dependent variable relationships, high interest and motivation, and in general, higher-level thinking skills. Behaviors characteristic of unsuccessful predictors were nonsystematic manipulation of independent variables, lack of motivation and persistence, misconceptions, and the identification and use of inappropriate independent-dependent variable relationships.

  20. Sequencing and de novo assembly of 150 genomes from Denmark as a population reference

    DEFF Research Database (Denmark)

    Maretty, Lasse; Jensen, Jacob Malte; Petersen, Bent

    2017-01-01

    Hundreds of thousands of human genomes are now being sequenced to characterize genetic variation and use this information to augment association mapping studies of complex disorders and other phenotypic traits. Genetic variation is identified mainly by mapping short reads to the reference genome or by performing local assembly. However, these approaches are biased against discovery of structural variants and variation in the more complex parts of the genome. Hence, large-scale de novo assembly is needed. Here we show that it is possible to construct excellent de novo assemblies from high-coverage sequencing with mate-pair libraries extending up to 20 kilobases. We report de novo assemblies of 150 individuals (50 trios) from the GenomeDenmark project. The quality of these assemblies is similar to those obtained using the more expensive long-read technology. We use the assemblies to identify a rich set...

  2. De novo synthesis of purine nucleotides in different fiber types of rat skeletal muscle

    International Nuclear Information System (INIS)

    Tullson, P.C.; John-Alder, H.; Hood, D.A.; Terjung, R.L.

    1986-01-01

    The contribution of de novo purine nucleotide synthesis to nucleotide metabolism in skeletal muscles is not known. The authors have determined rates of de novo synthesis in soleus (slow-twitch red), red gastrocnemius (fast-twitch red), and white gastrocnemius (fast-twitch white) using the perfused rat hindquarter. 14C-glycine incorporation into ATP was linear after 1 and 2 hours of perfusion with 0.2 mM added glycine. The intracellular (I) and extracellular (E) specific activity of 14C-glycine was determined by HPLC of phenylisothiocyanate derivatives of neutralized PCA extracts. The rates of de novo synthesis, when expressed relative to muscle ATP content, show slow and fast-twitch red muscles to be similar and about twice as great as fast-twitch white muscles. This could represent a greater turnover of the adenine nucleotide pool in more oxidative red muscle types

  3. Modification in the FUDA computer code to predict fuel performance at high burnup

    Energy Technology Data Exchange (ETDEWEB)

    Das, M; Arunakumar, B V; Prasad, P N [Nuclear Power Corp., Mumbai (India)

    1997-08-01

    The computer code FUDA (FUel Design Analysis) participated in the blind exercises organized by the IAEA CRP (Co-ordinated Research Programme) on FUMEX (Fuel Modelling at Extended Burnup). While the code predictions compared well with the experiments at Halden under various parametric and operating conditions, the fission gas release and fission gas pressure were found to be slightly over-predicted, particularly at high burnups. In view of the results of the 6 FUMEX cases, the main models and submodels of the code were reviewed and necessary improvements were made. The new version of the code, FUDA MOD 2, is now able to predict fuel performance parameters for burn-ups up to 50000 MWD/TeU. The validation field of the code has been extended to prediction of thorium oxide fuel performance. An analysis of local deformations at pellet interfaces and near the end caps is carried out considering the hourglassing of the pellet by the finite element technique. (author). 15 refs, 1 fig.

  4. Modification in the FUDA computer code to predict fuel performance at high burnup

    International Nuclear Information System (INIS)

    Das, M.; Arunakumar, B.V.; Prasad, P.N.

    1997-01-01

    The computer code FUDA (FUel Design Analysis) participated in the blind exercises organized by the IAEA CRP (Co-ordinated Research Programme) on FUMEX (Fuel Modelling at Extended Burnup). While the code predictions compared well with the experiments at Halden under various parametric and operating conditions, the fission gas release and fission gas pressure were found to be slightly over-predicted, particularly at high burnups. In view of the results of the 6 FUMEX cases, the main models and submodels of the code were reviewed and necessary improvements were made. The new version of the code, FUDA MOD 2, is now able to predict fuel performance parameters for burn-ups up to 50000 MWD/TeU. The validation field of the code has been extended to prediction of thorium oxide fuel performance. An analysis of local deformations at pellet interfaces and near the end caps is carried out considering the hourglassing of the pellet by the finite element technique. (author). 15 refs, 1 fig

  5. De novo point mutations in patients diagnosed with ataxic cerebral palsy.

    Science.gov (United States)

    Parolin Schnekenberg, Ricardo; Perkins, Emma M; Miller, Jack W; Davies, Wayne I L; D'Adamo, Maria Cristina; Pessia, Mauro; Fawcett, Katherine A; Sims, David; Gillard, Elodie; Hudspith, Karl; Skehel, Paul; Williams, Jonathan; O'Regan, Mary; Jayawant, Sandeep; Jefferson, Rosalind; Hughes, Sarah; Lustenberger, Andrea; Ragoussis, Jiannis; Jackson, Mandy; Tucker, Stephen J; Németh, Andrea H

    2015-07-01

    Cerebral palsy is a sporadic disorder with multiple likely aetiologies, but frequently considered to be caused by birth asphyxia. Genetic investigations are rarely performed in patients with cerebral palsy and there is little proven evidence of genetic causes. As part of a large project investigating children with ataxia, we identified four patients in our cohort with a diagnosis of ataxic cerebral palsy. They were investigated using either targeted next generation sequencing or trio-based exome sequencing and were found to have mutations in three different genes, KCNC3, ITPR1 and SPTBN2. All the mutations were de novo and associated with increased paternal age. The mutations were shown to be pathogenic using a combination of bioinformatics analysis and in vitro model systems. This work is the first to report that the ataxic subtype of cerebral palsy can be caused by de novo dominant point mutations, which explains the sporadic nature of these cases. We conclude that at least some subtypes of cerebral palsy may be caused by de novo genetic mutations and patients with a clinical diagnosis of cerebral palsy should be genetically investigated before causation is ascribed to perinatal asphyxia or other aetiologies. © The Author (2015). Published by Oxford University Press on behalf of the Guarantors of Brain.

  6. Model-Based GUI Testing Using Uppaal at Novo Nordisk

    DEFF Research Database (Denmark)

    H. Hjort, Ulrik; Rasmussen, Jacob Illum; Larsen, Kim Guldstrand

    2009-01-01

    This paper details a collaboration between Aalborg University and Novo Nordisk in developing an automatic model-based test generation tool for system testing of the graphical user interface of a medical device on an embedded platform. The tool takes as input a UML Statemachine model and generates...

  7. Towards accurate de novo assembly for genomes with repeats

    NARCIS (Netherlands)

    Bucur, Doina

    2017-01-01

    De novo genome assemblers designed for short k-mer length or using short raw reads are unlikely to recover complex features of the underlying genome, such as repeats hundreds of bases long. We implement a stochastic machine-learning method which obtains accurate assemblies with repeats and

  8. Engineering and introduction of de novo disulphide bridges in ...

    Indian Academy of Sciences (India)

    The engineering of de novo disulphide bridges has been explored as a means to increase the thermal stability of enzymes in the rational method of protein engineering. In this study, Disulphide by Design software, homology modelling and molecular dynamics simulations were used to select appropriate amino acid pairs for ...

  9. Robust de novo pathway enrichment with KeyPathwayMiner 5 [version 1; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Nicolas Alcaraz

    2016-06-01

    Full Text Available Identifying functional modules or novel active pathways, recently termed de novo pathway enrichment, is a computational systems biology challenge that has gained much attention during the last decade. Given a large biological interaction network, KeyPathwayMiner extracts connected subnetworks that are enriched for differentially active entities from a series of molecular profiles encoded as binary indicator matrices. Since interaction networks constantly evolve, an important question is how robust the extracted results are when the network is modified. We enable users to study this effect through several network perturbation techniques and over a range of perturbation degrees. In addition, users may now provide a gold-standard set to determine how enriched extracted pathways are with relevant genes compared to randomized versions of the original network.

  10. Applying a new computer-aided detection scheme generated imaging marker to predict short-term breast cancer risk

    Science.gov (United States)

    Mirniaharikandehei, Seyedehnafiseh; Hollingsworth, Alan B.; Patel, Bhavika; Heidari, Morteza; Liu, Hong; Zheng, Bin

    2018-05-01

    This study aims to investigate the feasibility of identifying a new quantitative imaging marker based on false-positives generated by a computer-aided detection (CAD) scheme to help predict short-term breast cancer risk. An image dataset including four view mammograms acquired from 1044 women was retrospectively assembled. All mammograms were originally interpreted as negative by radiologists. In the next subsequent mammography screening, 402 women were diagnosed with breast cancer and 642 remained negative. An existing CAD scheme was applied ‘as is’ to process each image. From CAD-generated results, four detection features including the total number of (1) initial detection seeds and (2) the final detected false-positive regions, (3) average and (4) sum of detection scores, were computed from each image. Then, by combining the features computed from two bilateral images of left and right breasts from either craniocaudal or mediolateral oblique view, two logistic regression models were trained and tested using a leave-one-case-out cross-validation method to predict the likelihood of each testing case being positive in the next subsequent screening. The new prediction model yielded the maximum prediction accuracy with an area under a ROC curve of AUC  =  0.65  ±  0.017 and the maximum adjusted odds ratio of 4.49 with a 95% confidence interval of (2.95, 6.83). The results also showed an increasing trend in the adjusted odds ratio and risk prediction scores (p  breast cancer risk.
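
    To make the modelling step above concrete, the sketch below fits a logistic regression to four hypothetical CAD-derived features per case and estimates discrimination with leave-one-case-out cross-validation, as in the study design; the feature layout and synthetic data are assumptions for illustration, not the authors' CAD scheme or mammography dataset.

    # Illustrative sketch only: leave-one-case-out logistic regression on
    # hypothetical CAD-derived features (not the authors' CAD scheme or data).
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import LeaveOneOut, cross_val_predict
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    n_cases = 200
    # Four assumed per-case features combined bilaterally: seed count,
    # false-positive region count, mean detection score, summed score.
    X = rng.normal(size=(n_cases, 4))
    y = rng.integers(0, 2, size=n_cases)          # 1 = cancer at next screening

    model = LogisticRegression(max_iter=1000)
    # Leave-one-case-out: each case is scored by a model trained on all others.
    scores = cross_val_predict(model, X, y, cv=LeaveOneOut(),
                               method="predict_proba")[:, 1]
    print("LOOCV AUC:", roc_auc_score(y, scores))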

  11. How Adverse Outcome Pathways Can Aid the Development and Use of Computational Prediction Models for Regulatory Toxicology

    Energy Technology Data Exchange (ETDEWEB)

    Wittwehr, Clemens; Aladjov, Hristo; Ankley, Gerald; Byrne, Hugh J.; de Knecht, Joop; Heinzle, Elmar; Klambauer, Günter; Landesmann, Brigitte; Luijten, Mirjam; MacKay, Cameron; Maxwell, Gavin; Meek, M. E. (Bette); Paini, Alicia; Perkins, Edward; Sobanski, Tomasz; Villeneuve, Dan; Waters, Katrina M.; Whelan, Maurice

    2016-12-19

    Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science based on direct observation of apical toxicity outcomes in whole organism toxicity tests to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework has emerged as a systematic approach for organizing knowledge that supports such inference. We argue that this systematic organization of knowledge can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment, but also as a guide for both AOP and complementary model development, is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of computational prediction models needed to support the next generation of chemical safety assessment.

  12. Combining on-chip synthesis of a focused combinatorial library with computational target prediction reveals imidazopyridine GPCR ligands.

    Science.gov (United States)

    Reutlinger, Michael; Rodrigues, Tiago; Schneider, Petra; Schneider, Gisbert

    2014-01-07

    Using the example of the Ugi three-component reaction we report a fast and efficient microfluidic-assisted entry into the imidazopyridine scaffold, where building block prioritization was coupled to a new computational method for predicting ligand-target associations. We identified an innovative GPCR-modulating combinatorial chemotype featuring ligand-efficient adenosine A1/2B and adrenergic α1A/B receptor antagonists. Our results suggest the tight integration of microfluidics-assisted synthesis with computer-based target prediction as a viable approach to rapidly generate bioactivity-focused combinatorial compound libraries with high success rates. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. The prediction in computer color matching of dentistry based on GA+BP neural network.

    Science.gov (United States)

    Li, Haisheng; Lai, Long; Chen, Li; Lu, Cheng; Cai, Qiang

    2015-01-01

    Although the use of computer color matching can reduce the influence of subjective factors by technicians, matching the color of a natural tooth with a ceramic restoration is still one of the most challenging topics in esthetic prosthodontics. The back propagation neural network (BPNN) has already been introduced into computer color matching in dentistry, but it has disadvantages such as instability and low accuracy. In our study, we adopt a genetic algorithm (GA) to optimize the initial weights and threshold values of the BPNN to improve matching precision. To our knowledge, this is the first combination of a BPNN with a GA for computer color matching in dentistry. Extensive experiments demonstrate that the proposed method improves the precision and prediction robustness of the color matching in restorative dentistry.
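
    The sketch below illustrates the general GA-plus-backpropagation idea on toy data: a genetic algorithm searches over initial weight vectors of a small feed-forward network, and each candidate is scored by the error reached after a short run of backpropagation. The network size, colour encoding and all data are assumptions, not the authors' implementation.

    # Minimal sketch of GA-initialised backpropagation for colour matching,
    # assuming toy data (3 colour coordinates in, 3 recipe fractions out).
    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.random((80, 3))                     # e.g. scaled tooth colour coordinates
    Y = rng.random((80, 3))                     # e.g. ceramic recipe proportions
    n_in, n_hid, n_out = 3, 8, 3
    n_w = n_in * n_hid + n_hid + n_hid * n_out + n_out

    def unpack(w):
        i = 0
        W1 = w[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
        b1 = w[i:i + n_hid]; i += n_hid
        W2 = w[i:i + n_hid * n_out].reshape(n_hid, n_out); i += n_hid * n_out
        b2 = w[i:]
        return W1, b1, W2, b2

    def train_and_score(w, epochs=50, lr=0.1):
        """Run a few epochs of plain backpropagation, return final MSE."""
        W1, b1, W2, b2 = (a.copy() for a in unpack(w))
        for _ in range(epochs):
            H = np.tanh(X @ W1 + b1)
            P = H @ W2 + b2
            err = P - Y
            # Gradients of the mean squared error.
            gW2 = H.T @ err / len(X); gb2 = err.mean(0)
            dH = (err @ W2.T) * (1 - H ** 2)
            gW1 = X.T @ dH / len(X); gb1 = dH.mean(0)
            W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2
        return float((err ** 2).mean())

    # Genetic algorithm over initial weight vectors (truncation selection + mutation).
    pop = rng.normal(scale=0.5, size=(20, n_w))
    for gen in range(10):
        fitness = np.array([train_and_score(w) for w in pop])
        parents = pop[np.argsort(fitness)[:5]]
        children = parents[rng.integers(0, 5, 15)] + rng.normal(scale=0.05, size=(15, n_w))
        pop = np.vstack([parents, children])
    print("best MSE after GA+BP:", min(train_and_score(w) for w in pop))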

  14. Infant Mortality in Novo Hamburgo: Associated Factors and Cardiovascular Causes

    Directory of Open Access Journals (Sweden)

    Camila de Andrade Brum

    2015-04-01

    Full Text Available Background: Infant mortality has decreased in Brazil, but remains high as compared to that of other developing countries. In 2010, the Rio Grande do Sul state had the lowest infant mortality rate in Brazil. However, the municipality of Novo Hamburgo had the highest infant mortality rate in the Porto Alegre metropolitan region. Objective: To describe the causes of infant mortality in the municipality of Novo Hamburgo from 2007 to 2010, identifying which causes were related to heart diseases and if they were diagnosed in the prenatal period, and to assess the access to healthcare services. Methods: This study assessed infants of the municipality of Novo Hamburgo, who died, and whose data were collected from the infant death investigation records. Results: Of the 157 deaths in that period, 35.3% were reducible through diagnosis and early treatment, 25% were reducible through partnership with other sectors, 19.2% were non-preventable, 11.5% were reducible by means of appropriate pregnancy monitoring, 5.1% were reducible through appropriate delivery care, and 3.8% were ill defined. The major cause of death related to heart disease (13.4%), which was significantly associated with the variables ‘age at death’, ‘gestational age’ and ‘birth weight’. Regarding access to healthcare services, 60.9% of the pregnant women had a maximum of six prenatal visits. Conclusion: It is mandatory to enhance prenatal care and newborn care at hospitals and basic healthcare units to prevent infant mortality.

  15. Infant Mortality in Novo Hamburgo: Associated Factors and Cardiovascular Causes

    Energy Technology Data Exchange (ETDEWEB)

    Brum, Camila de Andrade [Instituto de Cardiologia/Fundação Universitária de Cardiologia (IC/FUC), Porto Alegre, RS (Brazil); Stein, Airton Tetelbom [Universidade Federal de Ciências da Saúde de Porto Alegre (UFCSPA), Porto Alegre, RS (Brazil); Grupo Hospitalar Conceição (GHC), Porto Alegre, RS (Brazil); Universidade Luterana do Brasil (ULBRA), Porto Alegre, RS (Brazil); Pellanda, Lucia Campos, E-mail: luciapell.pesquisa@cardiologia.org.br [Instituto de Cardiologia/Fundação Universitária de Cardiologia (IC/FUC), Porto Alegre, RS (Brazil); Universidade Federal de Ciências da Saúde de Porto Alegre (UFCSPA), Porto Alegre, RS (Brazil)

    2015-04-15

    Infant mortality has decreased in Brazil, but remains high as compared to that of other developing countries. In 2010, the Rio Grande do Sul state had the lowest infant mortality rate in Brazil. However, the municipality of Novo Hamburgo had the highest infant mortality rate in the Porto Alegre metropolitan region. To describe the causes of infant mortality in the municipality of Novo Hamburgo from 2007 to 2010, identifying which causes were related to heart diseases and if they were diagnosed in the prenatal period, and to assess the access to healthcare services. This study assessed infants of the municipality of Novo Hamburgo, who died, and whose data were collected from the infant death investigation records. Of the 157 deaths in that period, 35.3% were reducible through diagnosis and early treatment, 25% were reducible through partnership with other sectors, 19.2% were non-preventable, 11.5% were reducible by means of appropriate pregnancy monitoring, 5.1% were reducible through appropriate delivery care, and 3.8% were ill defined. The major cause of death related to heart disease (13.4%), which was significantly associated with the variables ‘age at death’, ‘gestational age’ and ‘birth weight’. Regarding access to healthcare services, 60.9% of the pregnant women had a maximum of six prenatal visits. It is mandatory to enhance prenatal care and newborn care at hospitals and basic healthcare units to prevent infant mortality.

  16. Infant Mortality in Novo Hamburgo: Associated Factors and Cardiovascular Causes

    International Nuclear Information System (INIS)

    Brum, Camila de Andrade; Stein, Airton Tetelbom; Pellanda, Lucia Campos

    2015-01-01

    Infant mortality has decreased in Brazil, but remains high as compared to that of other developing countries. In 2010, the Rio Grande do Sul state had the lowest infant mortality rate in Brazil. However, the municipality of Novo Hamburgo had the highest infant mortality rate in the Porto Alegre metropolitan region. To describe the causes of infant mortality in the municipality of Novo Hamburgo from 2007 to 2010, identifying which causes were related to heart diseases and if they were diagnosed in the prenatal period, and to assess the access to healthcare services. This study assessed infants of the municipality of Novo Hamburgo, who died, and whose data were collected from the infant death investigation records. Of the 157 deaths in that period, 35.3% were reducible through diagnosis and early treatment, 25% were reducible through partnership with other sectors, 19.2% were non-preventable, 11.5% were reducible by means of appropriate pregnancy monitoring, 5.1% were reducible through appropriate delivery care, and 3.8% were ill defined. The major cause of death related to heart disease (13.4%), which was significantly associated with the variables ‘age at death’, ‘gestational age’ and ‘birth weight’. Regarding access to healthcare services, 60.9% of the pregnant women had a maximum of six prenatal visits. It is mandatory to enhance prenatal care and newborn care at hospitals and basic healthcare units to prevent infant mortality

  17. Heterologous aggregates promote de novo prion appearance via more than one mechanism.

    Directory of Open Access Journals (Sweden)

    Fatih Arslan

    2015-01-01

    Full Text Available Prions are self-perpetuating conformational variants of particular proteins. In yeast, prions cause heritable phenotypic traits. Most known yeast prions contain a glutamine (Q/asparagine (N-rich region in their prion domains. [PSI+], the prion form of Sup35, appears de novo at dramatically enhanced rates following transient overproduction of Sup35 in the presence of [PIN+], the prion form of Rnq1. Here, we establish the temporal de novo appearance of Sup35 aggregates during such overexpression in relation to other cellular proteins. Fluorescently-labeled Sup35 initially forms one or a few dots when overexpressed in [PIN+] cells. One of the dots is perivacuolar, colocalizes with the aggregated Rnq1 dot and grows into peripheral rings/lines, some of which also colocalize with Rnq1. Sup35 dots that are not near the vacuole do not always colocalize with Rnq1 and disappear by the time rings start to grow. Bimolecular fluorescence complementation failed to detect any interaction between Sup35-VN and Rnq1-VC in [PSI+][PIN+] cells. In contrast, all Sup35 aggregates, whether newly induced or in established [PSI+], completely colocalize with the molecular chaperones Hsp104, Sis1, Ssa1 and eukaryotic release factor Sup45. In the absence of [PIN+], overexpressed aggregating proteins such as the Q/N-rich Pin4C or the non-Q/N-rich Mod5 can also promote the de novo appearance of [PSI+]. Similar to Rnq1, overexpressed Pin4C transiently colocalizes with newly appearing Sup35 aggregates. However, no interaction was detected between Mod5 and Sup35 during [PSI+] induction in the absence of [PIN+]. While the colocalization of Sup35 and aggregates of Rnq1 or Pin4C are consistent with the model that the heterologous aggregates cross-seed the de novo appearance of [PSI+], the lack of interaction between Mod5 and Sup35 leaves open the possibility of other mechanisms. We also show that Hsp104 is required in the de novo appearance of [PSI+] aggregates in a [PIN

  18. A comparison of computational models with and without genotyping for prediction of response to second-line HIV therapy

    NARCIS (Netherlands)

    Revell, A. D.; Boyd, M. A.; Wang, D.; Emery, S.; Gazzard, B.; Reiss, P.; van Sighem, A. I.; Montaner, J. S.; Lane, H. C.; Larder, B. A.

    2014-01-01

    We compared the use of computational models developed with and without HIV genotype vs. genotyping itself to predict effective regimens for patients experiencing first-line virological failure. Two sets of models predicted virological response for 99 three-drug regimens for patients on a failing

  19. Whole-Genome de novo Sequencing Of Quail And Grey Partridge

    DEFF Research Database (Denmark)

    Holm, Lars-Erik; Panitz, Frank; Burt, Dave

    2011-01-01

    The development in sequencing methods has made it possible to perform whole genome de novo sequencing of species without large commercial interests. Within the EU-financed QUANTOMICS project (KBBE-2A-222664), we have performed de novo sequencing of quail (Coturnix coturnix) and grey partridge...... (Perdix perdix) on a Genome Analyzer GAII (Illumina) using paired-end sequencing. The amount of generated sequences amounts to 8 to 9 Gb for each species. The analysis and assembly of the generated sequences is ongoing. Access to the whole genome sequence from these two species will enable enhanced...... comparative studies towards the chicken genome and will aid in identifying evolutionarily conserved sequences within the Galliformes. The obtained sequences from quail and partridge represent a beginning of generating the whole genome sequence for these species. The continuation of establishing the genome...

  20. De novo assembly of plant body plan: a step ahead of Deadpool.

    Science.gov (United States)

    Kareem, Abdul; Radhakrishnan, Dhanya; Sondhi, Yash; Aiyaz, Mohammed; Roy, Merin V; Sugimoto, Kaoru; Prasad, Kalika

    2016-08-01

    While in the movie Deadpool it is possible for a human to recreate an arm from scratch, in reality plants can even surpass that. Not only can they regenerate lost parts, but also the whole plant body can be reborn from a few existing cells. Despite the decades old realization that plant cells possess the ability to regenerate a complete shoot and root system, it is only now that the underlying mechanisms are being unraveled. De novo plant regeneration involves the initiation of regenerative mass, acquisition of the pluripotent state, reconstitution of stem cells and assembly of regulatory interactions. Recent studies have furthered our understanding on the making of a complete plant system in the absence of embryonic positional cues. We review the recent studies probing the molecular mechanisms of de novo plant regeneration in response to external inductive cues and our current knowledge of direct reprogramming of root to shoot and vice versa. We further discuss how de novo regeneration can be exploited to meet the demands of green culture industries and to serve as a general model to address the fundamental questions of regeneration across the plant kingdom.

  1. Functional tuning of the catalytic residue pKa in a de novo designed esterase.

    Science.gov (United States)

    Hiebler, Katharina; Lengyel, Zsófia; Castañeda, Carlos A; Makhlynets, Olga V

    2017-09-01

    AlleyCatE is a de novo designed esterase that can be allosterically regulated by calcium ions. This artificial enzyme has been shown to hydrolyze p-nitrophenyl acetate (pNPA) and 4-nitrophenyl-(2-phenyl)-propanoate (pNPP) with high catalytic efficiency. AlleyCatE was created by introducing a single-histidine residue (His144) into a hydrophobic pocket of calmodulin. In this work, we explore the determinants of catalytic properties of AlleyCatE. We obtained the pKa value of the catalytic histidine using experimental measurements by NMR and pH rate profile and compared these values to those predicted from electrostatics pKa calculations (from both empirical and continuum electrostatics calculations). Surprisingly, the pKa value of the catalytic histidine inside the hydrophobic pocket of calmodulin is elevated as compared to the model compound pKa value of this residue in water. We determined that a short-range favorable interaction with Glu127 contributes to the elevated pKa of His144. We have rationally modulated local electrostatic potential in AlleyCatE to decrease the pKa of its active nucleophile, His144, by 0.7 units. As a direct result of the decrease in the His144 pKa value, catalytic efficiency of the enzyme increased by 45% at pH 6. This work shows that a series of simple NMR experiments that can be performed using low field spectrometers, combined with straightforward computational analysis, provide rapid and accurate guidance to rationally improve catalytic efficiency of histidine-promoted catalysis. Proteins 2017; 85:1656-1665. © 2017 Wiley Periodicals, Inc.
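
    A pH-rate profile of the kind mentioned above is commonly fitted to a single-ionisation model to extract the catalytic pKa. The minimal sketch below fits k_obs = k_max / (1 + 10^(pKa - pH)) to invented data points; it illustrates the analysis only and does not use the authors' measurements.

    # Hedged sketch: estimating a catalytic histidine pKa from a pH-rate profile.
    import numpy as np
    from scipy.optimize import curve_fit

    pH = np.array([5.0, 5.5, 6.0, 6.5, 7.0, 7.5, 8.0, 8.5])
    k_obs = np.array([0.08, 0.15, 0.27, 0.45, 0.62, 0.74, 0.80, 0.82])  # assumed values

    def profile(pH, k_max, pKa):
        # Activity tracks the fraction of deprotonated (nucleophilic) histidine.
        return k_max / (1.0 + 10.0 ** (pKa - pH))

    (k_max, pKa), _ = curve_fit(profile, pH, k_obs, p0=(1.0, 6.5))
    print(f"fitted k_max = {k_max:.2f}, pKa = {pKa:.2f}")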

  2. De novo mutations in the genome organizer CTCF cause intellectual disability

    DEFF Research Database (Denmark)

    Gregor, Anne; Oti, Martin; Kouwenhoven, Evelyn N

    2013-01-01

    An increasing number of genes involved in chromatin structure and epigenetic regulation has been implicated in a variety of developmental disorders, often including intellectual disability. By trio exome sequencing and subsequent mutational screening we now identified two de novo frameshift...... mutations and one de novo missense mutation in CTCF in individuals with intellectual disability, microcephaly, and growth retardation. Furthermore, an individual with a larger deletion including CTCF was identified. CTCF (CCCTC-binding factor) is one of the most important chromatin organizers in vertebrates...... and is involved in various chromatin regulation processes such as higher order of chromatin organization, enhancer function, and maintenance of three-dimensional chromatin structure. Transcriptome analyses in all three individuals with point mutations revealed deregulation of genes involved in signal transduction...

  3. FreeContact: fast and free software for protein contact prediction from residue co-evolution.

    Science.gov (United States)

    Kaján, László; Hopf, Thomas A; Kalaš, Matúš; Marks, Debora S; Rost, Burkhard

    2014-03-26

    20 years of improved technology and growing sequences now renders residue-residue contact constraints in large protein families through correlated mutations accurate enough to drive de novo predictions of protein three-dimensional structure. The method EVfold broke new ground using mean-field Direct Coupling Analysis (EVfold-mfDCA); the method PSICOV applied a related concept by estimating a sparse inverse covariance matrix. Both methods (EVfold-mfDCA and PSICOV) are publicly available, but both require too much CPU time for interactive applications. On top, EVfold-mfDCA depends on proprietary software. Here, we present FreeContact, a fast, open source implementation of EVfold-mfDCA and PSICOV. On a test set of 140 proteins, FreeContact was almost eight times faster than PSICOV without decreasing prediction performance. The EVfold-mfDCA implementation of FreeContact was over 220 times faster than PSICOV with negligible performance decrease. EVfold-mfDCA was unavailable for testing due to its dependency on proprietary software. FreeContact is implemented as the free C++ library "libfreecontact", complete with command line tool "freecontact", as well as Perl and Python modules. All components are available as Debian packages. FreeContact supports the BioXSD format for interoperability. FreeContact provides the opportunity to compute reliable contact predictions in any environment (desktop or cloud).
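
    As a toy illustration of correlated-mutation scoring, the sketch below ranks column pairs of a tiny invented alignment by mutual information. This is only the naive starting point; mfDCA and PSICOV, which FreeContact implements, additionally disentangle direct from indirect couplings.

    # Toy correlated-mutation scoring over an alignment using mutual information.
    import numpy as np
    from collections import Counter

    msa = ["MKVLA", "MKILA", "MRVLG", "MRILG", "MKVLA"]   # assumed tiny alignment
    n_seq, n_col = len(msa), len(msa[0])

    def column(j):
        return [s[j] for s in msa]

    def mutual_information(i, j):
        pi = Counter(column(i)); pj = Counter(column(j))
        pij = Counter(zip(column(i), column(j)))
        mi = 0.0
        for (a, b), c in pij.items():
            p_ab = c / n_seq
            mi += p_ab * np.log(p_ab / (pi[a] / n_seq * pj[b] / n_seq))
        return mi

    scores = {(i, j): mutual_information(i, j)
              for i in range(n_col) for j in range(i + 1, n_col)}
    for pair, s in sorted(scores.items(), key=lambda kv: -kv[1])[:3]:
        print(pair, round(s, 3))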

  4. A computational approach for thermomechanical fatigue life prediction of dissimilarly welded superheater tubes

    Energy Technology Data Exchange (ETDEWEB)

    Krishnasamy, Ram-Kumar; Seifert, Thomas; Siegele, Dieter [Fraunhofer-Institut fuer Werkstoffmechanik (IWM), Freiburg im Breisgau (Germany)

    2010-07-01

    In this paper a computational approach for fatigue life prediction of dissimilarly welded superheater tubes is presented and applied to a dissimilar weld between tubes made of the nickel-base alloy Alloy617 and the 12% chromium steel VM12. The approach comprises the calculation of the residual stresses in the welded tubes with a multi-pass dissimilar welding simulation, the relaxation of the residual stresses in a post weld heat treatment (PWHT) simulation and the fatigue life prediction using the remaining residual stresses as initial condition. A cyclic viscoplasticity model is used to calculate the transient stresses and strains under thermocyclic service loadings. The fatigue life is predicted with a damage parameter which is based on fracture mechanics. The adjustable parameters of the model are determined based on LCF and TMF experiments. The simulations show that the residual stresses that remain after PWHT further relax in the first loading cycles. The predicted fatigue lives depend on the residual stresses and, thus, on the choice of the loading cycle in which the damage parameter is evaluated. If the first loading cycle, where residual stresses are still present, is considered, lower fatigue lives are predicted compared to predictions considering loading cycles with relaxed residual stresses. (orig.)

  5. De novo status epilepticus is associated with adverse outcome: An 11-year retrospective study in Hong Kong.

    Science.gov (United States)

    Lui, Hoi Ki Kate; Hui, Kwok Fai; Fong, Wing Chi; Ip, Chun Tak; Lui, Hiu Tung Colin

    2016-08-01

    To identify predictors of poor clinical outcome in patients presenting to the intensive care units with status epilepticus (SE), in particular for patients presenting with de novo status epilepticus. A retrospective review was performed on patients admitted to the intensive care units with status epilepticus in two hospitals in Hong Kong over an 11-year period from 2003 to 2013. A total of 87 SE cases were analyzed. The mean age of patients was 49.3 years (SD 14.9 years). Eighteen subjects (20.7%) had breakthrough seizure, which was the most common etiology for the status epilepticus episodes. Seventy-eight subjects (89.7%) had convulsive status epilepticus (CSE) and 9 subjects (10.3%) had non-convulsive status epilepticus (NCSE) on presentation. The 30-day mortality rate of all subjects was 18.4%. Non-convulsive status epilepticus was more common in patients with de novo status epilepticus when compared to those with an existing history of epilepsy (15.5% Vs. 0%, p=0.03). Patients with de novo status epilepticus were older (52 Vs 43, p=0.009). De novo status epilepticus was associated with longer status duration (median 2.5 days, IQR 5 days), longer ICU stay (median 7.5 days, IQR 9 days) and poorer outcome (OR 4.15, 95% CI 1.53-11.2). For patients presenting to intensive care units with status epilepticus, those with de novo status epilepticus were older and were more likely to develop non-convulsive status epilepticus. De novo status epilepticus was associated with poorer outcome. Continuous EEG monitoring would help identify NCSE and potentially help improve clinical outcomes. Copyright © 2016 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.

  6. Melhoramento do cafeeiro: XXXVIII. Observações sobre progênies do cultivar Mundo-Novo de Coffea arabica na estação experimental de Mococa Coffee breeding: XXXVIII-observation on progenies of the Mundo-Novo cultivars of Coffea arabica in the Mococa experimental station

    Directory of Open Access Journals (Sweden)

    Túlio R. Rocha

    1980-01-01

    Full Text Available Data analysed from the experiment located at Mococa on the yield of 112 progenies of the Mundo-Novo S1 and S2, Bourbon-Amarelo, Bourbon-Vermelho and Caturra-Vermelho cultivars of Coffea arabica over the period 1955 to 1971 indicated that the Mundo-Novo S1 progenies with prefixes MP 474, MP 502, MP 469, MP 492 and MP 475 were the most productive, resembling some 'Mundo-Novo' S2 progenies. Among these, the progeny with prefix MP 388-6 stood out, reaching the highest production level in the experiment. The 'Mundo-Novo' progenies as a group yielded 44% more than those of Bourbon-Amarelo, which in turn yielded 60% more than those of Bourbon-Vermelho and Caturra-Vermelho. Plant height and canopy diameter reached higher mean values in the 'Mundo-Novo' progenies. Positive and highly significant correlations were found between mean plant height and mean canopy diameter and the yield of the progenies. The most productive progenies showed a processing ratio (ratio between the weight of ripe and of hulled coffee) of approximately 6.0 and a percentage of normal, flat-type seeds above 80. Regarding the size of the flat-type seeds, two 'Mundo-Novo' S1 progenies, MP 474 and MP 452, presented a larger mean screen size, allowing selection of plants with this trait and with high yield. Coffee progenies of the Mundo-Novo cultivars of Coffea arabica were studied in an experiment located at the Mococa Experimental Station of the Instituto Agronômico in comparison with Bourbon-Amarelo, Bourbon-Vermelho and Caturra-Vermelho cultivars of the same species. During a period of 17 consecutive cropping years (1955-1971), Mundo-Novo yielded approximately 44% more than Bourbon-Amarelo and this cultivar yielded 60% more than Bourbon-Vermelho and Caturra-Vermelho. Among the 89 S1 'Mundo-Novo' progenies, MP 474, MP 502, MP 469, MP 492 and MP 475 yielded as much as the two best 'Mundo-Novo' S2 progenies. Greater

  7. Progression of MDS-UPDRS Scores Over Five Years in De Novo Parkinson Disease from the Parkinson's Progression Markers Initiative Cohort.

    Science.gov (United States)

    Holden, Samantha K; Finseth, Taylor; Sillau, Stefan H; Berman, Brian D

    2018-01-01

    The Movement Disorder Society Unified Parkinson Disease Rating Scale (MDS-UPDRS) is a commonly used tool to measure Parkinson disease (PD) progression. Longitudinal changes in MDS-UPDRS scores in de novo PD have not been established. Our aim was to determine progression rates of MDS-UPDRS scores in de novo PD. 362 participants from the Parkinson's Progression Markers Initiative, a multicenter longitudinal cohort study of de novo PD, were included. Longitudinal progression of MDS-UPDRS total and subscale scores was modeled using mixed model regression. MDS-UPDRS scores increased in a linear fashion over five years in de novo PD. MDS-UPDRS total score increased an estimated 4.0 points/year, Part I 0.25 points/year, Part II 1.0 points/year, and Part III 2.4 points/year. The expected average progression of MDS-UPDRS scores in de novo PD from this study can assist in clinical monitoring and provide comparative data for detection of disease modification in treatment trials.
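
    The sketch below shows a mixed-effects progression model of the type described, fitted with statsmodels to synthetic longitudinal data that use the reported ~4 points/year total-score slope only as a plausible generating value; it is not the PPMI dataset or the authors' exact model specification.

    # Sketch of a random-intercept, random-slope progression model on synthetic data.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    rows = []
    for pid in range(100):
        intercept = 32 + rng.normal(0, 8)
        slope = 4.0 + rng.normal(0, 1.5)          # assumed points/year
        for year in np.arange(0, 5.5, 0.5):
            rows.append({"pid": pid, "year": year,
                         "updrs_total": intercept + slope * year + rng.normal(0, 3)})
    df = pd.DataFrame(rows)

    model = smf.mixedlm("updrs_total ~ year", df, groups=df["pid"], re_formula="~year")
    fit = model.fit()
    print(fit.params["year"])                     # estimated progression per year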

  8. De Novo Coding Variants Are Strongly Associated with Tourette Disorder

    DEFF Research Database (Denmark)

    Willsey, A Jeremy; Fernandez, Thomas V; Yu, Dongmei

    2017-01-01

    Whole-exome sequencing (WES) and de novo variant detection have proven a powerful approach to gene discovery in complex neurodevelopmental disorders. We have completed WES of 325 Tourette disorder trios from the Tourette International Collaborative Genetics cohort and a replication sample of 186 ...

  9. Towards a general theory of neural computation based on prediction by single neurons.

    Directory of Open Access Journals (Sweden)

    Christopher D Fiorillo

    Full Text Available Although there has been tremendous progress in understanding the mechanics of the nervous system, there has not been a general theory of its computational function. Here I present a theory that relates the established biophysical properties of single generic neurons to principles of Bayesian probability theory, reinforcement learning and efficient coding. I suggest that this theory addresses the general computational problem facing the nervous system. Each neuron is proposed to mirror the function of the whole system in learning to predict aspects of the world related to future reward. According to the model, a typical neuron receives current information about the state of the world from a subset of its excitatory synaptic inputs, and prior information from its other inputs. Prior information would be contributed by synaptic inputs representing distinct regions of space, and by different types of non-synaptic, voltage-regulated channels representing distinct periods of the past. The neuron's membrane voltage is proposed to signal the difference between current and prior information ("prediction error" or "surprise". A neuron would apply a Hebbian plasticity rule to select those excitatory inputs that are the most closely correlated with reward but are the least predictable, since unpredictable inputs provide the neuron with the most "new" information about future reward. To minimize the error in its predictions and to respond only when excitation is "new and surprising," the neuron selects amongst its prior information sources through an anti-Hebbian rule. The unique inputs of a mature neuron would therefore result from learning about spatial and temporal patterns in its local environment, and by extension, the external world. Thus the theory describes how the structure of the mature nervous system could reflect the structure of the external world, and how the complexity and intelligence of the system might develop from a population of

  10. De Novo Insertions and Deletions of Predominantly Paternal Origin Are Associated with Autism Spectrum Disorder

    Directory of Open Access Journals (Sweden)

    Shan Dong

    2014-10-01

    Full Text Available Summary: Whole-exome sequencing (WES) studies have demonstrated the contribution of de novo loss-of-function single-nucleotide variants (SNVs) to autism spectrum disorder (ASD). However, challenges in the reliable detection of de novo insertions and deletions (indels) have limited inclusion of these variants in prior analyses. By applying a robust indel detection method to WES data from 787 ASD families (2,963 individuals), we demonstrate that de novo frameshift indels contribute to ASD risk (OR = 1.6; 95% CI = 1.0–2.7; p = 0.03), are more common in female probands (p = 0.02), are enriched among genes encoding FMRP targets (p = 6 × 10−9), and arise predominantly on the paternal chromosome (p < 0.001). On the basis of mutation rates in probands versus unaffected siblings, we conclude that de novo frameshift indels contribute to risk in approximately 3% of individuals with ASD. Finally, by observing clustering of mutations in unrelated probands, we uncover two ASD-associated genes: KMT2E (MLL5), a chromatin regulator, and RIMS1, a regulator of synaptic vesicle release. Insertions and deletions (indels) have proven especially difficult to detect in exome sequencing data. Dong et al. now identify indels in exome data for 787 autism spectrum disorder (ASD) families. They demonstrate association between de novo indels that alter the reading frame and ASD. Furthermore, by observing clustering of indels in unrelated probands, they uncover two additional ASD-associated genes: KMT2E (MLL5), a chromatin regulator, and RIMS1, a regulator of synaptic vesicle release.
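
    The core burden comparison reads as a two-by-two contrast of de novo frameshift indel carriers in probands versus unaffected siblings. The sketch below shows that calculation with invented counts chosen only to illustrate the odds-ratio and Fisher-test mechanics, not the study's actual numbers.

    # Back-of-the-envelope burden comparison with assumed counts.
    from scipy.stats import fisher_exact

    probands_with, probands_without = 60, 727      # assumed counts
    siblings_with, siblings_without = 38, 749

    table = [[probands_with, probands_without],
             [siblings_with, siblings_without]]
    odds_ratio, p_value = fisher_exact(table, alternative="greater")
    print(f"OR = {odds_ratio:.2f}, one-sided p = {p_value:.3f}")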

  11. Novo-desenvolvimento, capital social e desigualdade social

    Directory of Open Access Journals (Sweden)

    Ana Cristina de Oliveira Oliveira

    2012-03-01

    Full Text Available This article addresses the tendency to confront social inequality through, in the economic field, the new-developmentalist approach and, in the political and ideological field, the notion of social capital, in an attempt to achieve a "capitalism with a more human face". We discuss two sets of questions, considering the specificity of the Brazilian social formation of dependent capitalism: (1) the "construction of strong States" to secure the conditions for capital accumulation, widening the margins of the consumer market, alleviating poverty and controlling possible political tensions; and (2) the dissemination of the need to build a society in harmony, which translates into the incorporation of the entrepreneurial ethic of business owners into all social spheres. We understand that this political-economic scope reveals a new pedagogy of hegemony, sustained by a supposed alternative for managing the new expressions of the "social question", aimed at educating conformity and concealing class conflict.
    Keywords: social question; new-developmentalism; social capital; forced inclusion

  12. From structure prediction to genomic screens for novel non-coding RNAs.

    Directory of Open Access Journals (Sweden)

    Jan Gorodkin

    2011-08-01

    Full Text Available Non-coding RNAs (ncRNAs) are receiving more and more attention not only as an abundant class of genes, but also as regulatory structural elements (some located in mRNAs). A key feature of RNA function is its structure. Computational methods were developed early for folding and prediction of RNA structure with the aim of assisting in functional analysis. With the discovery of more and more ncRNAs, it has become clear that a large fraction of these are highly structured. Interestingly, a large part of the structure is comprised of regular Watson-Crick and GU wobble base pairs. This and the increased amount of available genomes have made it possible to employ structure-based methods for genomic screens. The field has moved from folding prediction of single sequences to computational screens for ncRNAs in genomic sequence using the RNA structure as the main characteristic feature. Whereas early methods focused on energy-directed folding of single sequences, comparative analysis based on structure preserving changes of base pairs has been efficient in improving accuracy, and today this constitutes a key component in genomic screens. Here, we cover the basic principles of RNA folding and touch upon some of the concepts in current methods that have been applied in genomic screens for de novo RNA structures in searches for novel ncRNA genes and regulatory RNA structure on mRNAs. We discuss the strengths and weaknesses of the different strategies and how they can complement each other.
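
    For readers unfamiliar with the folding prediction mentioned above, the simplest formal model is base-pair maximisation. The sketch below implements a Nussinov-style dynamic programme on a toy sequence; real genomic screens use thermodynamic nearest-neighbour energies and comparative (covariance) information rather than raw pair counts.

    # Toy Nussinov-style base-pair maximisation (illustrative only).
    def pairs(a, b):
        return {a, b} in ({"A", "U"}, {"G", "C"}, {"G", "U"})

    def nussinov(seq, min_loop=3):
        n = len(seq)
        dp = [[0] * n for _ in range(n)]
        for span in range(min_loop + 1, n):
            for i in range(n - span):
                j = i + span
                best = dp[i + 1][j]                      # base i left unpaired
                for k in range(i + min_loop + 1, j + 1): # base i paired with k
                    if pairs(seq[i], seq[k]):
                        left = dp[i + 1][k - 1]
                        right = dp[k + 1][j] if k + 1 <= j else 0
                        best = max(best, 1 + left + right)
                dp[i][j] = best
        return dp[0][n - 1]

    print(nussinov("GGGAAAUCC"))   # maximum number of nested base pairs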

  13. Computational Prediction of MicroRNAs from Toxoplasma gondii Potentially Regulating the Hosts’ Gene Expression

    Directory of Open Access Journals (Sweden)

    Müşerref Duygu Saçar

    2014-10-01

    Full Text Available MicroRNAs (miRNAs) were discovered two decades ago, yet there is still a great need for further studies elucidating their genesis and targeting in different phyla. Since experimental discovery and validation of miRNAs is difficult, computational predictions are indispensable and today most computational approaches employ machine learning. Toxoplasma gondii, a parasite residing within the cells of its hosts like human, uses miRNAs for its post-transcriptional gene regulation. It may also regulate its hosts’ gene expression, which has been shown in brain cancer. Since previous studies have shown that overexpressed miRNAs within the host are causal for disease onset, we hypothesized that T. gondii could export miRNAs into its host cell. We computationally predicted all hairpins from the genome of T. gondii and used mouse and human models to filter possible candidates. These were then further compared to known miRNAs in human and rodents and their expression was examined for T. gondii grown in mouse and human hosts, respectively. We found that among the millions of potential hairpins in T. gondii, only a few thousand pass filtering using a human or mouse model and that even fewer of those are expressed. Since they are expressed and differentially expressed in rodents and human, we suggest that there is a chance that T. gondii may export miRNAs into its hosts for direct regulation.

  14. De novo mutations in ATP1A3 cause alternating hemiplegia of childhood

    DEFF Research Database (Denmark)

    Heinzen, Erin L; Swoboda, Kathryn J; Hitomi, Yuki

    2012-01-01

    and their unaffected parents to identify de novo nonsynonymous mutations in ATP1A3 in all seven individuals. In a subsequent sequence analysis of ATP1A3 in 98 other patients with AHC, we found that ATP1A3 mutations were likely to be responsible for at least 74% of the cases; we also identified one inherited mutation...... affecting the level of protein expression. This work identifies de novo ATP1A3 mutations as the primary cause of AHC and offers insight into disease pathophysiology by expanding the spectrum of phenotypes associated with mutations in ATP1A3....

  15. Computational predictive methods for fracture and fatigue

    Science.gov (United States)

    Cordes, J.; Chang, A. T.; Nelson, N.; Kim, Y.

    1994-09-01

    The damage-tolerant design philosophy as used by aircraft industries enables aircraft components and aircraft structures to operate safely with minor damage, small cracks, and flaws. Maintenance and inspection procedures insure that damages developed during service remain below design values. When damage is found, repairs or design modifications are implemented and flight is resumed. Design and redesign guidelines, such as military specifications MIL-A-83444, have successfully reduced the incidence of damage and cracks. However, fatigue cracks continue to appear in aircraft well before the design life has expired. The F16 airplane, for instance, developed small cracks in the engine mount, wing support, bulk heads, the fuselage upper skin, the fuel shelf joints, and along the upper wings. Some cracks were found after 600 hours of the 8000 hour design service life and design modifications were required. Tests on the F16 plane showed that the design loading conditions were close to the predicted loading conditions. Improvements to analytic methods for predicting fatigue crack growth adjacent to holes, when multiple damage sites are present, and in corrosive environments would result in more cost-effective designs, fewer repairs, and fewer redesigns. The overall objective of the research described in this paper is to develop, verify, and extend the computational efficiency of analysis procedures necessary for damage tolerant design. This paper describes an elastic/plastic fracture method and an associated fatigue analysis method for damage tolerant design. Both methods are unique in that material parameters such as fracture toughness, R-curve data, and fatigue constants are not required. The methods are implemented with a general-purpose finite element package. Several proof-of-concept examples are given. With further development, the methods could be extended for analysis of multi-site damage, creep-fatigue, and corrosion fatigue problems.

  16. Catalysis by a de novo zinc-mediated protein interface: implications for natural enzyme evolution and rational enzyme engineering.

    Science.gov (United States)

    Der, Bryan S; Edwards, David R; Kuhlman, Brian

    2012-05-08

    Here we show that a recent computationally designed zinc-mediated protein interface is serendipitously capable of catalyzing carboxyester and phosphoester hydrolysis. Although the original motivation was to design a de novo zinc-mediated protein-protein interaction (called MID1-zinc), we observed in the homodimer crystal structure a small cleft and open zinc coordination site. We investigated if the cleft and zinc site at the designed interface were sufficient for formation of a primitive active site that can perform hydrolysis. MID1-zinc hydrolyzes 4-nitrophenyl acetate with a rate acceleration of 10(5) and a k(cat)/K(M) of 630 M(-1) s(-1) and 4-nitrophenyl phosphate with a rate acceleration of 10(4) and a k(cat)/K(M) of 14 M(-1) s(-1). These rate accelerations by an unoptimized active site highlight the catalytic power of zinc and suggest that the clefts formed by protein-protein interactions are well-suited for creating enzyme active sites. This discovery has implications for protein evolution and engineering: from an evolutionary perspective, three-coordinated zinc at a homodimer interface cleft represents a simple evolutionary path to nascent enzymatic activity; from a protein engineering perspective, future efforts in de novo design of enzyme active sites may benefit from exploring clefts at protein interfaces for active site placement.

  17. A computer model to predict temperatures and gas flows during AGR fuel handling

    International Nuclear Information System (INIS)

    Bishop, D.C.; Bowler, P.G.

    1986-01-01

    The paper describes the development of a comprehensive computer model (HOSTAGE) that has been developed for the Heysham II/Torness AGRs to predict temperature transients for all the important components during normal and fault conditions. It models not only the charge and discharge or fuel from an on-load reactor but also follows the fuel down the rest of the fuel route until it is dismantled. The main features of the physical model of gas and heat flow are described. Experimental results are used where appropriate and an indication will be given of how the predictions by HOSTAGE correlate with operating AGR reactors. The role of HOSTAGE in the Heysham II/Torness safety case is briefly discussed. (author)

  18. Computational design of proteins with novel structure and functions

    International Nuclear Information System (INIS)

    Yang Wei; Lai Lu-Hua

    2016-01-01

    Computational design of proteins is a relatively new field, where scientists search the enormous sequence space for sequences that can fold into desired structure and perform desired functions. With the computational approach, proteins can be designed, for example, as regulators of biological processes, novel enzymes, or as biotherapeutics. These approaches not only provide valuable information for understanding of sequence–structure–function relations in proteins, but also hold promise for applications to protein engineering and biomedical research. In this review, we briefly introduce the rationale for computational protein design, then summarize the recent progress in this field, including de novo protein design, enzyme design, and design of protein–protein interactions. Challenges and future prospects of this field are also discussed. (topical review)

  19. Archaeology Through Computational Linguistics: Inscription Statistics Predict Excavation Sites of Indus Valley Artifacts.

    Science.gov (United States)

    Recchia, Gabriel L; Louwerse, Max M

    2016-11-01

    Computational techniques comparing co-occurrences of city names in texts allow the relative longitudes and latitudes of cities to be estimated algorithmically. However, these techniques have not been applied to estimate the provenance of artifacts with unknown origins. Here, we estimate the geographic origin of artifacts from the Indus Valley Civilization, applying methods commonly used in cognitive science to the Indus script. We show that these methods can accurately predict the relative locations of archeological sites on the basis of artifacts of known provenance, and we further apply these techniques to determine the most probable excavation sites of four sealings of unknown provenance. These findings suggest that inscription statistics reflect historical interactions among locations in the Indus Valley region, and they illustrate how computational methods can help localize inscribed archeological artifacts of unknown origin. The success of this method offers opportunities for the cognitive sciences in general and for computational anthropology specifically. Copyright © 2015 Cognitive Science Society, Inc.
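
    A rough sketch of the co-occurrence idea follows: names that appear together frequently are treated as geographically close, and multidimensional scaling then recovers relative two-dimensional positions. The site names and counts below are invented stand-ins, and the published study uses considerably more refined statistics than this.

    # Toy co-occurrence-to-coordinates sketch with multidimensional scaling.
    import numpy as np
    from sklearn.manifold import MDS

    names = ["SiteA", "SiteB", "SiteC", "SiteD"]
    cooc = np.array([[0, 30, 5, 2],
                     [30, 0, 8, 3],
                     [5, 8, 0, 20],
                     [2, 3, 20, 0]], dtype=float)

    # Convert similarity (co-occurrence) to dissimilarity; add-1 avoids division by zero.
    dissimilarity = 1.0 / (1.0 + cooc)
    np.fill_diagonal(dissimilarity, 0.0)

    coords = MDS(n_components=2, dissimilarity="precomputed",
                 random_state=0).fit_transform(dissimilarity)
    for name, (x, y) in zip(names, coords):
        print(f"{name}: ({x:+.2f}, {y:+.2f})")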

  20. De novo protein structure prediction by dynamic fragment assembly and conformational space annealing.

    Science.gov (United States)

    Lee, Juyong; Lee, Jinhyuk; Sasaki, Takeshi N; Sasai, Masaki; Seok, Chaok; Lee, Jooyoung

    2011-08-01

    Ab initio protein structure prediction is a challenging problem that requires both an accurate energetic representation of a protein structure and an efficient conformational sampling method for successful protein modeling. In this article, we present an ab initio structure prediction method which combines a recently suggested novel way of fragment assembly, dynamic fragment assembly (DFA) and conformational space annealing (CSA) algorithm. In DFA, model structures are scored by continuous functions constructed based on short- and long-range structural restraint information from a fragment library. Here, DFA is represented by the full-atom model by CHARMM with the addition of the empirical potential of DFIRE. The relative contributions between various energy terms are optimized using linear programming. The conformational sampling was carried out with CSA algorithm, which can find low energy conformations more efficiently than simulated annealing used in the existing DFA study. The newly introduced DFA energy function and CSA sampling algorithm are implemented into CHARMM. Test results on 30 small single-domain proteins and 13 template-free modeling targets of the 8th Critical Assessment of protein Structure Prediction show that the current method provides comparable and complementary prediction results to existing top methods. Copyright © 2011 Wiley-Liss, Inc.
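
    The sketch below illustrates annealing-based conformational search on a toy two-dimensional chain with an invented compactness energy. It is plain simulated annealing, which the abstract contrasts with CSA; CSA instead maintains a bank of diverse conformations and anneals a distance cut-off, and the real method assembles fragments under DFA restraints rather than perturbing single angles.

    # Toy annealing search over a 2-D chain (not CSA, not the DFA energy).
    import numpy as np

    rng = np.random.default_rng(7)
    n_res = 10
    angles = rng.uniform(-np.pi, np.pi, n_res - 1)     # toy "backbone" turn angles

    def coordinates(angles):
        # Unit-length virtual bonds laid out by cumulative turning angles.
        directions = np.cumsum(np.concatenate([[0.0], angles]))
        steps = np.stack([np.cos(directions), np.sin(directions)], axis=1)
        return np.vstack([[0.0, 0.0], np.cumsum(steps, axis=0)])

    def energy(angles):
        # Toy score: favour a small radius of gyration, penalise steric clashes.
        xyz = coordinates(angles)
        rg = np.sqrt(((xyz - xyz.mean(0)) ** 2).sum(1).mean())
        d = np.linalg.norm(xyz[:, None] - xyz[None, :], axis=-1)
        clashes = ((d < 0.8).sum() - len(xyz)) / 2     # exclude self-distances
        return rg + 5.0 * clashes

    temperature = 1.0
    current_e = energy(angles)
    for step in range(5000):
        trial = angles.copy()
        trial[rng.integers(len(trial))] += rng.normal(0, 0.3)   # local move
        trial_e = energy(trial)
        if trial_e < current_e or rng.random() < np.exp((current_e - trial_e) / temperature):
            angles, current_e = trial, trial_e
        temperature *= 0.999                            # annealing schedule
    print("final toy energy:", round(current_e, 3))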

  1. Novel prediction model of renal function after nephrectomy from automated renal volumetry with preoperative multidetector computed tomography (MDCT).

    Science.gov (United States)

    Isotani, Shuji; Shimoyama, Hirofumi; Yokota, Isao; Noma, Yasuhiro; Kitamura, Kousuke; China, Toshiyuki; Saito, Keisuke; Hisasue, Shin-ichi; Ide, Hisamitsu; Muto, Satoru; Yamaguchi, Raizo; Ukimura, Osamu; Gill, Inderbir S; Horie, Shigeo

    2015-10-01

    The predictive model of postoperative renal function may impact on planning nephrectomy. To develop the novel predictive model using combination of clinical indices with computer volumetry to measure the preserved renal cortex volume (RCV) using multidetector computed tomography (MDCT), and to prospectively validate performance of the model. Total 60 patients undergoing radical nephrectomy from 2011 to 2013 participated, including a development cohort of 39 patients and an external validation cohort of 21 patients. RCV was calculated by voxel count using software (Vincent, FUJIFILM). Renal function before and after radical nephrectomy was assessed via the estimated glomerular filtration rate (eGFR). Factors affecting postoperative eGFR were examined by regression analysis to develop the novel model for predicting postoperative eGFR with a backward elimination method. The predictive model was externally validated and the performance of the model was compared with that of the previously reported models. The postoperative eGFR value was associated with age, preoperative eGFR, preserved renal parenchymal volume (RPV), preserved RCV, % of RPV alteration, and % of RCV alteration (p volumetry and clinical indices might yield an important tool for predicting postoperative renal function.
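
    A minimal sketch of a backward-elimination regression for postoperative eGFR follows, assuming synthetic data and a 0.05 retention threshold; the variable names echo the abstract (age, preoperative eGFR, preserved cortex volume) but the coefficients and the inclusion of BMI are invented for illustration.

    # Backward elimination over a synthetic postoperative-eGFR dataset.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(3)
    n = 60
    df = pd.DataFrame({
        "age": rng.normal(62, 10, n),
        "pre_egfr": rng.normal(75, 15, n),
        "preserved_rcv_pct": rng.uniform(40, 80, n),
        "bmi": rng.normal(24, 3, n),              # deliberately uninformative
    })
    df["post_egfr"] = (0.55 * df.pre_egfr + 0.3 * df.preserved_rcv_pct
                       - 0.2 * df.age + rng.normal(0, 5, n))

    predictors = list(df.columns[:-1])
    while True:
        X = sm.add_constant(df[predictors])
        fit = sm.OLS(df["post_egfr"], X).fit()
        worst = fit.pvalues.drop("const").idxmax()
        if fit.pvalues[worst] < 0.05 or len(predictors) == 1:
            break
        predictors.remove(worst)                  # drop the least significant term
    print("retained predictors:", predictors)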

  2. Prediction of pork loin quality using online computer vision system and artificial intelligence model.

    Science.gov (United States)

    Sun, Xin; Young, Jennifer; Liu, Jeng-Hung; Newman, David

    2018-06-01

    The objective of this project was to develop a computer vision system (CVS) for objective measurement of pork loin under industry speed requirement. Color images of pork loin samples were acquired using a CVS. Subjective color and marbling scores were determined according to the National Pork Board standards by a trained evaluator. Instrument color measurement and crude fat percentage were used as control measurements. Image features (18 color features; 1 marbling feature; 88 texture features) were extracted from whole pork loin color images. Artificial intelligence prediction model (support vector machine) was established for pork color and marbling quality grades. The results showed that CVS with support vector machine modeling reached the highest prediction accuracy of 92.5% for measured pork color score and 75.0% for measured pork marbling score. This research shows that the proposed artificial intelligence prediction model with CVS can provide an effective tool for predicting color and marbling in the pork industry at online speeds. Copyright © 2018 Elsevier Ltd. All rights reserved.
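
    The grading pipeline above reduces to feature extraction followed by a classifier. The sketch below stands in synthetic vectors for the 107 image features (18 colour, 1 marbling, 88 texture) and cross-validates an RBF support vector machine; the class labels and data are assumptions, not the authors' images or tuned model.

    # Sketch of feature-based quality grading with a support vector machine.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(4)
    X = rng.normal(size=(120, 107))               # 18 colour + 1 marbling + 88 texture
    y = rng.integers(1, 6, size=120)              # assumed colour score classes 1-5

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10, gamma="scale"))
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"cross-validated accuracy: {acc:.2f}")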

  3. Decision tree analysis to stratify risk of de novo non-melanoma skin cancer following liver transplantation.

    Science.gov (United States)

    Tanaka, Tomohiro; Voigt, Michael D

    2018-03-01

    Non-melanoma skin cancer (NMSC) is the most common de novo malignancy in liver transplant (LT) recipients; it behaves more aggressively and it increases mortality. We used decision tree analysis to develop a tool to stratify and quantify risk of NMSC in LT recipients. We performed Cox regression analysis to identify which predictive variables to enter into the decision tree analysis. Data were from the Organ Procurement Transplant Network (OPTN) STAR files of September 2016 (n = 102984). NMSC developed in 4556 of the 105984 recipients, a mean of 5.6 years after transplant. The 5/10/20-year rates of NMSC were 2.9/6.3/13.5%, respectively. Cox regression identified male gender, Caucasian race, age, body mass index (BMI) at LT, and sirolimus use as key predictive or protective factors for NMSC. These factors were entered into a decision tree analysis. The final tree stratified non-Caucasians as low risk (0.8%), and Caucasian males > 47 years, BMI decision tree model accurately stratifies the risk of developing NMSC in the long-term after LT.
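
    A hedged sketch of a decision-tree stratifier using the predictors named above (gender, race, age, BMI, sirolimus use) on synthetic data whose event rates are invented; it shows how such a tree is grown and printed, not the OPTN-derived tree itself.

    # Decision-tree risk stratification on synthetic post-transplant data.
    import numpy as np
    import pandas as pd
    from sklearn.tree import DecisionTreeClassifier, export_text

    rng = np.random.default_rng(5)
    n = 5000
    df = pd.DataFrame({
        "male": rng.integers(0, 2, n),
        "caucasian": rng.integers(0, 2, n),
        "age": rng.normal(55, 12, n),
        "bmi": rng.normal(28, 5, n),
        "sirolimus": rng.integers(0, 2, n),
    })
    # Invented risk surface loosely echoing the reported high-risk strata.
    risk = 0.02 + 0.05 * df.caucasian * df.male + 0.001 * np.clip(df.age - 47, 0, None)
    df["nmsc"] = rng.random(n) < risk

    tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=200)
    tree.fit(df.drop(columns="nmsc"), df["nmsc"])
    print(export_text(tree, feature_names=list(df.columns[:-1])))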

  4. De Novo Prediction of Stem Cell Identity using Single-Cell Transcriptome Data

    NARCIS (Netherlands)

    Grun, D.; Muraro, M.J.; Boisset, J.C.; Wiebrands, K.; Lyubimova, A.; Dharmadhikari, G.; Born, M. van den; Es, J. van; Jansen, E.; Clevers, H.; Koning, E.J. de; Oudenaarden, A. van

    2016-01-01

    Adult mitotic tissues like the intestine, skin, and blood undergo constant turnover throughout the life of an organism. Knowing the identity of the stem cell is crucial to understanding tissue homeostasis and its aberrations upon disease. Here we present a computational method for the derivation of

  5. Prediction of infarction development after endovascular stroke therapy with dual-energy computed tomography

    International Nuclear Information System (INIS)

    Djurdjevic, Tanja; Gizewski, Elke Ruth; Grams, Astrid Ellen; Rehwald, Rafael; Glodny, Bernhard; Knoflach, Michael; Matosevic, Benjamin; Kiechl, Stefan

    2017-01-01

    After intraarterial recanalisation (IAR), the haemorrhage and the blood-brain barrier (BBB) disruption can be distinguished using dual-energy computed tomography (DECT). The aim of the present study was to investigate whether future infarction development can be predicted from DECT. DECT scans of 20 patients showing 45 BBB disrupted areas after IAR were assessed and compared with follow-up examinations. Receiver operator characteristic (ROC) analyses using densities from the iodine map (IM) and virtual non-contrast (VNC) were performed. Future infarction areas are denser than future non-infarction areas on IM series (23.44 ± 24.86 vs. 5.77 ± 2.77; p < 0.0001) and more hypodense on VNC series (29.71 ± 3.33 vs. 35.33 ± 3.50; p < 0.0001). ROC analyses for the IM series showed an area under the curve (AUC) of 0.99 (cut-off: <9.97 HU; p < 0.05; sensitivity 91.18 %; specificity 100.00 %; accuracy 0.93) for the prediction of future infarctions. The AUC for the prediction of haemorrhagic infarctions was 0.78 (cut-off >17.13 HU; p < 0.05; sensitivity 90.00 %; specificity 62.86 %; accuracy 0.69). The VNC series allowed prediction of infarction volume. Future infarction development after IAR can be reliably predicted with the IM series. The prediction of haemorrhages and of infarction size is less reliable. (orig.)
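
    The sketch below reproduces the shape of the ROC analysis described above: per-lesion iodine-map densities, simulated around the reported group means, are scored against later infarction and a cut-off is read off at the Youden point. The simulated spread and case counts are assumptions.

    # ROC analysis sketch for iodine-map densities vs. later infarction.
    import numpy as np
    from sklearn.metrics import roc_curve, auc

    rng = np.random.default_rng(6)
    infarct = np.concatenate([np.ones(34), np.zeros(11)])
    density = np.concatenate([rng.normal(23.4, 10.0, 34),    # future infarction
                              rng.normal(5.8, 2.8, 11)])     # no infarction

    fpr, tpr, thresholds = roc_curve(infarct, density)
    print("AUC:", round(auc(fpr, tpr), 2))
    best = np.argmax(tpr - fpr)                   # Youden index
    print("suggested cut-off (HU):", round(thresholds[best], 2))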

  6. Prediction of infarction development after endovascular stroke therapy with dual-energy computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Djurdjevic, Tanja; Gizewski, Elke Ruth; Grams, Astrid Ellen [Medical University of Innsbruck, Department of Neuroradiology, Innsbruck (Austria); Rehwald, Rafael; Glodny, Bernhard [Medical University of Innsbruck, Department of Radiology, Innsbruck (Austria); Knoflach, Michael; Matosevic, Benjamin; Kiechl, Stefan [Medical University of Innsbruck, Department of Neurology, Innsbruck (Austria)

    2017-03-15

    After intraarterial recanalisation (IAR), the haemorrhage and the blood-brain barrier (BBB) disruption can be distinguished using dual-energy computed tomography (DECT). The aim of the present study was to investigate whether future infarction development can be predicted from DECT. DECT scans of 20 patients showing 45 BBB disrupted areas after IAR were assessed and compared with follow-up examinations. Receiver operator characteristic (ROC) analyses using densities from the iodine map (IM) and virtual non-contrast (VNC) were performed. Future infarction areas are denser than future non-infarction areas on IM series (23.44 ± 24.86 vs. 5.77 ± 2.77; p < 0.0001) and more hypodense on VNC series (29.71 ± 3.33 vs. 35.33 ± 3.50; p < 0.0001). ROC analyses for the IM series showed an area under the curve (AUC) of 0.99 (cut-off: <9.97 HU; p < 0.05; sensitivity 91.18 %; specificity 100.00 %; accuracy 0.93) for the prediction of future infarctions. The AUC for the prediction of haemorrhagic infarctions was 0.78 (cut-off >17.13 HU; p < 0.05; sensitivity 90.00 %; specificity 62.86 %; accuracy 0.69). The VNC series allowed prediction of infarction volume. Future infarction development after IAR can be reliably predicted with the IM series. The prediction of haemorrhages and of infarction size is less reliable. (orig.)

  7. Dysplastic vs. Common Naevus-associated vs. De novo Melanomas: An Observational Retrospective Study of 1,021 Patients

    Directory of Open Access Journals (Sweden)

    Alejandro Martin-Gorgojo

    2018-03-01

    Full Text Available The aim of this case-case study was to determine the differences between dysplastic and common naevus-associated melanomas (NAM) and de novo melanomas. A total of 1,021 prospectively collected patients with invasive cutaneous melanoma from an oncology referral centre were included in the study. Of these, 75.51% had de novo melanomas, 12.93% dysplastic NAM, and 11.56% common NAM. Dysplastic NAM, compared with de novo melanomas, were associated with intermittently photo-exposed sites, atypical melanocytic naevi, decreased tumour thickness, and presence of MC1R non-synonymous variants. Common NAM were more frequent on the trunk and of superficial spreading type. Comparison of dysplastic with common NAM showed significant difference only with regard to mitoses. Both subtypes of NAM shared less aggressive traits than de novo melanomas, albeit with no significant differences in survival after multivariate adjustment. In conclusion, NAM present with less aggressive traits, mostly due to a greater awareness among patients of changing moles than due to their intrinsic biological characteristics.

  8. Application of large computers for predicting the oil field production

    Energy Technology Data Exchange (ETDEWEB)

    Philipp, W; Gunkel, W; Marsal, D

    1971-10-01

    The flank injection drive plays a dominant role in the exploitation of the BEB-oil fields. Therefore, 2-phase flow computer models were built up, adapted to a predominance of a single flow direction and combining a high accuracy of prediction with a low job time. Any case study starts with the partitioning of the reservoir into blocks. Then the statistics of the time-independent reservoir properties are analyzed by means of an IBM 360/25 unit. Using these results and the past production of oil, water and gas, a Fortran-program running on a CDC-3300 computer yields oil recoveries and the ratios of the relative permeabilities as a function of the local oil saturation for all blocks penetrated by mobile water. In order to assign k_w/k_o functions to blocks not yet reached by the advancing water-front, correlation analysis is used to relate reservoir properties to k_w/k_o functions. All these results are used as input into a CDC-660 Fortran program, allowing short-, medium-, and long-term forecasts as well as the handling of special problems.

  9. Computational prediction and molecular confirmation of Helitron transposons in the maize genome

    Directory of Open Access Journals (Sweden)

    He Limei

    2008-01-01

    Full Text Available Abstract Background Helitrons represent a new class of transposable elements recently uncovered in plants and animals. One remarkable feature of Helitrons is their ability to capture gene sequences, which makes them of considerable potential evolutionary importance. However, because Helitrons lack the typical structural features of other DNA transposable elements, identifying them is a challenge. Currently, most researchers identify Helitrons manually by comparing sequences. With the maize whole genome sequencing project underway, an automated computational Helitron searching tool is needed. The characterization of Helitron activities in maize needs to be addressed in order to better understand the impact of Helitrons on the organization of the genome. Results We developed and implemented a heuristic searching algorithm in PERL for identifying Helitrons. Our HelitronFinder program will (i) take FASTA-formatted DNA sequences as input and identify the hairpin looping patterns, and (ii) exploit the consensus 5' and 3' end sequences of known Helitrons to identify putative ends. We randomly selected five predicted Helitrons from the program's high quality output for molecular verification. Four out of the five predicted Helitrons were confirmed by PCR assays and DNA sequencing in different maize inbred lines. The HelitronFinder program identified two head-to-head dissimilar Helitrons in a maize BAC sequence. Conclusion We have identified 140 new Helitron candidates in maize with our computational tool HelitronFinder by searching maize DNA sequences currently available in GenBank. Four out of five candidates were confirmed to be real by empirical methods, thus validating the predictions of HelitronFinder. Additional points to emerge from our study are that Helitrons do not always insert at an AT dinucleotide in the host sequences, that they can insert immediately adjacent to an existing Helitron, and that their movement may cause changes in the flanking
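
    The end-motif scan described in step (ii) can be illustrated with a short script. The sketch below is only a schematic stand-in for the published PERL tool: the consensus motifs, the input file name and the length cut-off are invented placeholders, not the real Helitron consensus sequences.

      # Minimal sketch of a Helitron end-motif scan (not the published HelitronFinder code).
      # The motifs and the input file name below are placeholders for illustration only.
      import re

      FIVE_PRIME = re.compile("TC")          # placeholder 5' consensus
      THREE_PRIME = re.compile("CTAG[AG]")   # placeholder 3' consensus

      def read_fasta(path):
          """Yield (header, sequence) records from a FASTA file."""
          header, chunks = None, []
          with open(path) as handle:
              for line in handle:
                  line = line.strip()
                  if line.startswith(">"):
                      if header is not None:
                          yield header, "".join(chunks)
                      header, chunks = line[1:], []
                  elif line:
                      chunks.append(line.upper())
          if header is not None:
              yield header, "".join(chunks)

      def candidate_helitrons(seq, max_len=15000):
          """Pair every 5' motif hit with downstream 3' motif hits within max_len."""
          starts = [m.start() for m in FIVE_PRIME.finditer(seq)]
          ends = [m.end() for m in THREE_PRIME.finditer(seq)]
          for s in starts:
              for e in ends:
                  if s < e <= s + max_len:
                      yield s, e

      if __name__ == "__main__":
          for name, seq in read_fasta("maize_bac.fasta"):   # hypothetical input file
              for start, end in candidate_helitrons(seq):
                  print(f"{name}\t{start}\t{end}")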

  10. Computer-aided global breast MR image feature analysis for prediction of tumor response to chemotherapy: performance assessment

    Science.gov (United States)

    Aghaei, Faranak; Tan, Maxine; Hollingsworth, Alan B.; Zheng, Bin; Cheng, Samuel

    2016-03-01

    Dynamic contrast-enhanced breast magnetic resonance imaging (DCE-MRI) has been used increasingly in breast cancer diagnosis and assessment of cancer treatment efficacy. In this study, we applied a computer-aided detection (CAD) scheme to automatically segment breast regions depicting on MR images and used the kinetic image features computed from the global breast MR images acquired before neoadjuvant chemotherapy to build a new quantitative model to predict response of the breast cancer patients to the chemotherapy. To assess performance and robustness of this new prediction model, an image dataset involving breast MR images acquired from 151 cancer patients before undergoing neoadjuvant chemotherapy was retrospectively assembled and used. Among them, 63 patients had "complete response" (CR) to chemotherapy in which the enhanced contrast levels inside the tumor volume (pre-treatment) was reduced to the level as the normal enhanced background parenchymal tissues (post-treatment), while 88 patients had "partially response" (PR) in which the high contrast enhancement remain in the tumor regions after treatment. We performed the studies to analyze the correlation among the 22 global kinetic image features and then select a set of 4 optimal features. Applying an artificial neural network trained with the fusion of these 4 kinetic image features, the prediction model yielded an area under ROC curve (AUC) of 0.83+/-0.04. This study demonstrated that by avoiding tumor segmentation, which is often difficult and unreliable, fusion of kinetic image features computed from global breast MR images without tumor segmentation can also generate a useful clinical marker in predicting efficacy of chemotherapy.
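    The prediction step can be illustrated with a scikit-learn stand-in for the authors' artificial neural network. The feature matrix and response labels below are randomly generated placeholders, not the study data; only the shape (four kinetic features per patient, a binary response label) follows the abstract.

      # Sketch: small neural network on 4 kinetic image features, scored by cross-validated AUC.
      import numpy as np
      from sklearn.neural_network import MLPClassifier
      from sklearn.model_selection import cross_val_predict, StratifiedKFold
      from sklearn.preprocessing import StandardScaler
      from sklearn.pipeline import make_pipeline
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(0)
      X = rng.normal(size=(151, 4))        # placeholder: 4 kinetic features per patient
      y = rng.integers(0, 2, size=151)     # placeholder: 1 = complete response, 0 = partial

      model = make_pipeline(
          StandardScaler(),
          MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0),
      )
      cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
      scores = cross_val_predict(model, X, y, cv=cv, method="predict_proba")[:, 1]
      print("cross-validated AUC:", round(roc_auc_score(y, scores), 3))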

  11. THE INTEGRATED USE OF COMPUTATIONAL CHEMISTRY, SCANNING PROBE MICROSCOPY, AND VIRTUAL REALITY TO PREDICT THE CHEMICAL REACTIVITY OF ENVIRONMENTAL SURFACES

    Science.gov (United States)

    In the last decade three new techniques, scanning probe microscopy (SPM), virtual reality (VR) and computational chemistry, have emerged with the combined capability of a priori predicting the chemical reactivity of environmental surfaces. Computational chemistry provides the cap...

  12. The NOVO Network: the original scientific basis for its establishment and our R&D vision

    OpenAIRE

    Winkel, Jørgen; Edwards, Kasper; Dellve, L.; Schiller, B.; Westgaard, Rolf H.

    2017-01-01

    The NOVO network is a Nordic non-governmental professional association whose aims are to foster the scientific progress, knowledge and development of the working environment within Healthcare as an integrated part of production system development. The vision is a “Nordic Model for Sustainable Systems” in the healthcare sector. It was founded in 2006 in Copenhagen and was financially supported by the Nordic Council of Ministers from 2007 to 2015. The motivation to establish the NOVO Network ar...

  13. Targeted intervention: Computational approaches to elucidate and predict relapse in alcoholism.

    Science.gov (United States)

    Heinz, Andreas; Deserno, Lorenz; Zimmermann, Ulrich S; Smolka, Michael N; Beck, Anne; Schlagenhauf, Florian

    2017-05-01

    Alcohol use disorder (AUD) and addiction in general is characterized by failures of choice resulting in repeated drug intake despite severe negative consequences. Behavioral change is hard to accomplish and relapse after detoxification is common and can be promoted by consumption of small amounts of alcohol as well as exposure to alcohol-associated cues or stress. While those environmental factors contributing to relapse have long been identified, the underlying psychological and neurobiological mechanism on which those factors act are to date incompletely understood. Based on the reinforcing effects of drugs of abuse, animal experiments showed that drug, cue and stress exposure affect Pavlovian and instrumental learning processes, which can increase salience of drug cues and promote habitual drug intake. In humans, computational approaches can help to quantify changes in key learning mechanisms during the development and maintenance of alcohol dependence, e.g. by using sequential decision making in combination with computational modeling to elucidate individual differences in model-free versus more complex, model-based learning strategies and their neurobiological correlates such as prediction error signaling in fronto-striatal circuits. Computational models can also help to explain how alcohol-associated cues trigger relapse: mechanisms such as Pavlovian-to-Instrumental Transfer can quantify to which degree Pavlovian conditioned stimuli can facilitate approach behavior including alcohol seeking and intake. By using generative models of behavioral and neural data, computational approaches can help to quantify individual differences in psychophysiological mechanisms that underlie the development and maintenance of AUD and thus promote targeted intervention. Copyright © 2016 Elsevier Inc. All rights reserved.
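    The kind of prediction-error signal referred to here can be written down in a few lines. The sketch below implements a generic Rescorla-Wagner (model-free) update and is purely illustrative; it is not any specific model fitted in the cited work, and the trial sequence is invented.

      # Minimal sketch of a model-free prediction-error (Rescorla-Wagner style) update,
      # the kind of learning rule fitted in computational studies of addiction.
      def rescorla_wagner(rewards, alpha=0.1, v0=0.0):
          """Return the value estimate and prediction error on every trial."""
          v, values, errors = v0, [], []
          for r in rewards:
              delta = r - v          # prediction error: outcome minus expectation
              v = v + alpha * delta  # update expectation toward the outcome
              values.append(v)
              errors.append(delta)
          return values, errors

      # Example: a cue is rewarded on most trials, then reward is withheld (extinction).
      trial_rewards = [1, 1, 1, 0, 1, 1, 0, 0, 0, 0]
      v, pe = rescorla_wagner(trial_rewards, alpha=0.2)
      for t, (vt, pet) in enumerate(zip(v, pe), 1):
          print(f"trial {t:2d}  value={vt:.3f}  prediction_error={pet:+.3f}")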

  14. Predicting effects of noncoding variants with deep learning-based sequence model.

    Science.gov (United States)

    Zhou, Jian; Troyanskaya, Olga G

    2015-10-01

    Identifying functional effects of noncoding variants is a major challenge in human genetics. To predict the noncoding-variant effects de novo from sequence, we developed a deep learning-based algorithmic framework, DeepSEA (http://deepsea.princeton.edu/), that directly learns a regulatory sequence code from large-scale chromatin-profiling data, enabling prediction of chromatin effects of sequence alterations with single-nucleotide sensitivity. We further used this capability to improve prioritization of functional variants including expression quantitative trait loci (eQTLs) and disease-associated variants.
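    A common way to apply such a sequence model is in silico mutagenesis: predict chromatin features for the reference and the altered sequence and take the difference. The sketch below assumes a hypothetical chromatin_model function as a stand-in for the trained network; it is not the DeepSEA interface, and the sequence and variant are toy values.

      # Sketch of in silico mutagenesis scoring around a variant: compare model outputs
      # for the reference and alternative alleles. chromatin_model is a placeholder.
      import numpy as np

      def one_hot(seq):
          """Encode an ACGT string as a (len, 4) one-hot matrix."""
          mapping = {"A": 0, "C": 1, "G": 2, "T": 3}
          mat = np.zeros((len(seq), 4))
          for i, base in enumerate(seq.upper()):
              if base in mapping:
                  mat[i, mapping[base]] = 1.0
          return mat

      def chromatin_model(x):
          """Placeholder for a trained sequence model returning chromatin-feature probabilities."""
          return np.full(919, 0.5)   # DeepSEA predicts 919 chromatin features

      def variant_effect(context, pos, alt):
          """Score a single-nucleotide change as the shift in predicted chromatin features."""
          ref_pred = chromatin_model(one_hot(context))
          alt_seq = context[:pos] + alt + context[pos + 1:]
          alt_pred = chromatin_model(one_hot(alt_seq))
          return alt_pred - ref_pred

      effect = variant_effect("ACGT" * 250, pos=500, alt="A")
      print("largest absolute effect across features:", float(np.max(np.abs(effect))))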

  15. Predicting Motivation: Computational Models of PFC Can Explain Neural Coding of Motivation and Effort-based Decision-making in Health and Disease.

    Science.gov (United States)

    Vassena, Eliana; Deraeve, James; Alexander, William H

    2017-10-01

    Human behavior is strongly driven by the pursuit of rewards. In daily life, however, benefits mostly come at a cost, often requiring that effort be exerted to obtain potential benefits. Medial PFC (MPFC) and dorsolateral PFC (DLPFC) are frequently implicated in the expectation of effortful control, showing increased activity as a function of predicted task difficulty. Such activity partially overlaps with expectation of reward and has been observed both during decision-making and during task preparation. Recently, novel computational frameworks have been developed to explain activity in these regions during cognitive control, based on the principle of prediction and prediction error (predicted response-outcome [PRO] model [Alexander, W. H., & Brown, J. W. Medial prefrontal cortex as an action-outcome predictor. Nature Neuroscience, 14, 1338-1344, 2011], hierarchical error representation [HER] model [Alexander, W. H., & Brown, J. W. Hierarchical error representation: A computational model of anterior cingulate and dorsolateral prefrontal cortex. Neural Computation, 27, 2354-2410, 2015]). Despite the broad explanatory power of these models, it is not clear whether they can also accommodate effects related to the expectation of effort observed in MPFC and DLPFC. Here, we propose a translation of these computational frameworks to the domain of effort-based behavior. First, we discuss how the PRO model, based on prediction error, can explain effort-related activity in MPFC, by reframing effort-based behavior in a predictive context. We propose that MPFC activity reflects monitoring of motivationally relevant variables (such as effort and reward), by coding expectations and discrepancies from such expectations. Moreover, we derive behavioral and neural model-based predictions for healthy controls and clinical populations with impairments of motivation. Second, we illustrate the possible translation to effort-based behavior of the HER model, an extended version of PRO

  16. Characterization and analysis of a de novo transcriptome from the pygmy grasshopper Tetrix japonica.

    Science.gov (United States)

    Qiu, Zhongying; Liu, Fei; Lu, Huimeng; Huang, Yuan

    2017-05-01

    The pygmy grasshopper Tetrix japonica is a common insect distributed throughout the world, and it has the potential for use in studies of body colour polymorphism, genomics and the biology of Tetrigoidea (Insecta: Orthoptera). However, limited biological information is available for this insect. Here, we conducted a de novo transcriptome study of adult and larval T. japonica to provide a better understanding of its gene expression and develop genomic resources for future work. We sequenced and explored the characteristics of the de novo transcriptome of T. japonica using Illumina HiSeq 2000 platform. A total of 107 608 206 paired-end clean reads were assembled into 61 141 unigenes using the trinity software; the mean unigene size was 771 bp, and the N50 length was 1238 bp. A total of 29 225 unigenes were functionally annotated to the NCBI nonredundant protein sequences (Nr), NCBI nonredundant nucleotide sequences (Nt), a manually annotated and reviewed protein sequence database (Swiss-Prot), Gene Ontology (GO) and Kyoto Encyclopedia of Genes and Genomes (KEGG) databases. A large number of putative genes that are potentially involved in pigment pathways, juvenile hormone (JH) metabolism and signalling pathways were identified in the T. japonica transcriptome. Additionally, 165 769 and 156 796 putative single nucleotide polymorphisms occurred in the adult and larvae transcriptomes, respectively, and a total of 3162 simple sequence repeats were detected in this assembly. This comprehensive transcriptomic data for T. japonica will provide a usable resource for gene predictions, signalling pathway investigations and molecular marker development for this species and other pygmy grasshoppers. © 2016 John Wiley & Sons Ltd.
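    The N50 value quoted above is a simple summary statistic of the assembly. A minimal computation from a list of unigene lengths is sketched below with toy numbers; the real lengths would come from the Trinity output.

      # Sketch: compute the N50 statistic for a de novo assembly from contig/unigene lengths.
      def n50(lengths):
          """Length L such that contigs of length >= L cover at least half of the total bases."""
          total = sum(lengths)
          running = 0
          for length in sorted(lengths, reverse=True):
              running += length
              if running * 2 >= total:
                  return length
          return 0

      unigene_lengths = [250, 480, 771, 900, 1238, 2100, 3500]   # toy example, not the real data
      print("mean size:", sum(unigene_lengths) / len(unigene_lengths))
      print("N50:", n50(unigene_lengths))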

  17. Prediction of intramuscular fat levels in Texel lamb loins using X-ray computed tomography scanning.

    Science.gov (United States)

    Clelland, N; Bunger, L; McLean, K A; Conington, J; Maltin, C; Knott, S; Lambe, N R

    2014-10-01

    For the consumer, tenderness, juiciness and flavour are often described as the most important factors for meat eating quality, all of which have a close association with intramuscular fat (IMF). X-ray computed tomography (CT) can measure fat, muscle and bone volumes and weights, in vivo in sheep and CT predictions of carcass composition have been used in UK sheep breeding programmes over the last few decades. This study aimed to determine the most accurate combination of CT variables to predict IMF percentage of M. longissimus lumborum in Texel lambs. As expected, predicted carcass fat alone accounted for a moderate amount of the variation (R² = 0.51) in IMF. Prediction accuracies were significantly improved (adjusted R² > 0.65) using information on fat and muscle densities measured from three CT reference scans, showing that CT can provide an accurate prediction of IMF in the loin of purebred Texel sheep. Copyright © 2014. Published by Elsevier Ltd.

  18. Prognostic factors in de novo myelodysplastic syndrome in young and middle-aged people

    Directory of Open Access Journals (Sweden)

    Наталья Николаевна Климкович

    2015-01-01

    Full Text Available We performed a multivariate analysis of clinical and laboratory parameters for prognosis in patients with de novo myelodysplastic syndromes (MDS) aged 18-60 years. The results of the clinical application of prognostic systems in MDS show that there is large variability within individual risk groups, especially in low-risk MDS. Hematologists are therefore conducting research aimed at identifying additional adverse risk factors in MDS, so that patients with low-risk MDS variants but an unfavorable prognosis could benefit from early therapeutic intervention rather than only being monitored by a clinician until disease progression. We found that additional adverse risk factors in MDS are CD95 expression in bone marrow ≤40 % and FLT3 expression ≥60 %. An expression level of CD95 ≤40 % and of FLT3 ≥60 % in bone marrow cells can be considered a prognostic marker of MDS progression and of the time to start specific therapy.

  19. A cyber-linked undergraduate research experience in computational biomolecular structure prediction and design.

    Science.gov (United States)

    Alford, Rebecca F; Leaver-Fay, Andrew; Gonzales, Lynda; Dolan, Erin L; Gray, Jeffrey J

    2017-12-01

    Computational biology is an interdisciplinary field, and many computational biology research projects involve distributed teams of scientists. To accomplish their work, these teams must overcome both disciplinary and geographic barriers. Introducing new training paradigms is one way to facilitate research progress in computational biology. Here, we describe a new undergraduate program in biomolecular structure prediction and design in which students conduct research at labs located at geographically-distributed institutions while remaining connected through an online community. This 10-week summer program begins with one week of training on computational biology methods development, transitions to eight weeks of research, and culminates in one week at the Rosetta annual conference. To date, two cohorts of students have participated, tackling research topics including vaccine design, enzyme design, protein-based materials, glycoprotein modeling, crowd-sourced science, RNA processing, hydrogen bond networks, and amyloid formation. Students in the program report outcomes comparable to students who participate in similar in-person programs. These outcomes include the development of a sense of community and increases in their scientific self-efficacy, scientific identity, and science values, all predictors of continuing in a science research career. Furthermore, the program attracted students from diverse backgrounds, which demonstrates the potential of this approach to broaden the participation of young scientists from backgrounds traditionally underrepresented in computational biology.

  20. A cyber-linked undergraduate research experience in computational biomolecular structure prediction and design.

    Directory of Open Access Journals (Sweden)

    Rebecca F Alford

    2017-12-01

    Full Text Available Computational biology is an interdisciplinary field, and many computational biology research projects involve distributed teams of scientists. To accomplish their work, these teams must overcome both disciplinary and geographic barriers. Introducing new training paradigms is one way to facilitate research progress in computational biology. Here, we describe a new undergraduate program in biomolecular structure prediction and design in which students conduct research at labs located at geographically-distributed institutions while remaining connected through an online community. This 10-week summer program begins with one week of training on computational biology methods development, transitions to eight weeks of research, and culminates in one week at the Rosetta annual conference. To date, two cohorts of students have participated, tackling research topics including vaccine design, enzyme design, protein-based materials, glycoprotein modeling, crowd-sourced science, RNA processing, hydrogen bond networks, and amyloid formation. Students in the program report outcomes comparable to students who participate in similar in-person programs. These outcomes include the development of a sense of community and increases in their scientific self-efficacy, scientific identity, and science values, all predictors of continuing in a science research career. Furthermore, the program attracted students from diverse backgrounds, which demonstrates the potential of this approach to broaden the participation of young scientists from backgrounds traditionally underrepresented in computational biology.

  1. Spontaneous de novo vaginal adenosis resembling Bartholin’s ...

    African Journals Online (AJOL)

    Adebayo Alade Adewole

    Spontaneous de novo vaginal adenosis resembling Bartholin's cyst: A case report ... 6 by 5 cm. The cervix, uterus, adnexa and Pouch of Douglas (POD) were normal. .... of vaginal cancer.2–4 Although, DES exposed daughters have an.

  2. COMPUTING THERAPY FOR PRECISION MEDICINE: COLLABORATIVE FILTERING INTEGRATES AND PREDICTS MULTI-ENTITY INTERACTIONS.

    Science.gov (United States)

    Regenbogen, Sam; Wilkins, Angela D; Lichtarge, Olivier

    2016-01-01

    Biomedicine produces copious information it cannot fully exploit. Specifically, there is considerable need to integrate knowledge from disparate studies to discover connections across domains. Here, we used a Collaborative Filtering approach, inspired by online recommendation algorithms, in which non-negative matrix factorization (NMF) predicts interactions among chemicals, genes, and diseases only from pairwise information about their interactions. Our approach, applied to matrices derived from the Comparative Toxicogenomics Database, successfully recovered Chemical-Disease, Chemical-Gene, and Disease-Gene networks in 10-fold cross-validation experiments. Additionally, we could predict each of these interaction matrices from the other two. Integrating all three CTD interaction matrices with NMF led to good predictions of STRING, an independent, external network of protein-protein interactions. Finally, this approach could integrate the CTD and STRING interaction data to improve Chemical-Gene cross-validation performance significantly, and, in a time-stamped study, it predicted information added to CTD after a given date, using only data prior to that date. We conclude that collaborative filtering can integrate information across multiple types of biological entities, and that as a first step towards precision medicine it can compute drug repurposing hypotheses.
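    The core idea, factorizing a sparse interaction matrix and scoring held-out pairs from the reconstruction, can be sketched with scikit-learn's NMF on a toy chemical-gene matrix. The matrix size, sparsity and rank below are invented placeholders, not the CTD-derived matrices used in the study.

      # Minimal sketch of collaborative filtering by non-negative matrix factorization on a
      # toy interaction matrix (rows = chemicals, columns = genes); not the authors' pipeline.
      import numpy as np
      from sklearn.decomposition import NMF

      rng = np.random.default_rng(0)
      interactions = (rng.random((30, 40)) < 0.15).astype(float)   # toy 0/1 interaction matrix

      # Hide a few known interactions to mimic cross-validation.
      known = np.argwhere(interactions == 1)
      held_out = known[rng.choice(len(known), size=5, replace=False)]
      train = interactions.copy()
      train[held_out[:, 0], held_out[:, 1]] = 0.0

      model = NMF(n_components=5, init="nndsvda", max_iter=500, random_state=0)
      W = model.fit_transform(train)
      H = model.components_
      reconstruction = W @ H

      for i, j in held_out:
          print(f"held-out chemical {i} / gene {j}: predicted score {reconstruction[i, j]:.3f}")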

  3. Prediction of 5-year overall survival in cervical cancer patients treated with radical hysterectomy using computational intelligence methods.

    Science.gov (United States)

    Obrzut, Bogdan; Kusy, Maciej; Semczuk, Andrzej; Obrzut, Marzanna; Kluska, Jacek

    2017-12-12

    Computational intelligence methods, including non-linear classification algorithms, can be used in medical research and practice as a decision making tool. This study aimed to evaluate the usefulness of artificial intelligence models for 5-year overall survival prediction in patients with cervical cancer treated by radical hysterectomy. The data set was collected from 102 patients with cervical cancer FIGO stage IA2-IIB, that underwent primary surgical treatment. Twenty-three demographic, tumor-related parameters and selected perioperative data of each patient were collected. The simulations involved six computational intelligence methods: the probabilistic neural network (PNN), multilayer perceptron network, gene expression programming classifier, support vector machines algorithm, radial basis function neural network and k-Means algorithm. The prediction ability of the models was determined based on the accuracy, sensitivity, specificity, as well as the area under the receiver operating characteristic curve. The results of the computational intelligence methods were compared with the results of linear regression analysis as a reference model. The best results were obtained by the PNN model. This neural network provided very high prediction ability with an accuracy of 0.892 and sensitivity of 0.975. The area under the receiver operating characteristics curve of PNN was also high, 0.818. The outcomes obtained by other classifiers were markedly worse. The PNN model is an effective tool for predicting 5-year overall survival in cervical cancer patients treated with radical hysterectomy.
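    Scikit-learn has no probabilistic neural network, so the sketch below uses logistic regression purely as a stand-in to show how accuracy, sensitivity, specificity and AUC would be computed for such a survival classifier; the 23-parameter patient matrix and labels are random placeholders, not the study data.

      # Sketch of the evaluation loop: a stand-in classifier scored by accuracy, sensitivity,
      # specificity and ROC AUC for 5-year overall survival prediction.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import confusion_matrix, roc_auc_score

      rng = np.random.default_rng(1)
      X = rng.normal(size=(102, 23))       # placeholder: 23 clinical parameters per patient
      y = rng.integers(0, 2, size=102)     # placeholder: 1 = survived 5 years

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=1)
      clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
      prob = clf.predict_proba(X_te)[:, 1]
      pred = (prob >= 0.5).astype(int)

      tn, fp, fn, tp = confusion_matrix(y_te, pred, labels=[0, 1]).ravel()
      print("accuracy:   ", (tp + tn) / (tp + tn + fp + fn))
      print("sensitivity:", tp / (tp + fn))
      print("specificity:", tn / (tn + fp))
      print("AUC:        ", roc_auc_score(y_te, prob))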

  4. Towards early software reliability prediction for computer forensic tools (case study).

    Science.gov (United States)

    Abu Talib, Manar

    2016-01-01

    Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component based system. It is used, for instance, to analyze the reliability of the state machines of real time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
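    The underlying calculation, an absorbing discrete-time Markov chain whose transient states are the tool's components, can be sketched as follows. The component names, usage profile and reliabilities are illustrative assumptions, not values from the case study, and COSMIC-FFP sizing is omitted.

      # Sketch of architecture-based reliability: components are transient states of a
      # discrete-time Markov chain; each visit succeeds with the component's reliability,
      # otherwise the chain is absorbed in a failure state. Numbers are illustrative only.
      import numpy as np

      components = ["acquire", "parse", "analyse", "report"]
      reliability = np.array([0.999, 0.995, 0.98, 0.99])   # per-component reliabilities
      # Control-flow transition probabilities between components.
      usage = np.array([
          [0.0, 1.0, 0.0, 0.0],
          [0.0, 0.0, 1.0, 0.0],
          [0.0, 0.2, 0.0, 0.8],   # analyse sometimes loops back to parse
          [0.0, 0.0, 0.0, 0.0],   # report is the exit component
      ])

      # Transient-to-transient matrix scaled by reliabilities (failures absorb the rest).
      Q = np.diag(reliability) @ usage
      N = np.linalg.inv(np.eye(len(components)) - Q)       # fundamental matrix
      exit_success = np.zeros(len(components))
      exit_success[-1] = reliability[-1]                   # "report" finishes correctly with its reliability
      system_reliability = (N @ exit_success)[0]           # absorption in success, starting from "acquire"
      print("estimated tool reliability:", round(system_reliability, 4))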

  5. Immobilization of cadmium in soils by UV-mutated Bacillus subtilis 38 bioaugmentation and NovoGro amendment

    International Nuclear Information System (INIS)

    Jiang Chunxiao; Sun Hongwen; Sun Tieheng; Zhang Qingmin; Zhang Yanfeng

    2009-01-01

    Immobilization of cadmium (10 mg Cd per kilogram soil) in soil by bioaugmentation with a UV-mutated microorganism, Bacillus subtilis 38, accompanied by amendment with a bio-fertilizer, NovoGro, was investigated using DTPA-extractable cadmium (E-Cd). B. subtilis 38, the mutant with the strongest resistance against Cd, could bioaccumulate four times more Cd than the original wild type. Single bioaugmentation of B. subtilis 38 (SB treatment) to soil however did not reduce E-Cd significantly, while the amendment of NovoGro (SN treatment) reduced E-Cd remarkably. Simultaneous application of B. subtilis 38 and NovoGro (SNB treatment) exhibited a synergistic effect compared to the single SB and SN treatments. The immobilization effect was significantly affected by temperature, soil moisture, and pH. The immobilization of Cd appeared to reach its maximum when environmental conditions favored the activity of the microorganisms. Under the optimum conditions, after 90 days of incubation, E-Cd was 3.34, 3.39, 2.25 and 0.87 mg kg⁻¹ in the control soil, SB, SN and SNB soils, respectively. NovoGro not only showed a great capacity for Cd adsorption, but also promoted the growth of B. subtilis 38. This study provides a potential cost-effective technique for in situ remediation of Cd-contaminated soils with bioaugmentation.

  6. De novo nonsense mutations in ASXL1 cause Bohring-Opitz syndrome

    NARCIS (Netherlands)

    Hoischen, Alexander; van Bon, Bregje W. M.; Rodríguez-Santiago, Benjamín; Gilissen, Christian; Vissers, Lisenka E. L. M.; de Vries, Petra; Janssen, Irene; van Lier, Bart; Hastings, Rob; Smithson, Sarah F.; Newbury-Ecob, Ruth; Kjaergaard, Susanne; Goodship, Judith; McGowan, Ruth; Bartholdi, Deborah; Rauch, Anita; Peippo, Maarit; Cobben, Jan M.; Wieczorek, Dagmar; Gillessen-Kaesbach, Gabriele; Veltman, Joris A.; Brunner, Han G.; de Vries, Bert B. B. A.

    2011-01-01

    Bohring-Opitz syndrome is characterized by severe intellectual disability, distinctive facial features and multiple congenital malformations. We sequenced the exomes of three individuals with Bohring-Opitz syndrome and in each identified heterozygous de novo nonsense mutations in ASXL1, which is

  7. Human native lipoprotein-induced de novo DNA methylation is associated with repression of inflammatory genes in THP-1 macrophages.

    Science.gov (United States)

    Rangel-Salazar, Rubén; Wickström-Lindholm, Marie; Aguilar-Salinas, Carlos A; Alvarado-Caudillo, Yolanda; Døssing, Kristina B V; Esteller, Manel; Labourier, Emmanuel; Lund, Gertrud; Nielsen, Finn C; Rodríguez-Ríos, Dalia; Solís-Martínez, Martha O; Wrobel, Katarzyna; Wrobel, Kazimierz; Zaina, Silvio

    2011-11-25

    We previously showed that a VLDL- and LDL-rich mix of human native lipoproteins induces a set of repressive epigenetic marks, i.e. de novo DNA methylation, histone 4 hypoacetylation and histone 4 lysine 20 (H4K20) hypermethylation in THP-1 macrophages. Here, we: 1) ask what gene expression changes accompany these epigenetic responses; 2) test the involvement of candidate factors mediating the latter. We exploited genome expression arrays to identify target genes for lipoprotein-induced silencing, in addition to RNAi and expression studies to test the involvement of candidate mediating factors. The study was conducted in human THP-1 macrophages. Native lipoprotein-induced de novo DNA methylation was associated with a general repression of various critical genes for macrophage function, including pro-inflammatory genes. Lipoproteins showed differential effects on epigenetic marks, as de novo DNA methylation was induced by VLDL and to a lesser extent by LDL, but not by HDL, and VLDL induced H4K20 hypermethylation, while HDL caused H4 deacetylation. The analysis of candidate factors mediating VLDL-induced DNA hypermethylation revealed that this response was: 1) surprisingly, mediated exclusively by the canonical maintenance DNA methyltransferase DNMT1, and 2) independent of the Dicer/micro-RNA pathway. Our work provides novel insights into epigenetic gene regulation by native lipoproteins. Furthermore, we provide an example of DNMT1 acting as a de novo DNA methyltransferase independently of canonical de novo enzymes, and show proof of principle that de novo DNA methylation can occur independently of a functional Dicer/micro-RNA pathway in mammals.

  8. An 11bp region with stem formation potential is essential for de novo DNA methylation of the RPS element.

    Directory of Open Access Journals (Sweden)

    Matthew Gentry

    Full Text Available The initiation of DNA methylation in Arabidopsis is controlled by the RNA-directed DNA methylation (RdDM) pathway that uses 24nt siRNAs to recruit the de novo methyltransferase DRM2 to the target site. We previously described the REPETITIVE PETUNIA SEQUENCE (RPS) fragment that acts as a hot spot for de novo methylation, for which it requires the cooperative activity of all three methyltransferases MET1, CMT3 and DRM2, but not the RdDM pathway. RPS contains two identical 11nt elements in inverted orientation, interrupted by an 18nt spacer, which resembles the features of a stem-loop structure. The analysis of deletion/substitution derivatives of this region showed that deletion of one 11nt element of RPS is sufficient to eliminate de novo methylation of RPS. In addition, deletion of a 10nt region directly adjacent to one of the 11nt elements significantly reduced de novo methylation. When both 11nt regions were replaced by two 11nt elements with altered DNA sequence but unchanged inverted repeat homology, DNA methylation was not affected, indicating that de novo methylation was not targeted to a specific DNA sequence element. These data suggest that de novo DNA methylation is attracted by a secondary structure to which the two 11nt elements contribute, and that the adjacent 10nt region influences the stability of this structure. This resembles the recognition of structural features by DNA methyltransferases in animals and suggests that similar mechanisms exist in plants.

  9. Permeability Surface of Deep Middle Cerebral Artery Territory on Computed Tomographic Perfusion Predicts Hemorrhagic Transformation After Stroke.

    Science.gov (United States)

    Li, Qiao; Gao, Xinyi; Yao, Zhenwei; Feng, Xiaoyuan; He, Huijin; Xue, Jing; Gao, Peiyi; Yang, Lumeng; Cheng, Xin; Chen, Weijian; Yang, Yunjun

    2017-09-01

    Permeability surface (PS) on computed tomographic perfusion reflects blood-brain barrier permeability and is related to hemorrhagic transformation (HT). HT of deep middle cerebral artery (MCA) territory can occur after recanalization of proximal large-vessel occlusion. We aimed to determine the relationship between HT and PS of deep MCA territory. We retrospectively reviewed 70 consecutive acute ischemic stroke patients presenting with occlusion of the distal internal carotid artery or M1 segment of the MCA. All patients underwent computed tomographic perfusion within 6 hours after symptom onset. Computed tomographic perfusion data were postprocessed to generate maps of different perfusion parameters. Risk factors were identified for increased deep MCA territory PS. Receiver operating characteristic curve analysis was performed to calculate the optimal PS threshold to predict HT of deep MCA territory. Increased PS was associated with HT of deep MCA territory. After adjustments for age, sex, onset time to computed tomographic perfusion, and baseline National Institutes of Health Stroke Scale, poor collateral status (odds ratio, 7.8; 95% confidence interval, 1.67-37.14; P =0.009) and proximal MCA-M1 occlusion (odds ratio, 4.12; 95% confidence interval, 1.03-16.52; P =0.045) were independently associated with increased deep MCA territory PS. Relative PS most accurately predicted HT of deep MCA territory (area under curve, 0.94; optimal threshold, 2.89). Increased PS can predict HT of deep MCA territory after recanalization therapy for cerebral proximal large-vessel occlusion. Proximal MCA-M1 complete occlusion and distal internal carotid artery occlusion in conjunction with poor collaterals elevate deep MCA territory PS. © 2017 American Heart Association, Inc.
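    One standard way to obtain such an optimal threshold is to maximise the Youden index along the ROC curve. The sketch below uses placeholder relative-PS values and labels, not the patient data, and the cited study may have used a different optimality criterion.

      # Sketch: derive an "optimal" cut-off from a ROC analysis by maximizing the Youden
      # index (sensitivity + specificity - 1). The relative-PS values below are synthetic.
      import numpy as np
      from sklearn.metrics import roc_curve, roc_auc_score

      rng = np.random.default_rng(2)
      ht = rng.integers(0, 2, size=70)                    # 1 = haemorrhagic transformation
      relative_ps = np.where(ht == 1, rng.normal(3.2, 0.8, 70), rng.normal(2.2, 0.6, 70))

      fpr, tpr, thresholds = roc_curve(ht, relative_ps)
      youden = tpr - fpr
      best = np.argmax(youden)
      print("AUC:", round(roc_auc_score(ht, relative_ps), 3))
      print("optimal threshold:", round(thresholds[best], 2),
            "sensitivity:", round(tpr[best], 2),
            "specificity:", round(1 - fpr[best], 2))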

  10. De novo complex intra chromosomal rearrangement after ICSI: characterisation by BACs micro array-CGH

    Directory of Open Access Journals (Sweden)

    Quimsiyeh Mazin

    2008-12-01

    Full Text Available Abstract Background In routine Assisted Reproductive Technology (ART), men with severe oligozoospermia or azoospermia should be informed about the risk of de novo congenital or chromosomal abnormalities in an ICSI program. The benefits of preimplantation or prenatal genetic diagnosis also need to be explained to the couple. Methods Following a routine ICSI attempt using ejaculated sperm from a male with severe oligozoospermia and a normal karyotype, a 30-year-old pregnant woman was referred for prenatal diagnosis in the 17th week of a bichorionic biamniotic twin gestation. Amniocentesis was performed because of an increased foetal nuchal translucency detected for one of the fetuses by sonographic examination during the 12th week of gestation (WG). Chromosome and DNA studies of the fetus were performed on cultured amniocytes. Results Conventional, molecular cytogenetic and microarray CGH experiments allowed us to conclude that the fetus had a de novo pericentromeric inversion associated with a duplication of the 9p22.1-p24 chromosomal region, 46,XY,inv dup(9)(p22.1p24) [arrCGH 9p22.1p24 (RP11-130C19 → RP11-87O1)x3]. Because it contains the critical 9p22 region, our case is consistent with the general phenotypic features of the partial trisomy 9p syndrome, with major growth retardation, microcephaly and microretrognathia. Conclusion This de novo complex chromosome rearrangement illustrates the possible risk of chromosome or gene defects in ICSI programs and the contribution of array-CGH for rapidly mapping de novo chromosomal imbalances.

  11. De novo and salvage pathway precursor incorporation during DNA replication at the nuclear matrix

    International Nuclear Information System (INIS)

    Panzeter, P.L.

    1988-01-01

    Total nuclear DNA can be empirically subdivided into low salt-soluble (LS) DNA (75-80%), high salt-soluble (HS) DNA (18-23%), and nuclear matrix-associated (NM) DNA which remains tightly bound to the nuclear matrix (∼2%). The most-newly replicated DNA is that associated with the nuclear matrix in regenerating rat liver. Analyses of the DNA fractions after various pulse times revealed that the salvage and de novo pathway DNA precursors investigated were incorporated preferentially into NM-DNA at early pulse times, after which the radioactivity became progressively incorporated into HS- and LS-DNA, respectively. These results support two models of nuclear matrix-associated DNA replication, proposed previously, and a third model presented in this dissertation. In addition, the incorporation of de novo pathway precursors lagged significantly (> 10 minutes) behind the incorporation of precursors entering through the salvage pathway. Channeling of salvage pathway precursors to DNA replication sites would explain the more rapid uptake of salvage precursors into NM-DNA than de novo precursors. To investigate the possibility of this heretofore in vitro phenomenon, the incorporation of the salvage precursor, (³H)deoxythymidine, and the de novo precursor, (¹⁴C)orotic acid, into NM-DNA and dTTP was examined in regenerating rat liver. There was no significant difference between the incorporation pattern of (¹⁴C)orotic acid into NM-DNA thymine and that of (¹⁴C)orotic acid into soluble dTTP. Contrastingly, the salvage pathway precursor, (³H)deoxythymidine, labeled NM-DNA before labeling the dTTP pool

  12. Computational modeling to predict mechanical function of joints: application to the lower leg with simulation of two cadaver studies.

    Science.gov (United States)

    Liacouras, Peter C; Wayne, Jennifer S

    2007-12-01

    Computational models of musculoskeletal joints and limbs can provide useful information about joint mechanics. Validated models can be used as predictive devices for understanding joint function and serve as clinical tools for predicting the outcome of surgical procedures. A new computational modeling approach was developed for simulating joint kinematics that are dictated by bone/joint anatomy, ligamentous constraints, and applied loading. Three-dimensional computational models of the lower leg were created to illustrate the application of this new approach. Model development began with generating three-dimensional surfaces of each bone from CT images and then importing them into the three-dimensional solid modeling software SOLIDWORKS and motion simulation package COSMOSMOTION. Through SOLIDWORKS and COSMOSMOTION, each bone surface file was filled to create a solid object and positioned, necessary components were added, and simulations were executed. Three-dimensional contacts were added to inhibit intersection of the bones during motion. Ligaments were represented as linear springs. Model predictions were then validated by comparison to two different cadaver studies, syndesmotic injury and repair and ankle inversion following ligament transection. The syndesmotic injury model was able to predict tibial rotation, fibular rotation, and anterior/posterior displacement. In the inversion simulation, calcaneofibular ligament extension and angles of inversion compared well. Some experimental data proved harder to simulate accurately, due to certain software limitations and lack of complete experimental data. Other parameters that could not be easily obtained experimentally can be predicted and analyzed by the computational simulations. In the syndesmotic injury study, the force generated in the tibionavicular and calcaneofibular ligaments reduced with the insertion of the staple, indicating how this repair technique changes joint function. After transection of the calcaneofibular

  13. Melhoramento do cafeeiro: XLII. Produtividade de progênies derivadas de hibridação dos cultivares Laurina e Mundo Novo Coffee breeding: XLII. Yield of progenies from crosses of Laurina and Mundo Novo cultivars of Coffea arabica L.

    Directory of Open Access Journals (Sweden)

    Alcides Carvalho

    1988-01-01

    Full Text Available The Laurina cultivar of Coffea arabica L. is characterized by small plant size, leaves of reduced dimensions, fruits tapered at the base, small pointed seeds, low processing out-turn and reduced production. It nevertheless gives a beverage of good quality and has a low caffeine content in the seeds. Its main characteristics are controlled by the action of a pair of recessive alleles lrlr with a pronounced pleiotropic effect. Owing to current commercial interest in low-caffeine products, research was started aimed mainly at increasing the productivity of 'Laurina'. To this end, numerous hybridizations of 'Laurina' plants with 'Mundo Novo' (Coffea arabica) were carried out, followed by backcrosses to 'Mundo Novo'. The F2 progenies and the backcrosses to 'Mundo Novo' (RC) were studied in an experiment in Campinas, with yields recorded for eight consecutive years. Some F2 progenies were separated into two groups before planting: normal (LrLr, Lrlr) and laurina (lrlr). Progenies of 'Mundo Novo' and 'Catuaí Amarelo' of C. arabica were used as controls. The set of F2 plants of the laurina group and the backcrosses had a higher mean yield than the normal F2 plants, but lower than the controls. Some backcrosses and F2 progenies contained plants with reasonable productivity, indicating that, through backcrosses with 'Mundo Novo', new commercial types with the morphological characteristics of 'Laurina' can be obtained. Considerations are presented on the better combining ability of 'Laurina' with some selections of 'Mundo Novo'.

  14. PredMP: A Web Resource for Computationally Predicted Membrane Proteins via Deep Learning

    KAUST Repository

    Wang, Sheng

    2018-02-06

    Experimental determination of membrane protein (MP) structures is challenging as they are often too large for nuclear magnetic resonance (NMR) experiments and difficult to crystallize. Currently there are only about 510 non-redundant MPs with solved structures in Protein Data Bank (PDB). To elucidate the MP structures computationally, we developed a novel web resource, denoted as PredMP (http://52.87.130.56:3001/#/proteinindex), that delivers one-dimensional (1D) annotation of the membrane topology and secondary structure, two-dimensional (2D) prediction of the contact/distance map, together with three-dimensional (3D) modeling of the MP structure in the lipid bilayer, for each MP target from a given model organism. The precision of the computationally constructed MP structures is leveraged by state-of-the-art deep learning methods as well as cutting-edge modeling strategies. In particular, (i) we annotate 1D property via DeepCNF (Deep Convolutional Neural Fields) that not only models complex sequence-structure relationship but also interdependency between adjacent property labels; (ii) we predict 2D contact/distance map through Deep Transfer Learning which learns the patterns as well as the complex relationship between contacts/distances and protein features from non-membrane proteins; and (iii) we model 3D structure by feeding its predicted contacts and secondary structure to the Crystallography & NMR System (CNS) suite combined with a membrane burial potential that is residue-specific and depth-dependent. PredMP currently contains more than 2,200 multi-pass transmembrane proteins (length<700 residues) from Human. These transmembrane proteins are classified according to IUPHAR/BPS Guide, which provides a hierarchical organization of receptors, channels, transporters, enzymes and other drug targets according to their molecular relationships and physiological functions. Among these MPs, we estimated that our approach could predict correct folds for 1

  15. Demonstration of de novo synthesis of enzymes by density labelling with stable isotopes

    International Nuclear Information System (INIS)

    Huebner, G.; Hirschberg, K.

    1977-01-01

    The technique of in vivo density labelling of proteins with H₂¹⁸O and ²H₂O has been used to investigate hormonal regulation and developmental expression of enzymes in plant cells. Buoyant density data obtained from isopycnic equilibrium centrifugation demonstrated that the cytokinine-induced nitrate reductase activity and the gibberellic acid-induced phosphatase activity in isolated embryos of Agrostemma githago are activities of enzymes synthesized de novo. The increase in alanine-specific aminopeptidase in germinating A. githago seeds is not due to de novo synthesis but to the release of preformed enzyme. On the basis of this result it is possible to apply the enzyme aminopeptidase as an internal density standard in equilibrium centrifugation. Density labelling experiments on proteins in pea cotyledons have been used to study the change in the activity of acid phosphatase, alanine-specific aminopeptidase, and peroxidase during germination. The activities of these enzymes increase in cotyledons of Pisum sativum. Density labelling by ¹⁸O and ²H demonstrates de novo synthesis of these three enzymes. The differential time course of enzyme induction shows the advantage of using H₂¹⁸O as labelling substance in cases when the enzyme was synthesized immediately at the beginning of germination. At this stage of development the amino-acid pool available for synthesis is formed principally by means of hydrolysis of storage proteins. The incorporation of ²H into the new proteins takes place in a measurable amount at a stage of growth in which the amino acids are also synthesized de novo. The enzyme acid phosphatase of pea cotyledons was chosen to demonstrate the possibility of using the density labelling technique to detect protein turnover. (author)

  16. Optimizing de novo common wheat transcriptome assembly using short-read RNA-Seq data

    Directory of Open Access Journals (Sweden)

    Duan Jialei

    2012-08-01

    Full Text Available Abstract Background Rapid advances in next-generation sequencing methods have provided new opportunities for transcriptome sequencing (RNA-Seq). The unprecedented sequencing depth provided by RNA-Seq makes it a powerful and cost-efficient method for transcriptome study, and it has been widely used in model organisms and non-model organisms to identify and quantify RNA. For non-model organisms lacking well-defined genomes, de novo assembly is typically required for downstream RNA-Seq analyses, including SNP discovery and identification of genes differentially expressed by phenotypes. Although RNA-Seq has been successfully used to sequence many non-model organisms, the results of de novo assembly from short reads can still be improved by using recent bioinformatic developments. Results In this study, we used 212.6 million paired-end reads, which accounted for 16.2 Gb, to assemble the hexaploid wheat transcriptome. Two state-of-the-art assemblers, Trinity and Trans-ABySS, which use the single and multiple k-mer methods, respectively, were used, and the whole de novo assembly process was divided into the following four steps: pre-assembly, merging different samples, removal of redundancy and scaffolding. We documented every detail of these steps and how these steps influenced assembly performance to gain insight into transcriptome assembly from short reads. After optimization, the assembled transcripts were comparable to Sanger-derived ESTs in terms of both continuity and accuracy. We also provided considerable new wheat transcript data to the community. Conclusions It is feasible to assemble the hexaploid wheat transcriptome from short reads. Special attention should be paid to dealing with multiple samples to balance the spectrum of expression levels and redundancy. To obtain an accurate overview of RNA profiling, removal of redundancy may be crucial in de novo assembly.

  17. Data-Based Predictive Control with Multirate Prediction Step

    Science.gov (United States)

    Barlow, Jonathan S.

    2010-01-01

    Data-based predictive control is an emerging control method that stems from Model Predictive Control (MPC). MPC computes current control action based on a prediction of the system output a number of time steps into the future and is generally derived from a known model of the system. Data-based predictive control has the advantage of deriving predictive models and controller gains from input-output data. Thus, a controller can be designed from the outputs of complex simulation code or a physical system where no explicit model exists. If the output data happens to be corrupted by periodic disturbances, the designed controller will also have the built-in ability to reject these disturbances without the need to know them. When data-based predictive control is implemented online, it becomes a version of adaptive control. One challenge of MPC is computational requirements increasing with prediction horizon length. This paper develops a closed-loop dynamic output feedback controller that minimizes a multi-step-ahead receding-horizon cost function with multirate prediction step. One result is a reduced influence of prediction horizon and the number of system outputs on the computational requirements of the controller. Another result is an emphasis on portions of the prediction window that are sampled more frequently. A third result is the ability to include more outputs in the feedback path than in the cost function.
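    A minimal data-based sketch of the idea, identifying a one-step predictor from input-output data by least squares and rolling it forward over the prediction horizon, is given below. It is a toy ARX example under a synthetic plant, not the paper's multirate output-feedback formulation, and the plant coefficients are invented.

      # Sketch: fit a one-step ARX predictor from input/output data by least squares and
      # roll it forward over a prediction horizon (illustration of the data-based idea only).
      import numpy as np

      rng = np.random.default_rng(3)
      # Simulated plant (unknown to the identifier): y[k] = 1.6 y[k-1] - 0.64 y[k-2] + 0.5 u[k-1]
      u = rng.normal(size=400)
      y = np.zeros(400)
      for k in range(2, 400):
          y[k] = 1.6 * y[k - 1] - 0.64 * y[k - 2] + 0.5 * u[k - 1] + 0.01 * rng.normal()

      # Least-squares fit of y[k] ~ [y[k-1], y[k-2], u[k-1]]
      Phi = np.column_stack([y[1:-1], y[:-2], u[1:-1]])
      theta, *_ = np.linalg.lstsq(Phi, y[2:], rcond=None)

      def predict_horizon(y_hist, u_future, theta, horizon):
          """Roll the identified one-step model forward over the prediction horizon."""
          y1, y2 = y_hist[-1], y_hist[-2]
          preds = []
          for j in range(horizon):
              y_next = theta[0] * y1 + theta[1] * y2 + theta[2] * u_future[j]
              preds.append(y_next)
              y1, y2 = y_next, y1
          return np.array(preds)

      print("identified parameters:", np.round(theta, 3))
      print("5-step-ahead prediction:", np.round(predict_horizon(y, np.zeros(5), theta, 5), 3))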

  18. Seismic Response Prediction of Buildings with Base Isolation Using Advanced Soft Computing Approaches

    Directory of Open Access Journals (Sweden)

    Mosbeh R. Kaloop

    2017-01-01

    Full Text Available Modeling response of structures under seismic loads is an important factor in Civil Engineering as it crucially affects the design and management of structures, especially for the high-risk areas. In this study, novel applications of advanced soft computing techniques are utilized for predicting the behavior of centrically braced frame (CBF) buildings with lead-rubber bearing (LRB) isolation system under ground motion effects. These techniques include least square support vector machine (LSSVM), wavelet neural networks (WNN), and adaptive neurofuzzy inference system (ANFIS) along with wavelet denoising. The simulation of a 2D frame model and eight ground motions are considered in this study to evaluate the prediction models. The comparison results indicate that the least square support vector machine is superior to other techniques in estimating the behavior of smart structures.

  19. Development of computer code for determining prediction parameters of radionuclide migration in soil layer

    International Nuclear Information System (INIS)

    Ogawa, Hiromichi; Ohnuki, Toshihiko

    1986-07-01

    A computer code (MIGSTEM-FIT) has been developed to determine the prediction parameters of radionuclide migration in a soil layer, such as the retardation factor, water flow velocity and dispersion coefficient, from the measured concentration distribution of the radionuclide in the soil layer or in the effluent. In this code, the solution of the prediction equation for radionuclide migration is compared with the measured concentration distribution, and the most adequate parameter values are determined by the flexible tolerance method. The validity of the finite difference method, one of the methods used to solve the prediction equation, was confirmed by comparison with the analytical solution, and the validity of the fitting method was confirmed by fitting concentration distributions calculated from known parameters. An examination of the errors showed that the error in the parameters obtained with this code was smaller than the error in the measured concentration distribution. (author)
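
    A rough analogue of the fitting step can be written with scipy's curve_fit: fit a one-dimensional advection-dispersion pulse solution with a retardation factor to a measured depth profile. The sketch below uses synthetic data and ordinary least squares rather than the flexible tolerance method, and all values (time, velocity, noise level) are assumptions for illustration.

      # Sketch: estimate the retardation factor and dispersion coefficient by fitting a 1-D
      # advection-dispersion pulse solution to a concentration-vs-depth profile (synthetic data).
      import numpy as np
      from scipy.optimize import curve_fit

      T = 30.0   # elapsed time since release (d), assumed known
      V = 1.2    # measured pore-water velocity (cm/d), assumed known

      def profile(x, retardation, dispersion, mass=1.0, t=T, v=V):
          """Concentration vs depth for an instantaneous pulse with linear retardation."""
          v_eff, d_eff = v / retardation, dispersion / retardation
          return (mass / np.sqrt(4 * np.pi * d_eff * t)) * np.exp(-(x - v_eff * t) ** 2 / (4 * d_eff * t))

      depth = np.linspace(0, 20, 60)                                   # cm
      synthetic = profile(depth, retardation=4.0, dispersion=0.8)
      measured = synthetic + np.random.default_rng(4).normal(0, 0.002, depth.size)

      popt, _ = curve_fit(profile, depth, measured, p0=[2.0, 0.5])
      print("fitted retardation factor and dispersion coefficient:", np.round(popt, 3))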

  20. NxRepair: error correction in de novo sequence assembly using Nextera mate pairs

    Directory of Open Access Journals (Sweden)

    Rebecca R. Murphy

    2015-06-01

    Full Text Available Scaffolding errors and incorrect repeat disambiguation during de novo assembly can result in large scale misassemblies in draft genomes. Nextera mate pair sequencing data provide additional information to resolve assembly ambiguities during scaffolding. Here, we introduce NxRepair, an open source toolkit for error correction in de novo assemblies that uses Nextera mate pair libraries to identify and correct large-scale errors. We show that NxRepair can identify and correct large scaffolding errors, without use of a reference sequence, resulting in quantitative improvements in the assembly quality. NxRepair can be downloaded from GitHub or PyPI, the Python Package Index; a tutorial and user documentation are also available.

  1. Protein structure prediction using bee colony optimization metaheuristic

    DEFF Research Database (Denmark)

    Fonseca, Rasmus; Paluszewski, Martin; Winter, Pawel

    2010-01-01

    of the proteins structure, an energy potential and some optimization algorithm that finds the structure with minimal energy. Bee Colony Optimization (BCO) is a relatively new approach to solving optimization problems based on the foraging behaviour of bees. Several variants of BCO have been suggested......Predicting the native structure of proteins is one of the most challenging problems in molecular biology. The goal is to determine the three-dimensional structure from the one-dimensional amino acid sequence. De novo prediction algorithms seek to do this by developing a representation...... our BCO method to generate good solutions to the protein structure prediction problem. The results show that BCO generally finds better solutions than simulated annealing which so far has been the metaheuristic of choice for this problem.

  2. Novos encontros de anofelíneos em recipientes artificiais

    Directory of Open Access Journals (Sweden)

    Oswaldo Paulo Forattini

    1998-12-01

    Full Text Available New records of anopheline mosquitoes in artificial containers are reported. One concerns immature forms of Anopheles bellator found in experimental breeding containers, and the other concerns the finding of An. albitarsis l.s. in an abandoned container. Considerations are presented on the selective pressure represented by the ever-increasing production of disposable objects.

  3. FIRAC: a computer code to predict fire-accident effects in nuclear facilities

    International Nuclear Information System (INIS)

    Bolstad, J.W.; Krause, F.R.; Tang, P.K.; Andrae, R.W.; Martin, R.A.; Gregory, W.S.

    1983-01-01

    FIRAC is a medium-sized computer code designed to predict fire-induced flows, temperatures, and material transport within the ventilating systems and other airflow pathways in nuclear-related facilities. The code is designed to analyze the behavior of interconnected networks of rooms and typical ventilation system components. This code is one in a family of computer codes that is designed to provide improved methods of safety analysis for the nuclear industry. The structure of this code closely follows that of the previously developed TVENT and EVENT codes. Because a lumped-parameter formulation is used, this code is particularly suitable for calculating the effects of fires in the far field (that is, in regions removed from the fire compartment), where the fire may be represented parametrically. However, a fire compartment model to simulate conditions in the enclosure is included. This model provides transport source terms to the ventilation system that can affect its operation and in turn affect the fire

  4. Human native lipoprotein-induced de novo DNA methylation is associated with repression of inflammatory genes in THP-1 macrophages

    Directory of Open Access Journals (Sweden)

    Rangel-Salazar Rubén

    2011-11-01

    Full Text Available Abstract Background We previously showed that a VLDL- and LDL-rich mix of human native lipoproteins induces a set of repressive epigenetic marks, i.e. de novo DNA methylation, histone 4 hypoacetylation and histone 4 lysine 20 (H4K20) hypermethylation in THP-1 macrophages. Here, we (1) ask what gene expression changes accompany these epigenetic responses and (2) test the involvement of candidate factors mediating the latter. We exploited genome expression arrays to identify target genes for lipoprotein-induced silencing, in addition to RNAi and expression studies to test the involvement of candidate mediating factors. The study was conducted in human THP-1 macrophages. Results Native lipoprotein-induced de novo DNA methylation was associated with a general repression of various critical genes for macrophage function, including pro-inflammatory genes. Lipoproteins showed differential effects on epigenetic marks, as de novo DNA methylation was induced by VLDL and to a lesser extent by LDL, but not by HDL, and VLDL induced H4K20 hypermethylation, while HDL caused H4 deacetylation. The analysis of candidate factors mediating VLDL-induced DNA hypermethylation revealed that this response was (1) surprisingly, mediated exclusively by the canonical maintenance DNA methyltransferase DNMT1, and (2) independent of the Dicer/micro-RNA pathway. Conclusions Our work provides novel insights into epigenetic gene regulation by native lipoproteins. Furthermore, we provide an example of DNMT1 acting as a de novo DNA methyltransferase independently of canonical de novo enzymes, and show proof of principle that de novo DNA methylation can occur independently of a functional Dicer/micro-RNA pathway in mammals.

  5. De novo post-pollen mitosis II tobacco pollen tube transcriptome

    Czech Academy of Sciences Publication Activity Database

    Hafidh, Said; Breznenová, Katarína; Honys, David

    2012-01-01

    Vol. 7, No. 8 (2012), pp. 918-921. ISSN 1559-2316. R&D Projects: GA ČR GPP501/11/P321; GA ČR GA522/09/0858. Institutional research plan: CEZ:AV0Z50380511. Keywords: de novo pollen tube transcriptome * male gametophyte development * pollen tube growth. Subject RIV: ED - Physiology

  6. Distinguishing between Selective Sweeps from Standing Variation and from a De Novo Mutation

    Science.gov (United States)

    Peter, Benjamin M.; Huerta-Sanchez, Emilia; Nielsen, Rasmus

    2012-01-01

    An outstanding question in human genetics has been the degree to which adaptation occurs from standing genetic variation or from de novo mutations. Here, we combine several common statistics used to detect selection in an Approximate Bayesian Computation (ABC) framework, with the goal of discriminating between models of selection and providing estimates of the age of selected alleles and the selection coefficients acting on them. We use simulations to assess the power and accuracy of our method and apply it to seven of the strongest sweeps currently known in humans. We identify two genes, ASPM and PSCA, that are most likely affected by selection on standing variation; and we find three genes, ADH1B, LCT, and EDAR, in which the adaptive alleles seem to have swept from a new mutation. We also confirm evidence of selection for one further gene, TRPV6. In one gene, G6PD, neither neutral models nor models of selective sweeps fit the data, presumably because this locus has been subject to balancing selection. PMID:23071458
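
    The ABC step described above can be illustrated with a minimal rejection-sampling sketch in Python. The simulator and summary statistic below are toy placeholders rather than the coalescent simulations used in the study: parameters are drawn from priors, data are simulated, and draws whose simulated summary falls within a tolerance of the observed summary form the approximate posterior.

        import random

        def simulate_summary(selection_coefficient, allele_age):
            # Toy simulator standing in for coalescent simulations of a sweep statistic.
            return selection_coefficient * 100.0 / (1.0 + allele_age) + random.gauss(0.0, 0.5)

        def abc_rejection(observed_summary, n_draws=100000, tolerance=0.3):
            accepted = []
            for _ in range(n_draws):
                s = random.uniform(0.0, 0.1)     # prior on the selection coefficient
                age = random.uniform(1.0, 50.0)  # prior on allele age (arbitrary units)
                if abs(simulate_summary(s, age) - observed_summary) < tolerance:
                    accepted.append((s, age))
            return accepted

        posterior = abc_rejection(observed_summary=2.0)
        if posterior:
            print(len(posterior), "accepted draws, posterior mean s =",
                  round(sum(s for s, _ in posterior) / len(posterior), 4))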

  7. Distinguishing between selective sweeps from standing variation and from a de novo mutation.

    Directory of Open Access Journals (Sweden)

    Benjamin M Peter

    Full Text Available An outstanding question in human genetics has been the degree to which adaptation occurs from standing genetic variation or from de novo mutations. Here, we combine several common statistics used to detect selection in an Approximate Bayesian Computation (ABC) framework, with the goal of discriminating between models of selection and providing estimates of the age of selected alleles and the selection coefficients acting on them. We use simulations to assess the power and accuracy of our method and apply it to seven of the strongest sweeps currently known in humans. We identify two genes, ASPM and PSCA, that are most likely affected by selection on standing variation; and we find three genes, ADH1B, LCT, and EDAR, in which the adaptive alleles seem to have swept from a new mutation. We also confirm evidence of selection for one further gene, TRPV6. In one gene, G6PD, neither neutral models nor models of selective sweeps fit the data, presumably because this locus has been subject to balancing selection.

  8. A computational model that predicts behavioral sensitivity to intracortical microstimulation

    Science.gov (United States)

    Kim, Sungshin; Callier, Thierri; Bensmaia, Sliman J.

    2017-02-01

    Objective. Intracortical microstimulation (ICMS) is a powerful tool to investigate the neural mechanisms of perception and can be used to restore sensation for patients who have lost it. While sensitivity to ICMS has previously been characterized, no systematic framework has been developed to summarize the detectability of individual ICMS pulse trains or the discriminability of pairs of pulse trains. Approach. We develop a simple simulation that describes the responses of a population of neurons to a train of electrical pulses delivered through a microelectrode. We then perform an ideal observer analysis on the simulated population responses to predict the behavioral performance of non-human primates in ICMS detection and discrimination tasks. Main results. Our computational model can predict behavioral performance across a wide range of stimulation conditions with high accuracy (R² = 0.97) and generalizes to novel ICMS pulse trains that were not used to fit its parameters. Furthermore, the model provides a theoretical basis for the finding that amplitude discrimination based on ICMS violates Weber's law. Significance. The model can be used to characterize the sensitivity to ICMS across the range of perceptible and safe stimulation regimes. As such, it will be a useful tool for both neuroscience and neuroprosthetics.
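
    The ideal-observer step can be sketched generically: simulate population spike counts for stimulus-absent and stimulus-present trials, set a detection criterion from the stimulus-absent distribution, and report the fraction of stimulus-present trials exceeding it. The Poisson rates and the gain relating amplitude to firing rate below are arbitrary placeholders, not the parameters fitted in the paper.

        import numpy as np

        rng = np.random.default_rng(0)

        def population_counts(amplitude_ua, n_neurons=100, n_trials=1000):
            # Toy response model: firing rate grows with stimulation amplitude (arbitrary gain).
            rates = 1.0 + 0.05 * amplitude_ua * rng.random(n_neurons)
            return rng.poisson(rates, size=(n_trials, n_neurons)).sum(axis=1)

        def detection_probability(amplitude_ua, false_alarm_rate=0.05):
            absent = population_counts(0.0)
            present = population_counts(amplitude_ua)
            criterion = np.quantile(absent, 1.0 - false_alarm_rate)  # criterion from catch trials
            return float((present > criterion).mean())

        for amp in (10, 20, 40, 80):
            print(amp, "uA ->", round(detection_probability(amp), 3))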

  9. DR2DI: a powerful computational tool for predicting novel drug-disease associations

    Science.gov (United States)

    Lu, Lu; Yu, Hua

    2018-05-01

    Finding new related candidate diseases for known drugs provides an effective method for fast-speed and low-risk drug development. However, experimental identification of drug-disease associations is expensive and time-consuming. This motivates the need for developing in silico computational methods that can infer true drug-disease pairs with high confidence. In this study, we presented a novel and powerful computational tool, DR2DI, for accurately uncovering the potential associations between drugs and diseases using high-dimensional and heterogeneous omics data as information sources. Based on a unified and extended similarity kernel framework, DR2DI inferred the unknown relationships between drugs and diseases using a Regularized Kernel Classifier. Importantly, DR2DI employed a semi-supervised and global learning algorithm which can be applied to uncover the diseases (drugs) associated with known and novel drugs (diseases). In silico global validation experiments showed that DR2DI significantly outperforms two recent approaches for predicting drug-disease associations. Detailed case studies further demonstrated that the therapeutic indications and side effects of drugs predicted by DR2DI could be validated by existing database records and literature, suggesting that DR2DI can serve as a useful bioinformatic tool for identifying potential drug-disease associations and guiding drug repositioning. Our software and comparison codes are freely available at https://github.com/huayu1111/DR2DI.
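
    As a rough illustration of a regularized kernel classifier operating on a precomputed similarity kernel, the numpy sketch below solves kernel ridge regression in closed form and thresholds the resulting scores. It is a generic stand-in with synthetic data, not the DR2DI implementation or its drug/disease similarity-kernel construction.

        import numpy as np

        def fit_kernel_ridge(K, y, lam=1.0):
            # Dual coefficients in closed form: alpha = (K + lam * I)^(-1) y
            return np.linalg.solve(K + lam * np.eye(K.shape[0]), y)

        def predict_scores(K_test_train, alpha):
            # Scores for new items from their kernel values against the training items.
            return K_test_train @ alpha

        rng = np.random.default_rng(1)
        X = rng.normal(size=(30, 5))                  # hypothetical feature profiles
        y = np.where(X[:, 0] > 0, 1.0, -1.0)          # toy +/-1 association labels
        K = np.exp(-0.5 * np.square(X[:, None, :] - X[None, :, :]).sum(-1))  # RBF kernel
        alpha = fit_kernel_ridge(K, y, lam=0.5)
        print("training accuracy:", (np.sign(predict_scores(K, alpha)) == y).mean())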

  10. DR2DI: a powerful computational tool for predicting novel drug-disease associations

    Science.gov (United States)

    Lu, Lu; Yu, Hua

    2018-04-01

    Finding new related candidate diseases for known drugs provides an effective method for fast-speed and low-risk drug development. However, experimental identification of drug-disease associations is expensive and time-consuming. This motivates the need for developing in silico computational methods that can infer true drug-disease pairs with high confidence. In this study, we presented a novel and powerful computational tool, DR2DI, for accurately uncovering the potential associations between drugs and diseases using high-dimensional and heterogeneous omics data as information sources. Based on a unified and extended similarity kernel framework, DR2DI inferred the unknown relationships between drugs and diseases using a Regularized Kernel Classifier. Importantly, DR2DI employed a semi-supervised and global learning algorithm which can be applied to uncover the diseases (drugs) associated with known and novel drugs (diseases). In silico global validation experiments showed that DR2DI significantly outperforms two recent approaches for predicting drug-disease associations. Detailed case studies further demonstrated that the therapeutic indications and side effects of drugs predicted by DR2DI could be validated by existing database records and literature, suggesting that DR2DI can serve as a useful bioinformatic tool for identifying potential drug-disease associations and guiding drug repositioning. Our software and comparison codes are freely available at https://github.com/huayu1111/DR2DI.

  11. A novel de novo mutation in ATP1A3 and childhood-onset schizophrenia

    Science.gov (United States)

    Smedemark-Margulies, Niklas; Brownstein, Catherine A.; Vargas, Sigella; Tembulkar, Sahil K.; Towne, Meghan C.; Shi, Jiahai; Gonzalez-Cuevas, Elisa; Liu, Kevin X.; Bilguvar, Kaya; Kleiman, Robin J.; Han, Min-Joon; Torres, Alcy; Berry, Gerard T.; Yu, Timothy W.; Beggs, Alan H.; Agrawal, Pankaj B.; Gonzalez-Heydrich, Joseph

    2016-01-01

    We describe a child with onset of command auditory hallucinations and behavioral regression at 6 yr of age in the context of longer standing selective mutism, aggression, and mild motor delays. His genetic evaluation included chromosomal microarray analysis and whole-exome sequencing. Sequencing revealed a previously unreported heterozygous de novo mutation c.385G>A in ATP1A3, predicted to result in a p.V129M amino acid change. This gene codes for a neuron-specific isoform of the catalytic α-subunit of the ATP-dependent transmembrane sodium–potassium pump. Heterozygous mutations in this gene have been reported as causing both sporadic and inherited forms of alternating hemiplegia of childhood and rapid-onset dystonia parkinsonism. We discuss the literature on phenotypes associated with known variants in ATP1A3, examine past functional studies of the role of ATP1A3 in neuronal function, and describe a novel clinical presentation associated with mutation of this gene. PMID:27626066

  12. Axonal regeneration and development of de novo axons from distal dendrites of adult feline commissural interneurons after a proximal axotomy

    DEFF Research Database (Denmark)

    Fenrich, Keith K; Skelton, Nicole; MacDermid, Victoria E

    2007-01-01

    Following proximal axotomy, several types of neurons sprout de novo axons from distal dendrites. These processes may represent a means of forming new circuits following spinal cord injury. However, it is not known whether mammalian spinal interneurons, axotomized as a result of a spinal cord injury, develop de novo axons. Our goal was to determine whether spinal commissural interneurons (CINs), axotomized by 3-4-mm midsagittal transection at C3, form de novo axons from distal dendrites. All experiments were performed on adult cats. CINs in C3 were stained with extracellular injections of Neurobiotin at 4-5 weeks post injury. The somata of axotomized CINs were identified by the presence of immunoreactivity for the axonal growth-associated protein-43 (GAP-43). Nearly half of the CINs had de novo axons that emerged from distal dendrites. These axons lacked immunoreactivity for the dendritic protein...

  13. De novo and inherited private variants in MAP1B in periventricular nodular heterotopia.

    Science.gov (United States)

    Heinzen, Erin L; O'Neill, Adam C; Zhu, Xiaolin; Allen, Andrew S; Bahlo, Melanie; Chelly, Jamel; Dobyns, William B; Freytag, Saskia; Guerrini, Renzo; Leventer, Richard J; Poduri, Annapurna; Robertson, Stephen P; Walsh, Christopher A; Zhang, Mengqi

    2018-05-08

    Periventricular nodular heterotopia (PVNH) is a malformation of cortical development commonly associated with epilepsy. We exome sequenced 202 individuals with sporadic PVNH to identify novel genetic risk loci. We first performed a trio-based analysis and identified 219 de novo variants. Although no novel genes were implicated in this initial analysis, PVNH cases were found overall to have a significant excess of nonsynonymous de novo variants in intolerant genes (p = 3.27 × 10⁻⁷), suggesting a role for rare new alleles in genes yet to be associated with the condition. Using a gene-level collapsing analysis comparing cases and controls, we identified a genome-wide significant signal driven by four ultra-rare loss-of-function heterozygous variants in MAP1B, including one de novo variant. In at least one instance, the MAP1B variant was inherited from a parent with previously undiagnosed PVNH. The PVNH was frontally predominant and associated with perisylvian polymicrogyria. These results implicate MAP1B in PVNH. More broadly, our findings suggest that detrimental mutations likely arising in immediately preceding generations with incomplete penetrance may also be responsible for some apparently sporadic diseases.

  14. Norgal: extraction and de novo assembly of mitochondrial DNA from whole-genome sequencing data.

    Science.gov (United States)

    Al-Nakeeb, Kosai; Petersen, Thomas Nordahl; Sicheritz-Pontén, Thomas

    2017-11-21

    Whole-genome sequencing (WGS) projects provide short read nucleotide sequences from nuclear and possibly organelle DNA depending on the source of origin. Mitochondrial DNA is present in animals and fungi, while plants contain DNA from both mitochondria and chloroplasts. Current techniques for separating organelle reads from nuclear reads in WGS data require full reference or partial seed sequences for assembling. Norgal (de Novo ORGAneLle extractor) avoids this requirement by identifying a high frequency subset of k-mers that are predominantly of mitochondrial origin and performing a de novo assembly on a subset of reads that contains these k-mers. The method was applied to WGS data from a panda, brown algae seaweed, butterfly and filamentous fungus. We were able to extract full circular mitochondrial genomes and obtained sequence identities to the reference sequences in the range from 98.5 to 99.5%. We also assembled the chloroplasts of grape vines and cucumbers using Norgal together with seed-based de novo assemblers. Norgal is a pipeline that can extract and assemble full or partial mitochondrial and chloroplast genomes from WGS short reads without prior knowledge. The program is available at: https://bitbucket.org/kosaidtu/norgal .
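
    The underlying principle, selecting reads that carry unusually frequent k-mers (organelle DNA is present in many more copies per cell than nuclear DNA), can be sketched in a few lines of Python. This is an illustration of the idea only; the k-mer size, the frequency cutoff, and the FASTQ handling in Norgal itself differ.

        from collections import Counter

        def kmers(seq, k=21):
            return (seq[i:i + k] for i in range(len(seq) - k + 1))

        def high_frequency_kmers(reads, k=21, min_count=50):
            counts = Counter()
            for read in reads:
                counts.update(kmers(read, k))
            return {kmer for kmer, c in counts.items() if c >= min_count}

        def select_candidate_reads(reads, frequent, k=21, min_hits=3):
            # Keep reads sharing several high-frequency k-mers; these are likely organellar.
            return [r for r in reads if sum(km in frequent for km in kmers(r, k)) >= min_hits]

        # Usage sketch: reads would come from a FASTQ parser; here they are plain strings.
        # frequent = high_frequency_kmers(reads)
        # candidates = select_candidate_reads(reads, frequent)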

  15. Emergence, Retention and Selection: A Trilogy of Origination for Functional De Novo Proteins from Ancestral LncRNAs in Primates.

    Directory of Open Access Journals (Sweden)

    Jia-Yu Chen

    2015-07-01

    Full Text Available While some human-specific protein-coding genes have been proposed to originate from ancestral lncRNAs, the transition process remains poorly understood. Here we identified 64 hominoid-specific de novo genes and report a mechanism for the origination of functional de novo proteins from ancestral lncRNAs with precise splicing structures and specific tissue expression profiles. Whole-genome sequencing of dozens of rhesus macaque animals revealed that these lncRNAs are generally not more selectively constrained than other lncRNA loci. The existence of these newly originated de novo proteins is also not beyond anticipation under neutral expectation, as they generally have a longer theoretical lifespan than their current age, owing to their GC-rich sequence property enabling stable ORFs with a lower chance of nonsense mutations. Interestingly, although the emergence and retention of these de novo genes are likely driven by neutral forces, a population genetics study in 67 human individuals and 82 macaque animals revealed signatures of purifying selection on these genes specifically in the human population, indicating that a proportion of these newly originated proteins are already functional in humans. We thus propose a mechanism for the creation of functional de novo proteins from ancestral lncRNAs during primate evolution, which may contribute to human-specific genetic novelties by taking advantage of existing genomic contexts.

  16. Critical importance of the de novo pyrimidine biosynthesis pathway for Trypanosoma cruzi growth in the mammalian host cell cytoplasm

    International Nuclear Information System (INIS)

    Hashimoto, Muneaki; Morales, Jorge; Fukai, Yoshihisa; Suzuki, Shigeo; Takamiya, Shinzaburo; Tsubouchi, Akiko; Inoue, Syou; Inoue, Masayuki; Kita, Kiyoshi; Harada, Shigeharu; Tanaka, Akiko; Aoki, Takashi; Nara, Takeshi

    2012-01-01

    Highlights: ► We established a Trypanosoma cruzi line lacking the gene for carbamoyl phosphate synthetase II. ► Disruption of the cpsII gene significantly reduced the growth of epimastigotes. ► In particular, the CPSII-null mutant showed severely retarded intracellular growth. ► The de novo pyrimidine pathway is critical for parasite growth in the host cell. -- Abstract: The intracellular parasitic protist Trypanosoma cruzi is the causative agent of Chagas disease in Latin America. In general, pyrimidine nucleotides are supplied by both de novo biosynthesis and salvage pathways. While epimastigotes, an insect form, possess both activities, amastigotes, an intracellular replicating form of T. cruzi, are unable to mediate the uptake of pyrimidine. However, the requirement of de novo pyrimidine biosynthesis for parasite growth and survival has not yet been elucidated. Carbamoyl-phosphate synthetase II (CPSII) is the first and rate-limiting enzyme of the de novo biosynthetic pathway, and increased CPSII activity is associated with the rapid proliferation of tumor cells. In the present study, we showed that disruption of the T. cruzi cpsII gene significantly reduced parasite growth. In particular, the growth of amastigotes lacking the cpsII gene was severely suppressed. Thus, the de novo pyrimidine pathway is important for proliferation of T. cruzi in the host cell cytoplasm and represents a promising target for chemotherapy against Chagas disease.

  17. Cutting edge: identification of novel T cell epitopes in Lol p5a by computational prediction.

    Science.gov (United States)

    de Lalla, C; Sturniolo, T; Abbruzzese, L; Hammer, J; Sidoli, A; Sinigaglia, F; Panina-Bordignon, P

    1999-08-15

    Promiscuous DR ligands were predicted computationally from the Lol p5a allergen from rye grass. In vitro binding studies confirmed the promiscuous binding characteristics of these peptides. Moreover, most of the predicted ligands were novel T cell epitopes that were able to stimulate T cells from atopic patients. We generated a panel of Lol p5a-specific T cell clones, the majority of which recognized the peptides in a cross-reactive fashion. The computational prediction of DR ligands might thus allow the design of T cell epitopes with potential useful application in novel immunotherapy strategies.

  18. Novos liberalismos e a Grande Recessão: princípios para uma política externa crítica

    Directory of Open Access Journals (Sweden)

    Igor Abdalla

    2014-06-01

    Full Text Available The article analyzes the emergence, in recent decades, of a technocratic internationalist new liberalism that is divorced from the classical liberalism created by the critical philosopher Immanuel Kant. The new liberalism, which coincides with the globalization of finance, inverts the emancipatory element of Kantian liberalism and presents itself as an instance of ratification of power. As a result, the new liberals are incapable of critically analyzing events such as the Great Recession. In opposition to technocratic new liberalism, principles for a critical foreign policy for Brazil are proposed. In empirical terms, the evolution of the globalization of finance is scrutinized from the standpoint of power, with a focus on the financial crises in the developing world and the Great Recession of 2008. The following arguments are advanced: (i) the new liberalism contradicts classical liberalism; (ii) the new liberalism legitimizes the interests of hegemonic actors committed to unlimited financial liberalization and deregulation, which lie at the root of the Great Recession; (iii) Brazilian foreign policy should recover elements of classical liberalism in the context of crisis generated by the Great Recession.

  19. Long-read sequencing and de novo assembly of a Chinese genome

    Science.gov (United States)

    Short-read sequencing has enabled the de novo assembly of several individual human genomes, but with inherent limitations in characterizing repeat elements. Here we sequence a Chinese individual HX1 by single-molecule real-time (SMRT) long-read sequencing, construct a physical map by NanoChannel arrays...

  20. De novo centriole formation in human cells is error-prone and does not require SAS-6 self-assembly.

    Science.gov (United States)

    Wang, Won-Jing; Acehan, Devrim; Kao, Chien-Han; Jane, Wann-Neng; Uryu, Kunihiro; Tsou, Meng-Fu Bryan

    2015-11-26

    Vertebrate centrioles normally propagate through duplication, but in the absence of preexisting centrioles, de novo synthesis can occur. Consistently, centriole formation is thought to strictly rely on self-assembly, involving self-oligomerization of the centriolar protein SAS-6. Here, through reconstitution of de novo synthesis in human cells, we surprisingly found that normal looking centrioles capable of duplication and ciliation can arise in the absence of SAS-6 self-oligomerization. Moreover, whereas canonically duplicated centrioles always form correctly, de novo centrioles are prone to structural errors, even in the presence of SAS-6 self-oligomerization. These results indicate that centriole biogenesis does not strictly depend on SAS-6 self-assembly, and may require preexisting centrioles to ensure structural accuracy, fundamentally deviating from the current paradigm.

  1. Cinema utópico: a construção de um novo homem e um novo mundo

    OpenAIRE

    Erika Savernini Lopes

    2011-01-01

    Cinema, from its very beginnings, prefigured cyberspace as a new immaterial space built collectively. The conception of this other, non-physical place to which humankind could migrate establishes, for both cinema and cyberspace, a direct relationship with utopias. In the sense of Thomas More's philosophical novel, Utopia is defined as another space, non-existent, unrealizable and ideal, that diagnoses the present. Cinema would carry fundamental traits of Utopia both in what...

  2. De novo mutations in HCN1 cause early infantile epileptic encephalopathy.

    Science.gov (United States)

    Nava, Caroline; Dalle, Carine; Rastetter, Agnès; Striano, Pasquale; de Kovel, Carolien G F; Nabbout, Rima; Cancès, Claude; Ville, Dorothée; Brilstra, Eva H; Gobbi, Giuseppe; Raffo, Emmanuel; Bouteiller, Delphine; Marie, Yannick; Trouillard, Oriane; Robbiano, Angela; Keren, Boris; Agher, Dahbia; Roze, Emmanuel; Lesage, Suzanne; Nicolas, Aude; Brice, Alexis; Baulac, Michel; Vogt, Cornelia; El Hajj, Nady; Schneider, Eberhard; Suls, Arvid; Weckhuysen, Sarah; Gormley, Padhraig; Lehesjoki, Anna-Elina; De Jonghe, Peter; Helbig, Ingo; Baulac, Stéphanie; Zara, Federico; Koeleman, Bobby P C; Haaf, Thomas; LeGuern, Eric; Depienne, Christel

    2014-06-01

    Hyperpolarization-activated, cyclic nucleotide-gated (HCN) channels contribute to cationic Ih current in neurons and regulate the excitability of neuronal networks. Studies in rat models have shown that the Hcn1 gene has a key role in epilepsy, but clinical evidence implicating HCN1 mutations in human epilepsy is lacking. We carried out exome sequencing for parent-offspring trios with fever-sensitive, intractable epileptic encephalopathy, leading to the discovery of two de novo missense HCN1 mutations. Screening of follow-up cohorts comprising 157 cases in total identified 4 additional amino acid substitutions. Patch-clamp recordings of Ih currents in cells expressing wild-type or mutant human HCN1 channels showed that the mutations had striking but divergent effects on homomeric channels. Individuals with mutations had clinical features resembling those of Dravet syndrome with progression toward atypical absences, intellectual disability and autistic traits. These findings provide clear evidence that de novo HCN1 point mutations cause a recognizable early-onset epileptic encephalopathy in humans.

  3. Quantitative analysis and prediction of regional lymph node status in rectal cancer based on computed tomography imaging

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Chunyan; Liu, Lizhi; Li, Li [Sun Yat-sen University, State Key Laboratory of Oncology in Southern China, Imaging Diagnosis and Interventional Center, Cancer Center, Guangzhou, Guangdong (China); Cai, Hongmin; Tian, Haiying [Sun Yat-Sen University, Department of Automation, School of Science Information and Technology, Guangzhou (China); Li, Liren [Sun Yat-sen University, State Key Laboratory of Oncology in Southern China, Department of Abdominal (colon and rectal) Surgery, Cancer Center, Guangzhou (China)

    2011-11-15

    To quantitatively evaluate regional lymph nodes in rectal cancer patients by using an automated, computer-aided approach, and to assess the accuracy of this approach in differentiating benign and malignant lymph nodes. A total of 228 patients with newly diagnosed rectal cancer, confirmed by biopsy, underwent enhanced computed tomography (CT). Patients were assigned to the benign node or malignant node group according to histopathological analysis of node samples. All CT-detected lymph nodes were segmented using the edge detection method, and seven quantitative parameters of each node were measured. To increase prediction accuracy, a hierarchical model combining the merits of the support and relevance vector machines was proposed. Of the 220 lymph nodes evaluated, 125 were positive and 95 were negative for metastases. Fractal dimension obtained by the Minkowski box-counting approach was higher in malignant nodes than in benign nodes, and there was a significant difference in heterogeneity between metastatic and non-metastatic lymph nodes. The proposed model achieved an accuracy as high as 88% using morphological characterisation of lymph nodes. Computer-aided quantitative analysis can improve the prediction of node status in rectal cancer. (orig.)
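
    The Minkowski box-counting estimate of fractal dimension mentioned above works by covering a binary node mask with boxes of decreasing size, counting occupied boxes, and taking the slope of log(count) against log(1/size). The sketch below uses a synthetic mask; the segmentation method and the remaining six parameters of the paper are not reproduced.

        import numpy as np

        def box_counting_dimension(mask, sizes=(2, 4, 8, 16, 32)):
            counts = []
            for s in sizes:
                occupied = 0
                for i in range(0, mask.shape[0], s):
                    for j in range(0, mask.shape[1], s):
                        if mask[i:i + s, j:j + s].any():
                            occupied += 1
                counts.append(occupied)
            # Slope of log(count) vs log(1/size) estimates the box-counting dimension.
            slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
            return slope

        rng = np.random.default_rng(0)
        mask = rng.random((128, 128)) > 0.7   # synthetic binary "lymph node" mask
        print("estimated dimension:", round(box_counting_dimension(mask), 2))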

  4. Prediction of Clinical Outcome After Acute Ischemic Stroke: The Value of Repeated Noncontrast Computed Tomography, Computed Tomographic Angiography, and Computed Tomographic Perfusion.

    Science.gov (United States)

    Dankbaar, Jan W; Horsch, Alexander D; van den Hoven, Andor F; Kappelle, L Jaap; van der Schaaf, Irene C; van Seeters, Tom; Velthuis, Birgitta K

    2017-09-01

    Early prediction of outcome in acute ischemic stroke is important for clinical management. This study aimed to compare the relationship between early follow-up multimodality computed tomographic (CT) imaging and clinical outcome at 90 days in a large multicenter stroke study. From the DUST study (Dutch Acute Stroke Study), patients were selected with (1) anterior circulation occlusion on CT angiography (CTA) and ischemic deficit on CT perfusion (CTP) on admission, and (2) day 3 follow-up noncontrast CT, CTP, and CTA. Follow-up infarct volume on noncontrast CT, poor recanalization on CTA, and poor reperfusion on CTP (mean transit time index ≤75%) were related to unfavorable outcome after 90 days, defined as modified Rankin Scale 3 to 6. Four multivariable models were constructed: (1) only baseline variables (model 1), (2) model 1 with addition of infarct volume, (3) model 1 with addition of recanalization, and (4) model 1 with addition of reperfusion. Areas under the receiver operating characteristic curves of the models were compared using the DeLong test. A total of 242 patients were included. Poor recanalization was found in 21%, poor reperfusion in 37%, and unfavorable outcome in 44%. The area under the receiver operating characteristic curve without follow-up imaging was 0.81; with follow-up noncontrast CT it was 0.85 (P = 0.02), with CTA 0.86 (P = 0.01), and with CTP 0.86 (P = 0.01). All 3 follow-up imaging modalities improved outcome prediction compared with no imaging. There was no difference between the imaging models. Follow-up imaging after 3 days improves outcome prediction compared with prediction based on baseline variables alone. CTA recanalization and CTP reperfusion do not outperform noncontrast CT at this time point. URL: http://www.clinicaltrials.gov. Unique identifier: NCT00880113. © 2017 American Heart Association, Inc.

  5. De novo FGF12 mutation in 2 patients with neonatal-onset epilepsy

    Science.gov (United States)

    Guella, Ilaria; Huh, Linda; McKenzie, Marna B.; Toyota, Eric B.; Bebin, E. Martina; Thompson, Michelle L.; Cooper, Gregory M.; Evans, Daniel M.; Buerki, Sarah E.; Adam, Shelin; Van Allen, Margot I.; Nelson, Tanya N.; Connolly, Mary B.; Farrer, Matthew J.

    2016-01-01

    Objective: We describe 2 additional patients with early-onset epilepsy with a de novo FGF12 mutation. Methods: Whole-exome sequencing was performed in 2 unrelated patients with early-onset epilepsy and their unaffected parents. Genetic variants were assessed by comparative trio analysis. Clinical evolution, EEG, and neuroimaging are described. The phenotype and response to treatment was reviewed and compared to affected siblings in the original report. Results: We identified the same FGF12 de novo mutation reported previously (c.G155A, p.R52H) in 2 additional patients with early-onset epilepsy. Similar to the original brothers described, both presented with tonic seizures in the first month of life. In the first patient, seizures responded to sodium channel blockers and her development was normal at 11 months. Patient 2 is a 15-year-old girl with treatment-resistant focal epilepsy, moderate intellectual disability, and autism. Carbamazepine (sodium channel blocker) was tried later in her course but not continued due to an allergic reaction. Conclusions: The identification of a recurrent de novo mutation in 2 additional unrelated probands with early-onset epilepsy supports the role of FGF12 p.R52H in disease pathogenesis. Affected carriers presented with similar early clinical phenotypes; however, this report expands the phenotype associated with this mutation which contrasts with the progressive course and early mortality of the siblings in the original report. PMID:27872899

  6. Computational Techniques for Model Predictive Control of Large-Scale Systems with Continuous-Valued and Discrete-Valued Inputs

    Directory of Open Access Journals (Sweden)

    Koichi Kobayashi

    2013-01-01

    Full Text Available We propose computational techniques for model predictive control of large-scale systems with both continuous-valued control inputs and discrete-valued control inputs, which are a class of hybrid systems. In the proposed method, we introduce the notion of virtual control inputs, which are obtained by relaxing discrete-valued control inputs to continuous variables. In online computation, first, we find continuous-valued control inputs and virtual control inputs minimizing a cost function. Next, using the obtained virtual control inputs, only discrete-valued control inputs at the current time are computed in each subsystem. In addition, we also discuss the effect of quantization errors. Finally, the effectiveness of the proposed method is shown by a numerical example. The proposed method enables us to reduce and decentralize the computation load.
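
    The relax-then-recover idea behind virtual control inputs can be shown on a toy one-step problem: the discrete-valued input is relaxed to the interval [0, 1], the relaxed problem is solved as a smooth optimization, and a discrete value is then recovered by rounding. This sketch uses scipy's general-purpose minimizer on an arbitrary quadratic cost and is not the decentralized MPC formulation of the paper.

        import numpy as np
        from scipy.optimize import minimize

        def cost(u, x0=2.0):
            # Toy stage cost: u[0] is continuous-valued, u[1] is the relaxed ("virtual") input.
            x1 = 0.9 * x0 + 0.5 * u[0] + 1.0 * u[1]
            return x1 ** 2 + 0.1 * u[0] ** 2 + 0.1 * u[1] ** 2

        result = minimize(cost, x0=np.array([0.0, 0.5]),
                          bounds=[(-1.0, 1.0), (0.0, 1.0)])  # discrete input relaxed to [0, 1]
        u_continuous = result.x[0]
        u_discrete = round(result.x[1])                      # recover the discrete-valued input
        print("continuous input:", round(u_continuous, 3), "discrete input:", u_discrete)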

  7. Predicting adenocarcinoma recurrence using computational texture models of nodule components in lung CT

    International Nuclear Information System (INIS)

    Depeursinge, Adrien; Yanagawa, Masahiro; Leung, Ann N.; Rubin, Daniel L.

    2015-01-01

    Purpose: To investigate the importance of presurgical computed tomography (CT) intensity and texture information from ground-glass opacities (GGO) and solid nodule components for the prediction of adenocarcinoma recurrence. Methods: For this study, 101 patients with surgically resected stage I adenocarcinoma were selected. During the follow-up period, 17 patients had disease recurrence with six associated cancer-related deaths. GGO and solid tumor components were delineated on presurgical CT scans by a radiologist. Computational texture models of GGO and solid regions were built using linear combinations of steerable Riesz wavelets learned with linear support vector machines (SVMs). Unlike other traditional texture attributes, the proposed texture models are designed to encode local image scales and directions that are specific to GGO and solid tissue. The responses of the locally steered models were used as texture attributes and compared to the responses of unaligned Riesz wavelets. The texture attributes were combined with CT intensities to predict tumor recurrence and patient hazard according to disease-free survival (DFS) time. Two families of predictive models were compared: LASSO and SVMs, and their survival counterparts: Cox-LASSO and survival SVMs. Results: The best-performing predictive model of patient hazard was associated with a concordance index (C-index) of 0.81 ± 0.02 and was based on the combination of the steered models and CT intensities with survival SVMs. The same feature group and the LASSO model yielded the highest area under the receiver operating characteristic curve (AUC) of 0.8 ± 0.01 for predicting tumor recurrence, although no statistically significant difference was found when compared to using intensity features solely. For all models, the performance was found to be significantly higher when image attributes were based on the solid components solely versus using the entire tumors (p < 3.08 × 10⁻⁵). Conclusions: This study...

  8. Identification of de novo mutations of Duchenne/Becker muscular dystrophies in southern Spain.

    Science.gov (United States)

    Garcia, Susana; de Haro, Tomás; Zafra-Ceres, Mercedes; Poyatos, Antonio; Gomez-Capilla, Jose A; Gomez-Llorente, Carolina

    2014-01-01

    Duchenne/Becker muscular dystrophies (DMD/BMD) are X-linked diseases, which are caused by a de novo gene mutation in one-third of affected males. The study objectives were to determine the incidence of DMD/BMD in Andalusia (Spain) and to establish the percentage of affected males in whom a de novo gene mutation was responsible. Multiplex ligation-dependent probe amplification (MLPA) technology was applied to determine the incidence of DMD/BMD in 84 males with suspicion of the disease and 106 female relatives. Dystrophin gene exon deletion (89.5%) or duplication (10.5%) was detected in 38 of the 84 males by MLPA technology; de novo mutations account for 4 (16.7%) of the 24 mother-son pairs studied. MLPA technology is adequate for the molecular diagnosis of DMD/BMD and establishes whether the mother carries the molecular alteration responsible for the disease, a highly relevant issue for genetic counseling.

  9. Management, nutrition, and lactation performance are related to bulk tank milk de novo fatty acid concentration on northeastern US dairy farms.

    Science.gov (United States)

    Woolpert, M E; Dann, H M; Cotanch, K W; Melilli, C; Chase, L E; Grant, R J; Barbano, D M

    2016-10-01

    This study investigated the relationship of management practices, dietary characteristics, milk composition, and lactation performance with de novo fatty acid (FA) concentration in bulk tank milk from commercial dairy farms with Holstein, Jersey, and mixed-breed cows. It was hypothesized that farms with higher de novo milk FA concentrations would more commonly use management and nutrition practices known to optimize ruminal conditions that enhance de novo synthesis of milk FA. Farms (n=44) located in Vermont and northeastern New York were selected based on a history of high de novo (HDN; 26.18±0.94g/100g of FA; mean ± standard deviation) or low de novo (LDN; 24.19±1.22g/100g of FA) FA in bulk tank milk. Management practices were assessed during one visit to each farm in March or April, 2014. Total mixed ration samples were collected and analyzed for chemical composition using near infrared spectroscopy. We found no differences in days in milk at the farm level. Yield of milk fat, true protein, and de novo FA per cow per day were higher for HDN versus LDN farms. The HDN farms had lower freestall stocking density (cows/stall) than LDN farms. Additionally, tiestall feeding frequency was higher for HDN than LDN farms. No differences between HDN and LDN farms were detected for dietary dry matter, crude protein, neutral detergent fiber, starch, or percentage of forage in the diet. However, dietary ether extract was lower for HDN than LDN farms. This research indicates that overcrowded freestalls, reduced feeding frequency, and greater dietary ether extract content are associated with lower de novo FA synthesis and reduced milk fat and true protein yields on commercial dairy farms. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  10. Alternative normalization methods demonstrate widespread cortical hypometabolism in untreated de novo Parkinson's disease

    DEFF Research Database (Denmark)

    Berti, Valentina; Polito, C; Borghammer, Per

    2012-01-01

    Recent studies suggested that conventional data normalization procedures may not always be valid, and demonstrated that alternative normalization strategies better allow detection of low-magnitude changes. We hypothesized that these alternative normalization procedures would disclose more widespread metabolic alterations in de novo PD. METHODS: [18F]FDG PET scans of 26 untreated de novo PD patients (Hoehn & Yahr stage I-II) and 21 age-matched controls were compared using voxel-based analysis. Normalization was performed using gray matter (GM) and white matter (WM) reference regions and Yakushev normalization. RESULTS: Compared to GM normalization, the WM and Yakushev normalization procedures disclosed much larger cortical regions of relative hypometabolism in the PD group, with extensive involvement of frontal and parieto-temporal-occipital cortices and several subcortical structures. Furthermore...

  11. Drug-Eluting Balloons in the Treatment of Coronary De Novo Lesions

    DEFF Research Database (Denmark)

    Richelsen, Rasmus Kapalu Broge; Overvad, Thure Filskov; Jensen, Svend Eggert

    2016-01-01

    Drug-eluting balloons (DEBs) have emerged as a new application in percutaneous coronary intervention. DEBs have proven successful in the treatment of in-stent restenosis, but their role in de novo lesions is less clear. This paper provides a review of the current studies where DEBs have been used...

  12. Computational prediction of binding affinity for CYP1A2-ligand complexes using empirical free energy calculations

    DEFF Research Database (Denmark)

    Poongavanam, Vasanthanathan; Olsen, Lars; Jørgensen, Flemming Steen

    2010-01-01

    Predicting binding affinities for receptor-ligand complexes is still one of the challenging processes in computational structure-based ligand design. Many computational methods have been developed to achieve this goal, such as docking and scoring methods, the linear interaction energy (LIE) method, and methods based on statistical mechanics. In the present investigation, we started from an LIE model to predict the binding free energy of structurally diverse compounds of cytochrome P450 1A2 ligands, one of the important human metabolizing isoforms of the cytochrome P450 family. The data set includes both substrates and inhibitors. It appears that the electrostatic contribution to the binding free energy becomes negligible in this particular protein and a simple empirical model was derived, based on a training set of eight compounds. The root mean square error for the training set was 3.7 kJ/mol. Subsequent...
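
    For context, the linear interaction energy ansatz referred to above estimates the binding free energy from ensemble-averaged ligand-surrounding van der Waals and electrostatic interaction energies in the bound and free states, with empirically fitted coefficients; in the model described in this abstract the electrostatic term turns out to be negligible for CYP1A2. In its generic textbook form (α, β and γ are fitted to training data):

        \Delta G_{\mathrm{bind}} \approx \alpha \left( \langle V^{\mathrm{vdW}}_{\mathrm{l-s}} \rangle_{\mathrm{bound}} - \langle V^{\mathrm{vdW}}_{\mathrm{l-s}} \rangle_{\mathrm{free}} \right)
                                 + \beta \left( \langle V^{\mathrm{el}}_{\mathrm{l-s}} \rangle_{\mathrm{bound}} - \langle V^{\mathrm{el}}_{\mathrm{l-s}} \rangle_{\mathrm{free}} \right) + \gamma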

  13. ACToR-AGGREGATED COMPUTATIONAL TOXICOLOGY ...

    Science.gov (United States)

    One goal of the field of computational toxicology is to predict chemical toxicity by combining computer models with biological and toxicological data.

  14. A computation method for mass flowrate predictions in critical flows of initially subcooled liquid in long channels

    International Nuclear Information System (INIS)

    Celata, G.P.; D'Annibale, F.; Farello, G.E.

    1985-01-01

    A fast and accurate computation method is suggested for the prediction of mass flowrate in critical flows of initially subcooled liquid from "long" discharge channels (high L/D values). Starting from a very simple correlation previously proposed by the authors, further improvements in the model extend the method's reliability up to initial saturation conditions. A comparison of computed values with 145 experimental data points from several investigations carried out at the Heat Transfer Laboratory (TERM/ISP, ENEA Casaccia) shows excellent agreement. The deviation of computed values from experimental ones is within ±10% for almost all data, with a slight increase towards low inlet subcoolings. The average error, over all the considered data, is 4.6%.

  15. Computational design of a Diels-Alderase from a thermophilic esterase: the importance of dynamics

    Science.gov (United States)

    Linder, Mats; Johansson, Adam Johannes; Olsson, Tjelvar S. G.; Liebeschuetz, John; Brinck, Tore

    2012-09-01

    A novel computational Diels-Alderase design, based on a relatively rare form of carboxylesterase from Geobacillus stearothermophilus, is presented and theoretically evaluated. The structure was found by mining the PDB for a suitable oxyanion hole-containing structure, followed by a combinatorial approach to find suitable substrates and rational mutations. Four lead designs were selected and thoroughly modeled to obtain realistic estimates of substrate binding and prearrangement. Molecular dynamics simulations and DFT calculations were used to optimize and estimate binding affinity and activation energies. A large quantum chemical model was used to capture the salient interactions in the crucial transition state (TS). Our quantitative estimation of kinetic parameters was validated against four experimentally characterized Diels-Alderases with good results. The final designs in this work are predicted to have rate enhancements of ≈10³-10⁶ and high predicted proficiencies. This work emphasizes the importance of considering protein dynamics in the design approach, and provides a quantitative estimate of how the TS stabilization observed in most de novo and redesigned enzymes is decreased compared to a minimal, 'ideal' model. The presented design is highly interesting for further optimization and applications since it is based on a thermophilic enzyme (T_opt = 70 °C).

  16. Computer-aided breast MR image feature analysis for prediction of tumor response to chemotherapy

    International Nuclear Information System (INIS)

    Aghaei, Faranak; Tan, Maxine; Liu, Hong; Zheng, Bin; Hollingsworth, Alan B.; Qian, Wei

    2015-01-01

    Purpose: To identify a new clinical marker based on quantitative kinetic image features analysis and assess its feasibility to predict tumor response to neoadjuvant chemotherapy. Methods: The authors assembled a dataset involving breast MR images acquired from 68 cancer patients before undergoing neoadjuvant chemotherapy. Among them, 25 patients had complete response (CR) and 43 had partial and nonresponse (NR) to chemotherapy based on the response evaluation criteria in solid tumors. The authors developed a computer-aided detection scheme to segment breast areas and tumors depicted on the breast MR images and computed a total of 39 kinetic image features from both tumor and background parenchymal enhancement regions. The authors then applied and tested two approaches to classify between CR and NR cases. The first one analyzed each individual feature and applied a simple feature fusion method that combines classification results from multiple features. The second approach tested an attribute selected classifier that integrates an artificial neural network (ANN) with a wrapper subset evaluator, which was optimized using a leave-one-case-out validation method. Results: In the pool of 39 features, 10 yielded relatively higher classification performance with the areas under receiver operating characteristic curves (AUCs) ranging from 0.61 to 0.78 to classify between CR and NR cases. Using a feature fusion method, the maximum AUC = 0.85 ± 0.05. Using the ANN-based classifier, AUC value significantly increased to 0.96 ± 0.03 (p < 0.01). Conclusions: This study demonstrated that quantitative analysis of kinetic image features computed from breast MR images acquired prechemotherapy has potential to generate a useful clinical marker in predicting tumor response to chemotherapy

  17. Computer-aided breast MR image feature analysis for prediction of tumor response to chemotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Aghaei, Faranak; Tan, Maxine; Liu, Hong; Zheng, Bin, E-mail: Bin.Zheng-1@ou.edu [School of Electrical and Computer Engineering, University of Oklahoma, Norman, Oklahoma 73019 (United States); Hollingsworth, Alan B. [Mercy Women’s Center, Mercy Health Center, Oklahoma City, Oklahoma 73120 (United States); Qian, Wei [Department of Electrical and Computer Engineering, University of Texas, El Paso, Texas 79968 (United States)

    2015-11-15

    Purpose: To identify a new clinical marker based on quantitative kinetic image features analysis and assess its feasibility to predict tumor response to neoadjuvant chemotherapy. Methods: The authors assembled a dataset involving breast MR images acquired from 68 cancer patients before undergoing neoadjuvant chemotherapy. Among them, 25 patients had complete response (CR) and 43 had partial and nonresponse (NR) to chemotherapy based on the response evaluation criteria in solid tumors. The authors developed a computer-aided detection scheme to segment breast areas and tumors depicted on the breast MR images and computed a total of 39 kinetic image features from both tumor and background parenchymal enhancement regions. The authors then applied and tested two approaches to classify between CR and NR cases. The first one analyzed each individual feature and applied a simple feature fusion method that combines classification results from multiple features. The second approach tested an attribute selected classifier that integrates an artificial neural network (ANN) with a wrapper subset evaluator, which was optimized using a leave-one-case-out validation method. Results: In the pool of 39 features, 10 yielded relatively higher classification performance with the areas under receiver operating characteristic curves (AUCs) ranging from 0.61 to 0.78 to classify between CR and NR cases. Using a feature fusion method, the maximum AUC = 0.85 ± 0.05. Using the ANN-based classifier, AUC value significantly increased to 0.96 ± 0.03 (p < 0.01). Conclusions: This study demonstrated that quantitative analysis of kinetic image features computed from breast MR images acquired prechemotherapy has potential to generate a useful clinical marker in predicting tumor response to chemotherapy.

  18. De novo transcriptome assembly of the calanoid copepod Neocalanus flemingeri: A new resource for emergence from diapause.

    Science.gov (United States)

    Roncalli, Vittoria; Cieslak, Matthew C; Sommer, Stephanie A; Hopcroft, Russell R; Lenz, Petra H

    2018-02-01

    Copepods, small planktonic crustaceans, are key links between primary producers and upper trophic levels, including many economically important fishes. In the subarctic North Pacific, the life cycle of copepods like Neocalanus flemingeri includes an ontogenetic migration to depth followed by a period of diapause (a type of dormancy) characterized by arrested development and low metabolic activity. The end of diapause is marked by the production of the first brood of eggs. Recent temperature anomalies in the North Pacific have raised concerns about potential negative effects on N. flemingeri. Since diapause is a developmental program, its progress can be tracked through global gene expression. Thus, a reference transcriptome was developed as a first step towards physiological profiling of diapausing females using high-throughput Illumina sequencing. The de novo transcriptome, the first for this species, was designed to investigate the diapause period. RNA-Seq reads were obtained for dormant to reproductive N. flemingeri females. A high-quality de novo transcriptome was obtained by first assembling reads from each individual using Trinity software, followed by clustering with the CAP3 Assembly Program. This assembly consisted of 140,841 transcripts (contigs). Benchmarking universal single-copy orthologs analysis identified 85% of core eukaryotic genes, with 79% predicted to be complete. Comparison with other calanoid transcriptomes confirmed its quality and degree of completeness. Trinity assembly of reads originating from multiple individuals led to fragmentation. Thus, the workflow applied here differed from the one recommended by Trinity, but was required to obtain a good assembly. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  19. Computational methods using weighed-extreme learning machine to predict protein self-interactions with protein evolutionary information.

    Science.gov (United States)

    An, Ji-Yong; Zhang, Lei; Zhou, Yong; Zhao, Yu-Jun; Wang, Da-Fu

    2017-08-18

    Self-interacting proteins (SIPs) are important for their biological activity owing to the inherent interaction amongst their secondary structures or domains. However, due to the limitations of experimental self-interaction detection, one major challenge in the study of SIP prediction is how to exploit computational approaches for SIP detection based on the evolutionary information contained in protein sequences. In this work, we presented a novel computational approach named WELM-LAG, which combined the Weighed-Extreme Learning Machine (WELM) classifier with Local Average Group (LAG) to predict SIPs based on protein sequence. The major improvement of our method lies in presenting an effective feature extraction method used to represent candidate self-interacting proteins by exploring the evolutionary information embedded in the PSI-BLAST-constructed position-specific scoring matrix (PSSM), and then employing a reliable and robust WELM classifier to carry out classification. In addition, the Principal Component Analysis (PCA) approach is used to reduce the impact of noise. The WELM-LAG method gave very high average accuracies of 92.94 and 96.74% on yeast and human datasets, respectively. Meanwhile, we compared it with the state-of-the-art support vector machine (SVM) classifier and other existing methods on human and yeast datasets, respectively. Comparative results indicated that our approach is very promising and may provide a cost-effective alternative for predicting SIPs. In addition, we developed a freely available web server called WELM-LAG-SIPs to predict SIPs. The web server is available at http://219.219.62.123:8888/WELMLAG/ .
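
    A weighted extreme learning machine of the kind named above can be sketched generically in numpy: a fixed random hidden layer maps the input features (PSSM-derived vectors in the paper), and the output weights are obtained by weighted regularized least squares, with per-sample weights compensating for class imbalance. The data below are synthetic and the sketch is not the WELM-LAG code.

        import numpy as np

        rng = np.random.default_rng(0)

        def welm_train(X, y, n_hidden=100, C=1.0):
            W = rng.normal(size=(X.shape[1], n_hidden))
            b = rng.normal(size=n_hidden)
            H = np.tanh(X @ W + b)                      # random hidden-layer features
            w = np.where(y == 1, 1.0 / max((y == 1).sum(), 1), 1.0 / max((y == 0).sum(), 1))
            Hw = H * w[:, None]                         # per-sample weights for class imbalance
            beta = np.linalg.solve(H.T @ Hw + np.eye(n_hidden) / C, Hw.T @ (2 * y - 1))
            return W, b, beta

        def welm_predict(model, X):
            W, b, beta = model
            return np.tanh(X @ W + b) @ beta

        X = rng.normal(size=(300, 40))                  # stand-in for PSSM-derived features
        y = (X[:, :3].sum(axis=1) > 0).astype(int)
        model = welm_train(X, y)
        print("training accuracy:", ((welm_predict(model, X) > 0).astype(int) == y).mean())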

  20. Novel application of quantitative single-photon emission computed-tomography/computed tomography to predict early response to methimazole in Graves' disease

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyun Joo; Bang, Ji In; Kim, Ji Young; Moon, Jae Hoon [Seoul National University Bundang Hospital, Seoul National University College of Medicine, Seongnam (Korea, Republic of); So, Young [Dept. of Nuclear Medicine, Konkuk University Medical Center, Seoul (Korea, Republic of); Lee, Won Woo [Institute of Radiation Medicine, Medical Research Center, Seoul National University, Seoul (Korea, Republic of)

    2017-06-15

    Since Graves' disease (GD) is resistant to antithyroid drugs (ATDs), an accurate quantitative thyroid function measurement is required for the prediction of early responses to ATD. Quantitative parameters derived from the novel technology, single-photon emission computed tomography/computed tomography (SPECT/CT), were investigated for the prediction of achievement of euthyroidism after methimazole (MMI) treatment in GD. A total of 36 GD patients (10 males, 26 females; mean age, 45.3 ± 13.8 years) were enrolled for this study, from April 2015 to January 2016. They underwent quantitative thyroid SPECT/CT 20 minutes post-injection of 99mTc-pertechnetate (5 mCi). Associations between the time to biochemical euthyroidism after MMI treatment and uptake, standardized uptake value (SUV), functional thyroid mass (SUVmean × thyroid volume) from the SPECT/CT, and clinical/biochemical variables were investigated. GD patients had a significantly greater %uptake (6.9 ± 6.4%) than historical control euthyroid patients (n = 20, 0.8 ± 0.5%, p < 0.001) from the same quantitative SPECT/CT protocol. Euthyroidism was achieved in 14 patients at 156 ± 62 days post-MMI treatment, but 22 patients had still not achieved euthyroidism by the last follow-up time-point (208 ± 80 days). In the univariate Cox regression analysis, the initial MMI dose (p = 0.014), %uptake (p = 0.015), and functional thyroid mass (p = 0.016) were significant predictors of euthyroidism in response to MMI treatment. However, only uptake remained significant in a multivariate Cox regression analysis (p = 0.034). An uptake cutoff of 5.0% dichotomized the faster-responding versus the slower-responding GD patients (p = 0.006). A novel parameter of thyroid uptake from quantitative SPECT/CT is a predictive indicator of an early response to MMI in GD patients.

  1. Computational Prediction of Human Salivary Proteins from Blood Circulation and Application to Diagnostic Biomarker Identification

    Science.gov (United States)

    Wang, Jiaxin; Liang, Yanchun; Wang, Yan; Cui, Juan; Liu, Ming; Du, Wei; Xu, Ying

    2013-01-01

    Proteins can move from blood circulation into salivary glands through active transportation, passive diffusion or ultrafiltration, some of which are then released into saliva and hence can potentially serve as biomarkers for diseases if accurately identified. We present a novel computational method for predicting salivary proteins that come from circulation. The basis for the prediction is a set of physiochemical and sequence features we found to be discerning between human proteins known to be movable from circulation to saliva and proteins deemed to be not in saliva. A classifier was trained based on these features using a support-vector machine to predict protein secretion into saliva. The classifier achieved 88.56% average recall and 90.76% average precision in 10-fold cross-validation on the training data, indicating that the selected features are informative. Considering the possibility that our negative training data may not be highly reliable (i.e., proteins predicted to be not in saliva), we have also trained a ranking method, aiming to rank the known salivary proteins from circulation as the highest among the proteins in the general background, based on the same features. This prediction capability can be used to predict potential biomarker proteins for specific human diseases when coupled with the information of differentially expressed proteins in diseased versus healthy control tissues and a prediction capability for blood-secretory proteins. Using such integrated information, we predicted 31 candidate biomarker proteins in saliva for breast cancer. PMID:24324552
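
    A minimal sketch of the train-and-cross-validate step, using scikit-learn's support vector classifier on synthetic stand-ins for the physiochemical and sequence features, might look as follows; the actual feature set, kernel choice, and tuning of the study are not reproduced here.

        import numpy as np
        from sklearn.model_selection import cross_val_score
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 20))                 # hypothetical physiochemical/sequence features
        y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # toy labels: secreted into saliva or not

        clf = SVC(kernel="rbf", C=1.0)
        scores = cross_val_score(clf, X, y, cv=10)     # 10-fold cross-validation as in the abstract
        print("mean cross-validated accuracy:", round(scores.mean(), 3))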

  2. Predictive equations for lung volumes from computed tomography for size matching in pulmonary transplantation.

    Science.gov (United States)

    Konheim, Jeremy A; Kon, Zachary N; Pasrija, Chetan; Luo, Qingyang; Sanchez, Pablo G; Garcia, Jose P; Griffith, Bartley P; Jeudy, Jean

    2016-04-01

    Size matching for lung transplantation is widely accomplished using height comparisons between donors and recipients. This gross approximation allows for wide variation in lung size and, potentially, size mismatch. Three-dimensional computed tomography (3D-CT) volumetry comparisons could offer more accurate size matching. Although recipient CT scans are universally available, donor CT scans are rarely performed. Therefore, predicted donor lung volumes could be used for comparison to measured recipient lung volumes, but no such predictive equations exist. We aimed to use 3D-CT volumetry measurements from a normal patient population to generate equations for predicted total lung volume (pTLV), predicted right lung volume (pRLV), and predicted left lung volume (pLLV), for size-matching purposes. Chest CT scans of 400 normal patients were retrospectively evaluated. 3D-CT volumetry was performed to measure total lung volume, right lung volume, and left lung volume of each patient, and predictive equations were generated. The fitted model was tested in a separate group of 100 patients. The model was externally validated by comparison of total lung volume with total lung capacity from pulmonary function tests in a subset of those patients. Age, gender, height, and race were independent predictors of lung volume. In the test group, there were strong linear correlations between predicted and actual lung volumes measured by 3D-CT volumetry for pTLV (r = 0.72), pRLV (r = 0.72), and pLLV (r = 0.69). A strong linear correlation was also observed when comparing pTLV and total lung capacity (r = 0.82). We successfully created a predictive model for pTLV, pRLV, and pLLV. These may serve as reference standards and predict donor lung volume for size matching in lung transplantation. Copyright © 2016 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.
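
    A sketch of how such predictive equations can be fit is given below, using ordinary linear regression on demographic covariates. The data and coefficients are synthetic (and race, one of the study's predictors, is omitted for brevity); the published equations should be used for actual size matching.

```python
# Sketch of deriving a predictive equation for total lung volume from demographic
# covariates, in the spirit of the study above. Data and coefficients are synthetic.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 400
age = rng.uniform(20, 80, n)
male = rng.integers(0, 2, n)                    # 1 = male, 0 = female
height_cm = rng.normal(170, 10, n)
# Synthetic "measured" total lung volume (mL) with an arbitrary relationship + noise
tlv_ml = -4000 + 35 * height_cm + 600 * male - 5 * age + rng.normal(0, 300, n)

X = np.column_stack([age, male, height_cm])
model = LinearRegression().fit(X, tlv_ml)
print("intercept:", model.intercept_, "coefficients:", model.coef_)

# Predicted donor TLV for a hypothetical donor: age 45, male, 178 cm
print("pTLV:", model.predict([[45, 1, 178]])[0])
```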

  3. Metabolic flexibility of mitochondrial respiratory chain disorders predicted by computer modelling.

    Science.gov (United States)

    Zieliński, Łukasz P; Smith, Anthony C; Smith, Alexander G; Robinson, Alan J

    2016-11-01

    Mitochondrial respiratory chain dysfunction causes a variety of life-threatening diseases affecting about 1 in 4300 adults. These diseases are genetically heterogeneous, but have the same outcome: reduced activity of mitochondrial respiratory chain complexes causing decreased ATP production and potentially toxic accumulation of metabolites. The severity and tissue specificity of these effects vary between patients by unknown mechanisms, and treatment options are limited. So far most research has focused on the complexes themselves, and the impact on overall cellular metabolism is largely unclear. To illustrate how computer modelling can be used to better understand the potential impact of these disorders and inspire new research directions and treatments, we simulated them using a computer model of human cardiomyocyte mitochondrial metabolism containing over 300 characterised reactions and transport steps with experimental parameters taken from the literature. Overall, simulations were consistent with patient symptoms, supporting their biological and medical significance. These simulations predicted: complex I deficiencies could be compensated using multiple pathways; complex II deficiencies had less metabolic flexibility due to impacting both the TCA cycle and the respiratory chain; and complex III and IV deficiencies caused the greatest decreases in ATP production, with metabolic consequences that parallel hypoxia. Our study demonstrates how results from computer models can be compared to a clinical phenotype and used as a tool for hypothesis generation for subsequent experimental testing. These simulations can enhance understanding of dysfunctional mitochondrial metabolism and suggest new avenues for research into treatment of mitochondrial disease and other areas of mitochondrial dysfunction. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  4. A recurrent de novo mutation in KCNC1 causes progressive myoclonus epilepsy

    DEFF Research Database (Denmark)

    Muona, M.; Berkovic, S. F.; Dibbens, L. M.

    2015-01-01

    Progressive myoclonus epilepsies (PMEs) are a group of rare, inherited disorders manifesting with action myoclonus, tonic-clonic seizures and ataxia. We sequenced the exomes of 84 unrelated individuals with PME of unknown cause and molecularly solved 26 cases (31%). Remarkably, a recurrent de novo...

  5. A SA8000 e a responsabilidade social das empresas: a emergência de um novo paradigma?

    OpenAIRE

    Lopes, Ana Catarina Marques Figueiredo Caetano

    2004-01-01

    Master's in Development and International Cooperation. Corporate Social Responsibility is not a new topic. For a long time it was discussed and interpreted within the debate on the responsibilities that a company should assume beyond those it has towards its shareholders and those imposed by law. Today, we observe the emergence of a new paradigm based on the recognition that the private sector not only can, but must, do more to fight poverty, preserve the ...

  6. Computational predictions of damage propagation preceding dissection of ascending thoracic aortic aneurysms.

    Science.gov (United States)

    Mousavi, S Jamaleddin; Farzaneh, Solmaz; Avril, Stéphane

    2018-04-01

    Dissections of ascending thoracic aortic aneurysms (ATAAs) cause significant morbidity and mortality worldwide. They occur when a tear in the intima-media of the aorta permits the penetration of blood and the subsequent delamination and separation of the wall into 2 layers, forming a false channel. To predict computationally the risk of tear formation, stress analyses should be performed layer-specifically and should consider internal or residual stresses that exist in the tissue. In the present paper, we propose a novel layer-specific damage model based on the constrained mixture theory, which intrinsically takes into account these internal stresses and can appropriately predict tear formation. The model is implemented in the commercial finite-element software Abaqus coupled with a user material subroutine. Its capability is tested by applying it to the simulation of different exemplary situations, going from in vitro bulge inflation experiments on aortic samples to in vivo overpressurizing of patient-specific ATAAs. The simulations reveal that damage correctly starts from the intimal layer (luminal side) and propagates across the media as a tear but never hits the adventitia. This scenario is typically the first stage of development of an acute dissection, which is predicted by the model for pressures of about 2.5 times the diastolic pressure after calibrating the parameters against experiments performed on collected ATAA samples. Further validations on a larger cohort of patients should hopefully confirm the potential of the model in predicting patient-specific damage evolution and possible risk of dissection during aneurysm growth for clinical applications. Copyright © 2017 John Wiley & Sons, Ltd.

  7. Computational prediction of muon stopping sites using ab initio random structure searching (AIRSS)

    Science.gov (United States)

    Liborio, Leandro; Sturniolo, Simone; Jochym, Dominik

    2018-04-01

    The stopping site of the muon in a muon-spin relaxation experiment is in general unknown. There are some techniques that can be used to guess the muon stopping site, but they often rely on approximations and are not generally applicable to all cases. In this work, we propose a purely theoretical method to predict muon stopping sites in crystalline materials from first principles. The method is based on a combination of ab initio calculations, random structure searching, and machine learning, and it has successfully predicted the MuT and MuBC stopping sites of muonium in Si, diamond, and Ge, as well as the muonium stopping site in LiF, without any recourse to experimental results. The method makes use of Soprano, a publicly released Python library developed to aid ab initio computational crystallography, which contains all the software tools necessary to reproduce our analysis.

  8. Study on Strategic Planning of Road and Bridge Infrastructure Development in City Planning: Taking Porto-novo City of Benin Republic as Example

    Directory of Open Access Journals (Sweden)

    Boko-haya Dossa Didier

    2018-01-01

    Full Text Available Concern about small-town infrastructure construction in developing countries is a crucial part of town planning and development. Taking the overall planning and design of Porto-novo city in the Republic of Benin as a case study, this paper analyzes the characteristics and opportunities of Porto-novo city and puts forward a corresponding infrastructure construction strategy. Finally, the paper proposes a specific planning and design scheme in the context of Porto-novo's development strategy.

  9. Antimicrobial peptide capsids of de novo design.

    Science.gov (United States)

    De Santis, Emiliana; Alkassem, Hasan; Lamarre, Baptiste; Faruqui, Nilofar; Bella, Angelo; Noble, James E; Micale, Nicola; Ray, Santanu; Burns, Jonathan R; Yon, Alexander R; Hoogenboom, Bart W; Ryadnov, Maxim G

    2017-12-22

    The spread of bacterial resistance to antibiotics poses the need for antimicrobial discovery. With traditional search paradigms being exhausted, approaches that are altogether different from antibiotics may offer promising and creative solutions. Here, we introduce a de novo peptide topology that-by emulating the virus architecture-assembles into discrete antimicrobial capsids. Using the combination of high-resolution and real-time imaging, we demonstrate that these artificial capsids assemble as 20-nm hollow shells that attack bacterial membranes and upon landing on phospholipid bilayers instantaneously (seconds) convert into rapidly expanding pores causing membrane lysis (minutes). The designed capsids show broad antimicrobial activities, thus executing one primary function-they destroy bacteria on contact.

  10. Computer mouse use predicts acute pain but not prolonged or chronic pain in the neck and shoulder

    DEFF Research Database (Denmark)

    Andersen, Johan Hviid; Harhoff, Mette; Grimstrup, Søren

    2007-01-01

    BACKGROUND: Computer use is one of the commonest workplace exposures in modern society. An adverse effect on musculoskeletal outcomes has been claimed for decades, mainly on the basis of self-reports of exposure. The purpose of this study was to assess the risk of neck and shoulder pain associat...... psychosocial factors predicted the risk of prolonged pain. CONCLUSIONS: From the NUDATA-study we can conclude that most computer workers have no or minor neck and shoulder pain, few experience prolonged pain, and even fewer, chronic neck and shoulder pain....

  11. Predictability of bone density at posterior mandibular implant sites using cone-beam computed tomography intensity values.

    Science.gov (United States)

    Alkhader, Mustafa; Hudieb, Malik; Khader, Yousef

    2017-01-01

    The aim of this study was to investigate the predictability of bone density at posterior mandibular implant sites using cone-beam computed tomography (CBCT) intensity values. CBCT cross-sectional images for 436 posterior mandibular implant sites were selected for the study. Using Invivo software (Anatomage, San Jose, California, USA), two observers classified the bone density into three categories: low, intermediate, and high, and CBCT intensity values were generated. Based on the consensus of the two observers, 15.6% of sites were of low bone density, 47.9% were of intermediate density, and 36.5% were of high density. Receiver-operating characteristic analysis showed that CBCT intensity values had a high predictive power for predicting high density sites (area under the curve [AUC] =0.94, P < 0.005) and intermediate density sites (AUC = 0.81, P < 0.005). The best cut-off value for intensity to predict intermediate density sites was 218 (sensitivity = 0.77 and specificity = 0.76) and the best cut-off value for intensity to predict high density sites was 403 (sensitivity = 0.93 and specificity = 0.77). CBCT intensity values are considered useful for predicting bone density at posterior mandibular implant sites.
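
    The sketch below reproduces the general form of this analysis: computing the AUC for an intensity-based predictor and selecting a cutoff that balances sensitivity and specificity (Youden index). The simulated intensity distributions are illustrative only; 218 and 403 are the cutoffs reported by the study itself.

```python
# Sketch of the ROC analysis used above: AUC for CBCT intensity values as a
# predictor of a bone-density category, plus a Youden-index cutoff.
# The data are simulated, not the study's measurements.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(2)
# 1 = high-density site, 0 = otherwise (simulated intensity distributions)
y = np.concatenate([np.ones(160), np.zeros(280)])
intensity = np.concatenate([rng.normal(520, 90, 160), rng.normal(320, 90, 280)])

auc = roc_auc_score(y, intensity)
fpr, tpr, thresholds = roc_curve(y, intensity)
best = np.argmax(tpr - fpr)                      # Youden's J statistic
print("AUC:", auc, "best cutoff:", thresholds[best],
      "sensitivity:", tpr[best], "specificity:", 1 - fpr[best])
```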

  12. Impact of cone-beam computed tomography on implant planning and on prediction of implant size

    International Nuclear Information System (INIS)

    Pedroso, Ludmila Assuncao de Mello; Silva, Maria Alves Garcia Santos; Garcia, Robson Rodrigues; Leles, Jose Luiz Rodrigues; Leles, Claudio Rodrigues

    2013-01-01

    The aim was to investigate the impact of cone-beam computed tomography (CBCT) on implant planning and on prediction of final implant size. Consecutive patients referred for implant treatment were submitted to clinical examination, panoramic (PAN) radiography and a CBCT exam. Initial planning of implant length and width was assessed based on clinical and PAN exams, and final planning, on the CBCT exam to complement diagnosis. The actual dimensions of the implants placed during surgery were compared with those obtained during initial and final planning, using the McNemar test (p < 0.05). It was concluded that CBCT improves the ability to predict the actual implant length and reduces inaccuracy in surgical dental implant planning. (author)
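
    A McNemar test compares paired binary outcomes, such as whether the planned implant matched the implant actually placed before versus after the CBCT exam. A minimal sketch with hypothetical counts (not the study's data) follows.

```python
# Sketch of a McNemar test on paired planning outcomes. The 2x2 counts are
# hypothetical, not the study's data.
from statsmodels.stats.contingency_tables import mcnemar

# Rows: initial (PAN-based) planning matched / did not match the placed implant
# Cols: final (CBCT-based) planning matched / did not match the placed implant
table = [[50, 5],
         [20, 25]]
result = mcnemar(table, exact=True)
print("statistic:", result.statistic, "p-value:", result.pvalue)
```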

  13. Reducing usage of the computational resources by event driven approach to model predictive control

    Science.gov (United States)

    Misik, Stefan; Bradac, Zdenek; Cela, Arben

    2017-08-01

    This paper deals with real-time optimal control of dynamic systems while also considering the constraints to which these systems might be subject. The main objective of this work is to propose a simple modification of the existing Model Predictive Control approach to better suit the needs of computationally resource-constrained real-time systems. An example using a model of a mechanical system is presented and the performance of the proposed method is evaluated in a simulated environment.
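
    The toy example below illustrates the event-driven idea in general terms, not the authors' specific formulation: a finite-horizon input sequence for a double integrator is recomputed only when the measured state drifts from its model prediction by more than a threshold, so the optimization runs far less often than once per sample. The system, cost, and threshold are arbitrary assumptions.

```python
# Toy illustration (not the paper's method) of event-driven receding-horizon control:
# replan only when the prediction error exceeds a threshold, saving solver calls.
import numpy as np

dt, N = 0.1, 20
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.5 * dt**2], [dt]])

def plan(x0, n=N, r=0.1):
    """Least-squares finite-horizon plan driving the state toward the origin."""
    # Stacked predictions: X = F x0 + G U; minimize ||X||^2 + r ||U||^2
    F = np.vstack([np.linalg.matrix_power(A, k + 1) for k in range(n)])
    G = np.zeros((2 * n, n))
    for i in range(n):
        for j in range(i + 1):
            G[2 * i:2 * i + 2, j] = (np.linalg.matrix_power(A, i - j) @ B).ravel()
    H = np.vstack([G, np.sqrt(r) * np.eye(n)])
    b = np.concatenate([-(F @ x0), np.zeros(n)])
    U, *_ = np.linalg.lstsq(H, b, rcond=None)
    return U

rng = np.random.default_rng(3)
x = np.array([1.0, 0.0])
U, idx, x_pred, solves, threshold = plan(x), 0, x.copy(), 1, 0.05
for _ in range(60):
    u = U[idx] if idx < N else 0.0
    x = A @ x + B.ravel() * u + rng.normal(0, 0.005, 2)    # true system with noise
    x_pred = A @ x_pred + B.ravel() * u                    # model prediction
    idx += 1
    if np.linalg.norm(x - x_pred) > threshold or idx >= N:  # event-triggered replanning
        U, idx, x_pred, solves = plan(x), 0, x.copy(), solves + 1
print("plans computed:", solves, "final state:", x)
```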

  14. Sequencing and De Novo Assembly of the Toxicodendron radicans (Poison Ivy) Transcriptome.

    Science.gov (United States)

    Weisberg, Alexandra J; Kim, Gunjune; Westwood, James H; Jelesko, John G

    2017-11-10

    Contact with poison ivy plants is widely dreaded because they produce a natural product called urushiol that is responsible for allergenic contact delayed-dermatitis symptoms lasting for weeks. For this reason, the catchphrase most associated with poison ivy is "leaves of three, let it be", which serves the purpose of both identification and an appeal for avoidance. Ironically, despite this notoriety, there is a dearth of specific knowledge about nearly all other aspects of poison ivy physiology and ecology. As a means of gaining a more molecular-oriented understanding of poison ivy physiology and ecology, Next Generation DNA sequencing technology was used to develop poison ivy root and leaf RNA-seq transcriptome resources. De novo assembled transcriptomes were analyzed to generate a core set of high-quality expressed transcripts present in poison ivy tissue. The predicted protein sequences were evaluated for similarity to SwissProt homologs and InterProScan domains, as well as assigned both GO terms and KEGG annotations. Over 23,000 simple sequence repeats were identified in the transcriptome, and corresponding oligonucleotide primer pairs were designed. A pan-transcriptome analysis of existing Anacardiaceae transcriptomes revealed conserved and unique transcripts among these species.

  15. Resveratrol induces growth inhibition and apoptosis in metastatic breast cancer cells via de novo ceramide signaling.

    Science.gov (United States)

    Scarlatti, Francesca; Sala, Giusy; Somenzi, Giulia; Signorelli, Paola; Sacchi, Nicoletta; Ghidoni, Riccardo

    2003-12-01

    Resveratrol (3,4',5-trans-trihydroxystilbene), a phytoalexin present in grapes and red wine, is emerging as a natural compound with potential anticancer properties. Here we show that resveratrol can induce growth inhibition and apoptosis in MDA-MB-231, a highly invasive and metastatic breast cancer cell line, in concomitance with a dramatic endogenous increase of growth inhibitory/proapoptotic ceramide. We found that accumulation of ceramide derives from both de novo ceramide synthesis and sphingomyelin hydrolysis. More specifically, we demonstrated that ceramide accumulation induced by resveratrol can be traced to the activation of serine palmitoyltransferase (SPT), the key enzyme of the de novo ceramide biosynthetic pathway, and neutral sphingomyelinase (nSMase), a main enzyme involved in the sphingomyelin/ceramide pathway. However, by using specific inhibitors of SPT, myriocin and L-cycloserine, and of nSMase, glutathione and manumycin, we found that only the SPT inhibitors could counteract the biological effects induced by resveratrol. Thus, resveratrol seems to exert its growth inhibitory/apoptotic effect on the metastatic breast cancer cell line MDA-MB-231 by activating the de novo ceramide synthesis pathway.

  16. A Swedish family with de novo alpha-synuclein A53T mutation: evidence for early cortical dysfunction

    DEFF Research Database (Denmark)

    Puschmann, Andreas; Ross, Owen A; Vilariño-Güell, Carles

    2009-01-01

    A de novo alpha-synuclein A53T (p.Ala53Thr; c.209G > A) mutation has been identified in a Swedish family with autosomal dominant Parkinson's disease (PD). Two affected individuals had early-onset (before 31 and 40 years), severe levodopa-responsive PD with prominent dysphasia, dysarthria, and cog......) and the Greek-American Family H kindreds. One unaffected family member carried the mutation haplotype without the c.209A mutation, strongly suggesting its de novo occurrence within this family. Furthermore, a novel mutation c.488G > A (p.Arg163His; R163H) in the presenilin-2 (PSEN2) gene was detected...

  17. Evaluation of effect of Ophiostoma novo-ulmi on four major wood ...

    African Journals Online (AJOL)

    Evaluation of effect of Ophiostoma novo-ulmi on four major wood species of the elm family in Rasht (North West of Iran) ... the diameter size of vessels and the number of xylary rays in four species: Ulmus carpinifolia, Ulmus glabra, Zelkova carpinifolia and Celtis australis as important factors in host resistance to elm disease.

  18. Energy Consumption and Indoor Environment Predicted by a Combination of Computational Fluid Dynamics and Building Energy Performance Simulation

    DEFF Research Database (Denmark)

    Nielsen, Peter Vilhelm

    2003-01-01

    An interconnection between a building energy performance simulation program and a Computational Fluid Dynamics (CFD) program for room air distribution is introduced to improve the predictions of both the energy consumption and the indoor environment. The article describes a calculation...

  19. High Incidence of De Novo and Subclinical Atrial Fibrillation in Patients With Hypertrophic Cardiomyopathy and Cardiac Rhythm Management Device.

    Science.gov (United States)

    Wilke, Iris; Witzel, Katrin; Münch, Julia; Pecha, Simon; Blankenberg, Stephan; Reichenspurner, Hermann; Willems, Stephan; Patten, Monica; Aydin, Ali

    2016-07-01

    Atrial fibrillation (AF) is an important prognostic parameter in patients with hypertrophic cardiomyopathy (HCM). Though cardiac rhythm management (CRM) devices (e.g., ICD, pacemaker or implantable loop recorder) can detect subclinical AF, data describing the incidence of AF are rare. We therefore investigated the incidence and clinical impact of de novo and subclinical AF detected by CRM devices in patients with HCM. In our retrospective single-center study, we included patients with HCM and need for CRM devices. The primary endpoint of the study was the incidence of clinical and subclinical de novo AF. During follow-up, patients were screened for adverse events like stroke, ventricular arrhythmia, heart failure, or death. From 192 HCM patients, 44 patients received a CRM device (38 ICDs, 5 pacemakers, 1 implantable loop recorder). In 14 of these patients (32%), AF had been documented before device implantation. Thirty (68%) patients were free from AF at the time of implantation. During a median follow-up of 595 days (interquartile range, 367-890 days), de novo AF was recorded in 16 of these 30 patients (53%). Fourteen (88%) of the 16 patients with de novo AF were free from any clinical symptoms, so these patients were classified to have subclinical AF. In logistic regression analysis, age was the only significant predictor for an increased risk of AF. AF is common in patients with HCM who need a CRM device. More than 50% of these patients develop de novo AF that was predominantly subclinical in our cohort. © 2016 Wiley Periodicals, Inc.

  20. Whole Exome Sequencing for a Patient with Rubinstein-Taybi Syndrome Reveals de Novo Variants besides an Overt CREBBP Mutation

    Directory of Open Access Journals (Sweden)

    Hee Jeong Yoo

    2015-03-01

    Full Text Available Rubinstein-Taybi syndrome (RSTS) is a rare condition with a prevalence of 1 in 125,000–720,000 births and characterized by clinical features that include facial, dental, and limb dysmorphology and growth retardation. Most cases of RSTS occur sporadically and are caused by de novo mutations. Cytogenetic or molecular abnormalities are detected in only 55% of RSTS cases. Previous genetic studies have yielded inconsistent results due to the variety of methods used for genetic analysis. The purpose of this study was to use whole exome sequencing (WES) to evaluate the genetic causes of RSTS in a young girl presenting with an Autism phenotype. We used the Autism diagnostic observation schedule (ADOS) and Autism diagnostic interview revised (ADI-R) to confirm her diagnosis of Autism. In addition, various questionnaires were used to evaluate other psychiatric features. We used WES to analyze the DNA sequences of the patient and her parents and to search for de novo variants. The patient showed all the typical features of Autism; WES revealed a de novo frameshift mutation in CREBBP and de novo sequence variants in the TNC and IGFALS genes. Mutations in the CREBBP gene have been extensively reported in RSTS patients, while potential missense mutations in TNC and IGFALS genes have not previously been associated with RSTS. The TNC and IGFALS genes are involved in central nervous system development and growth. It is possible for patients with RSTS to have additional de novo variants that could account for previously unexplained phenotypes.

  1. Analysis of pyrimidine synthesis "de novo" intermediates in urine and dried urine filter- paper strips with HPLC-electrospray tandem mass spectrometry

    NARCIS (Netherlands)

    van Kuilenburg, André B. P.; van Lenthe, Henk; Löffler, Monika; van Gennip, Albert H.

    2004-01-01

    BACKGROUND: The concentrations of the pyrimidine "de novo" metabolites and their degradation products in urine are useful indicators for the diagnosis of an inborn error of the pyrimidine de novo pathway or a urea-cycle defect. Until now, no procedure was available that allowed the analysis of all

  2. Clinical correlation of the binding potential with {sup 123}I-FP-CIT in de novo idiopathic Parkinson's disease patients

    Energy Technology Data Exchange (ETDEWEB)

    Berti, Valentina; Pupi, Alberto; Vanzi, Eleonora; Cristofaro, Maria Teresa de; Pellicano, Giannantonio; Mungai, Francesco [University of Florence, Clinical Pathophysiology, Florence (Italy); Ramat, Silvia; Marini, Paolo; Sorbi, Sandro [University of Florence, Neurological and Psychiatric Sciences, Florence (Italy)

    2008-12-15

    The aim of this study was to evaluate the accuracy of different single-photon emission computed tomography (SPECT) reconstruction techniques in measuring striatal N-{omega}-fluoropropyl-2{beta}-carbomethoxy-3{beta}-4-[{sup 123}I]iodophenyl-nortropane ({sup 123}I-FP-CIT) binding in de novo Parkinson's disease (PD) patients, in order to find a correlation with clinical scales of disease severity in the initial phases of disease. Thirty-six de novo PD patients underwent {sup 123}I-FP-CIT SPECT and MRI scan. SPECT data were reconstructed with filtered back projection (FBP), with an iterative algorithm (ordered subset expectation maximization, OSEM) and with a method previously developed in our institution, called the least-squares (LS) method. The ratio of specific to non-specific striatal {sup 123}I-FP-CIT binding (binding potential, BP) was used as the outcome measure with all the reconstruction methods (BP{sub FBP}, BP{sub OSEM}, BP{sub LS}). The range of values of striatal BP{sub LS} was significantly greater than BP{sub FBP} and BP{sub OSEM}. For all striatal regions, estimates of BP{sub FBP} correlated well with BP{sub OSEM} (r=0.84) and with BP{sub LS} (r=0.64); BP{sub OSEM} correlated significantly with BP{sub LS} (r=0.76). A good correlation was found between putaminal BP{sub LS} and Hoehn and Yahr, Unified PD Rating Scale (UPDRS) and lateralized UPDRS motor scores (r=-0.46, r=-0.42, r=-0.39, respectively). Neither putaminal BP{sub FBP} nor putaminal BP{sub OSEM} correlated with any of these motor scores. In de novo PD patients, {sup 123}I-FP-CIT BP values derived from FBP and OSEM reconstruction techniques do not permit differentiation of PD severity. The LS method instead finds a correlation between striatal BP and disease severity scores. The results of this study support the use of {sup 123}I-FP-CIT BP values estimated with the LS method as a biomarker of PD severity. (orig.)

  3. De novo formation of centrosomes in vertebrate cells arrested during S phase

    NARCIS (Netherlands)

    Khodjakov, A; Rieder, CL; Sluder, G; Cassels, G; Sibon, O; Wang, CL

    2002-01-01

    The centrosome usually replicates in a semiconservative fashion, i.e., new centrioles form in association with preexisting "maternal" centrioles. De novo formation of centrioles has been reported for a few highly specialized cell types but it has not been seen in vertebrate somatic cells. We find

  4. De novo mutations of GCK, HNF1A and HNF4A may be more frequent in MODY than previously assumed.

    Science.gov (United States)

    Stanik, Juraj; Dusatkova, Petra; Cinek, Ondrej; Valentinova, Lucia; Huckova, Miroslava; Skopkova, Martina; Dusatkova, Lenka; Stanikova, Daniela; Pura, Mikulas; Klimes, Iwar; Lebl, Jan; Gasperikova, Daniela; Pruhova, Stepanka

    2014-03-01

    MODY is mainly characterised by an early onset of diabetes and a positive family history of diabetes with an autosomal dominant mode of inheritance. However, de novo mutations have been reported anecdotally. The aim of this study was to systematically revisit a large collection of MODY patients to determine the minimum prevalence of de novo mutations in the most prevalent MODY genes (i.e. GCK, HNF1A, HNF4A). Analysis of 922 patients from two national MODY centres (Slovakia and the Czech Republic) identified 150 probands (16%) who came from pedigrees that did not fulfil the criterion of two generations with diabetes but did fulfil the remaining criteria. The GCK, HNF1A and HNF4A genes were analysed by direct sequencing. Mutations in GCK, HNF1A or HNF4A genes were detected in 58 of 150 individuals. Parents of 28 probands were unavailable for further analysis, and in 19 probands the mutation was inherited from an asymptomatic parent. In 11 probands the mutations arose de novo. In our cohort of MODY patients from two national centres the de novo mutations in GCK, HNF1A and HNF4A were present in 7.3% of the 150 families without a history of diabetes and 1.2% of all of the referrals for MODY testing. This is the largest collection of de novo MODY mutations to date, and our findings indicate a much higher frequency of de novo mutations than previously assumed. Therefore, genetic testing of MODY could be considered for carefully selected individuals without a family history of diabetes.
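
    The quoted prevalences follow directly from the reported counts (11 de novo mutations among 150 probands without a family history, out of 922 referrals), as the quick check below shows.

```python
# Quick check of the prevalence figures quoted above from the reported counts.
probands_without_family_history = 150
all_referrals = 922
de_novo_cases = 11

print(f"{de_novo_cases / probands_without_family_history:.1%} of families without a history")  # ~7.3%
print(f"{de_novo_cases / all_referrals:.1%} of all referrals for MODY testing")                 # ~1.2%
```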

  5. A rice gene of de novo origin negatively regulates pathogen-induced defense response.

    Directory of Open Access Journals (Sweden)

    Wenfei Xiao

    Full Text Available How defense genes originated with the evolution of their specific pathogen-responsive traits remains an important problem. It is generally known that a form of duplication can generate new genes, suggesting that a new gene usually evolves from an ancestral gene. However, we show that a new defense gene in plants may evolve by de novo origination, resulting in sophisticated disease-resistant functions in rice. Analyses of gene evolution showed that this new gene, OsDR10, had homologs only in the closest relative, the Leersia genus, but not in other subfamilies of the grass family; therefore, it is a rice tribe-specific gene that may have originated de novo in the tribe. We further show that this gene may have evolved a highly conserved rice-specific function that contributes to the regulation difference between rice and other plant species in response to pathogen infections. Biologic analyses including gene silencing, pathologic analysis, and mutant characterization by transformation showed that OsDR10-suppressed plants exhibited enhanced resistance to a broad spectrum of Xanthomonas oryzae pv. oryzae strains, which cause bacterial blight disease. This enhanced disease resistance was accompanied by increased accumulation of endogenous salicylic acid (SA) and suppressed accumulation of endogenous jasmonic acid (JA), as well as modified expression of a subset of defense-responsive genes functioning both upstream and downstream of SA and JA. These data and analyses provide fresh insights into the biologic and evolutionary processes of a rapidly recruited de novo gene.

  6. Prediction of Elastic Constants of the Fuzzy Fibre Reinforced Polymer Using Computational Micromechanics

    Science.gov (United States)

    Pawlik, Marzena; Lu, Yiling

    2018-05-01

    Computational micromechanics is a useful tool to predict properties of carbon fibre reinforced polymers. In this paper, a representative volume element (RVE) is used to investigate a fuzzy fibre reinforced polymer. The fuzzy fibre results from the introduction of nanofillers on the fibre surface. The composite being studied contains three phases, namely: the T650 carbon fibre, the interphase reinforced with carbon nanotubes (CNTs), and the epoxy resin EPIKOTE 862. CNTs are radially grown on the surface of the carbon fibre, and thus the resultant interphase composed of nanotubes and matrix is transversely isotropic. The transversely isotropic properties of the interphase are numerically implemented in the ANSYS FEM software using the element orientation command. The obtained numerical predictions are compared with the available analytical models. It is found that the CNT interphase significantly increased the transverse mechanical properties of the fuzzy fibre reinforced polymer. The extent of this enhancement changes monotonically with the carbon fibre volume fraction. This RVE model enables investigation of different CNT orientations in the fuzzy fibre model.
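
    As a rough point of comparison with such RVE predictions, classical rule-of-mixtures estimates for a simple two-phase fibre/matrix composite can be computed in a few lines. The sketch below is only this baseline, not the three-phase fuzzy-fibre finite-element model of the paper, and the property values are illustrative assumptions.

```python
# Back-of-the-envelope micromechanics (Voigt/Reuss rules of mixtures) for a two-phase
# fibre/matrix composite. Property values are illustrative assumptions only.
Ef_longitudinal = 276.0   # GPa, assumed axial modulus of the carbon fibre
Ef_transverse = 15.0      # GPa, assumed transverse modulus of the carbon fibre
Em = 3.0                  # GPa, assumed modulus of the epoxy matrix
Vf = 0.6                  # fibre volume fraction

# Longitudinal modulus: Voigt (iso-strain) rule of mixtures
E1 = Vf * Ef_longitudinal + (1 - Vf) * Em

# Transverse modulus: Reuss (iso-stress) inverse rule of mixtures
E2 = 1.0 / (Vf / Ef_transverse + (1 - Vf) / Em)

print(f"E1 ~ {E1:.1f} GPa, E2 ~ {E2:.1f} GPa")
```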

  7. A rare case of de novo gigantic ovarian abscess within an endometrioma.

    Science.gov (United States)

    Hameed, Aisha; Mehta, Vaishali; Sinha, Prabha

    2010-06-01

    We report a rare case of de novo ovarian abscess within an endometrioma. An ovarian abscess within an endometrioma is a rare gynecological problem, but a de novo abscess in an endometrioma is even rarer. Most ovarian abscesses develop in endometriomas following interventions, e.g., aspiration, pelvic surgery, and oocyte retrieval. We present a case of a spontaneous giant abscess in a large ovarian cyst in a nulliparous woman who presented with an acute abdomen. The patient was treated in a district general hospital with a multidisciplinary approach. Thirteen liters of pus were drained. She underwent a subtotal (supracervical) hysterectomy and bilateral salpingo-oophorectomy (BSO). Histology of the abscess wall confirmed the endometriotic nature of the cyst. The patient made an uneventful recovery and was discharged home on the 14th postoperative day. This case highlights that an endometrioma and its complications can present as a surgical emergency and should be managed as one.

  8. Distilled single-cell genome sequencing and de novo assembly for sparse microbial communities.

    Science.gov (United States)

    Taghavi, Zeinab; Movahedi, Narjes S; Draghici, Sorin; Chitsaz, Hamidreza

    2013-10-01

    Identification of every single genome present in a microbial sample is an important and challenging task with crucial applications. It is challenging because there are typically millions of cells in a microbial sample, the vast majority of which elude cultivation. The most accurate method to date is exhaustive single-cell sequencing using multiple displacement amplification, which is simply intractable for a large number of cells. However, there is hope for breaking this barrier, as the number of different cell types with distinct genome sequences is usually much smaller than the number of cells. Here, we present a novel divide and conquer method to sequence and de novo assemble all distinct genomes present in a microbial sample with a sequencing cost and computational complexity proportional to the number of genome types, rather than the number of cells. The method is implemented in a tool called Squeezambler. We evaluated Squeezambler on simulated data. The proposed divide and conquer method successfully reduces the cost of sequencing in comparison with the naïve exhaustive approach. Squeezambler and datasets are available at http://compbio.cs.wayne.edu/software/squeezambler/.

  9. Para além do lumpen-indigenismo: novos aspectos informacionais da política indigenista brasileira

    Directory of Open Access Journals (Sweden)

    Rodrigo Piquet Saboia de Mello

    2017-06-01

    Full Text Available This article proposes a rereading of the article entitled "O lumpen-indigenismo do estado brasileiro" by the anthropologist Jorge Pozzobon, with the aim of reworking some of the author's earlier concepts and proposals, as well as building a bridge to the new informational phenomena arising from the current political-legal framework and from the strengthening of indigenous protagonism in the construction of new models of participation and management of indigenous information.

  10. Computational tools for experimental determination and theoretical prediction of protein structure

    Energy Technology Data Exchange (ETDEWEB)

    O`Donoghue, S.; Rost, B.

    1995-12-31

    This tutorial was one of eight tutorials selected to be presented at the Third International Conference on Intelligent Systems for Molecular Biology, which was held in the United Kingdom from July 16 to 19, 1995. The authors intend to review the state of the art in the experimental determination of protein 3D structure (focus on nuclear magnetic resonance), and in the theoretical prediction of protein function and of protein structure in 1D, 2D and 3D from sequence. All the atomic resolution structures determined so far have been derived from either X-ray crystallography (the majority so far) or Nuclear Magnetic Resonance (NMR) Spectroscopy (becoming increasingly more important). The authors briefly describe the physical methods behind both of these techniques; the major computational methods involved will be covered in some detail. They highlight parallels and differences between the methods, and also the current limitations. Special emphasis will be given to techniques which have application to ab initio structure prediction. Large-scale sequencing techniques increase the gap between the number of known protein sequences and that of known protein structures. They describe the scope and principles of methods that contribute successfully to closing that gap. Emphasis will be given to the specification of adequate testing procedures to validate such methods.

  11. Functional characterization of a rice de novo DNA methyltransferase, OsDRM2, expressed in Escherichia coli and yeast

    Energy Technology Data Exchange (ETDEWEB)

    Pang, Jinsong, E-mail: pangjs542@nenu.edu.cn [Key Laboratory of Molecular Epigenetics of the Ministry of Education, Northeast Normal University, Changchun, Jilin 130024 (China); Dong, Mingyue; Li, Ning; Zhao, Yanli [Key Laboratory of Molecular Epigenetics of the Ministry of Education, Northeast Normal University, Changchun, Jilin 130024 (China); Liu, Bao, E-mail: baoliu@nenu.edu.cn [Key Laboratory of Molecular Epigenetics of the Ministry of Education, Northeast Normal University, Changchun, Jilin 130024 (China)

    2013-03-01

    Highlights: ► A rice de novo DNA methyltransferase OsDRM2 was cloned. ► In vitro methylation activity of OsDRM2 was characterized with Escherichia coli. ► Assays of OsDRM2 in vivo methylation were done with Saccharomyces cerevisiae. ► OsDRM2 methylation activity is not preferential to any type of cytosine context. ► The activity of OsDRM2 is independent of RdDM pathway. - Abstract: DNA methylation of cytosine nucleotides is an important epigenetic modification that occurs in most eukaryotic organisms and is established and maintained by various DNA methyltransferases together with their co-factors. There are two major categories of DNA methyltransferases: de novo and maintenance. Here, we report the isolation and functional characterization of a de novo methyltransferase, named OsDRM2, from rice (Oryza sativa L.). The full-length coding region of OsDRM2 was cloned and transformed into Escherichia coli and Saccharomyces cerevisiae. Both of these organisms expressed the OsDRM2 protein, which exhibited stochastic de novo methylation activity in vitro at CG, CHG, and CHH di- and tri-nucleotide patterns. Two lines of evidence demonstrated the de novo activity of OsDRM2: (1) a 5′-CCGG-3′ containing DNA fragment that had been pre-treated with OsDRM2 protein expressed in E. coli was protected from digestion by the CG-methylation-sensitive isoschizomer HpaII; (2) methylation-sensitive amplified polymorphism (MSAP) analysis of S. cerevisiae genomic DNA from transformants that had been introduced with OsDRM2 revealed CG and CHG methylation levels of 3.92–9.12%, and 2.88–6.93%, respectively, whereas the mock control S. cerevisiae DNA did not exhibit cytosine methylation. These results were further supported by bisulfite sequencing of the 18S rRNA and EAF5 genes of the transformed S. cerevisiae, which exhibited different DNA methylation patterns, which were observed in the genomic DNA. Our findings establish that OsDRM2 is an active de novo DNA

  12. Functional characterization of a rice de novo DNA methyltransferase, OsDRM2, expressed in Escherichia coli and yeast

    International Nuclear Information System (INIS)

    Pang, Jinsong; Dong, Mingyue; Li, Ning; Zhao, Yanli; Liu, Bao

    2013-01-01

    Highlights: ► A rice de novo DNA methyltransferase OsDRM2 was cloned. ► In vitro methylation activity of OsDRM2 was characterized with Escherichia coli. ► Assays of OsDRM2 in vivo methylation were done with Saccharomyces cerevisiae. ► OsDRM2 methylation activity is not preferential to any type of cytosine context. ► The activity of OsDRM2 is independent of RdDM pathway. - Abstract: DNA methylation of cytosine nucleotides is an important epigenetic modification that occurs in most eukaryotic organisms and is established and maintained by various DNA methyltransferases together with their co-factors. There are two major categories of DNA methyltransferases: de novo and maintenance. Here, we report the isolation and functional characterization of a de novo methyltransferase, named OsDRM2, from rice (Oryza sativa L.). The full-length coding region of OsDRM2 was cloned and transformed into Escherichia coli and Saccharomyces cerevisiae. Both of these organisms expressed the OsDRM2 protein, which exhibited stochastic de novo methylation activity in vitro at CG, CHG, and CHH di- and tri-nucleotide patterns. Two lines of evidence demonstrated the de novo activity of OsDRM2: (1) a 5′-CCGG-3′ containing DNA fragment that had been pre-treated with OsDRM2 protein expressed in E. coli was protected from digestion by the CG-methylation-sensitive isoschizomer HpaII; (2) methylation-sensitive amplified polymorphism (MSAP) analysis of S. cerevisiae genomic DNA from transformants that had been introduced with OsDRM2 revealed CG and CHG methylation levels of 3.92–9.12%, and 2.88–6.93%, respectively, whereas the mock control S. cerevisiae DNA did not exhibit cytosine methylation. These results were further supported by bisulfite sequencing of the 18S rRNA and EAF5 genes of the transformed S. cerevisiae, which exhibited different DNA methylation patterns, which were observed in the genomic DNA. Our findings establish that OsDRM2 is an active de novo DNA

  13. Predictive Models and Computational Embryology

    Science.gov (United States)

    EPA’s ‘virtual embryo’ project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...

  14. Integration of CpG-free DNA induces de novo methylation of CpG islands in pluripotent stem cells

    KAUST Repository

    Takahashi, Yuta

    2017-05-05

    CpG islands (CGIs) are primarily promoter-associated genomic regions and are mostly unmethylated within highly methylated mammalian genomes. The mechanisms by which CGIs are protected from de novo methylation remain elusive. Here we show that insertion of CpG-free DNA into targeted CGIs induces de novo methylation of the entire CGI in human pluripotent stem cells (PSCs). The methylation status is stably maintained even after CpG-free DNA removal, extensive passaging, and differentiation. By targeting the DNA mismatch repair gene MLH1 CGI, we could generate a PSC model of a cancer-related epimutation. Furthermore, we successfully corrected aberrant imprinting in induced PSCs derived from an Angelman syndrome patient. Our results provide insights into how CpG-free DNA induces de novo CGI methylation and broaden the application of targeted epigenome editing for a better understanding of human development and disease.

  15. Osteoporosis prediction from the mandible using cone-beam computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Barngkgei, Imad; Al Haffar, Iyad [Dept. of Oral Medicine, Faculty of Dentistry, Damascus University, Damascus (Syrian Arab Republic); Khattab, Razan [Dept. of Periodontology, Faculty of Dentistry, Damascus University, Damascus (Syrian Arab Republic)

    2014-12-15

    This study aimed to evaluate the use of dental cone-beam computed tomography (CBCT) in the diagnosis of osteoporosis among menopausal and postmenopausal women by using only a CBCT viewer program. Thirty-eight menopausal and postmenopausal women who underwent dual-energy X-ray absorptiometry (DXA) examination for the hip and lumbar vertebrae were scanned using CBCT (field of view: 13 cm x 15 cm; voxel size: 0.25 mm). Slices from the body of the mandible as well as the ramus were selected, and some CBCT-derived variables, such as radiographic density (RD), were calculated as gray values. Pearson's correlation, one-way analysis of variance (ANOVA), and accuracy (sensitivity and specificity) evaluation based on linear and logistic regression were performed to choose the variable that best correlated with the lumbar and femoral neck T-scores. The RD of the whole bone area of the mandible was the variable that best correlated with and predicted both the femoral neck and the lumbar vertebrae T-scores; further, Pearson's correlation coefficients were 0.5/0.6 (p value = 0.037/0.009). The sensitivity, specificity, and accuracy based on the logistic regression were 50%, 88.9%, and 78.4%, respectively, for the femoral neck, and 46.2%, 91.3%, and 75%, respectively, for the lumbar vertebrae. Lumbar vertebrae and femoral neck osteoporosis can be predicted with high accuracy from the RD value of the body of the mandible by using a CBCT viewer program.
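
    The prediction step described above amounts to a logistic regression of osteoporosis status on the mandibular RD value, summarized by sensitivity, specificity, and accuracy; a sketch with simulated data (not the study's measurements) follows.

```python
# Sketch of the logistic-regression step: predicting osteoporosis status (DXA-based)
# from the mandibular radiographic density (RD) on CBCT. Data here are simulated.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix, accuracy_score

rng = np.random.default_rng(4)
# 1 = osteoporotic by DXA, 0 = not; RD in arbitrary gray-value units (simulated)
y = np.concatenate([np.ones(13), np.zeros(25)])
rd = np.concatenate([rng.normal(350, 80, 13), rng.normal(520, 80, 25)]).reshape(-1, 1)

model = LogisticRegression().fit(rd, y)
y_hat = model.predict(rd)
tn, fp, fn, tp = confusion_matrix(y, y_hat).ravel()
print("sensitivity:", tp / (tp + fn),
      "specificity:", tn / (tn + fp),
      "accuracy:", accuracy_score(y, y_hat))
```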

  16. Osteoporosis prediction from the mandible using cone-beam computed tomography

    International Nuclear Information System (INIS)

    Barngkgei, Imad; Al Haffar, Iyad; Khattab, Razan

    2014-01-01

    This study aimed to evaluate the use of dental cone-beam computed tomography (CBCT) in the diagnosis of osteoporosis among menopausal and postmenopausal women by using only a CBCT viewer program. Thirty-eight menopausal and postmenopausal women who underwent dual-energy X-ray absorptiometry (DXA) examination for the hip and lumbar vertebrae were scanned using CBCT (field of view: 13 cm x 15 cm; voxel size: 0.25 mm). Slices from the body of the mandible as well as the ramus were selected, and some CBCT-derived variables, such as radiographic density (RD), were calculated as gray values. Pearson's correlation, one-way analysis of variance (ANOVA), and accuracy (sensitivity and specificity) evaluation based on linear and logistic regression were performed to choose the variable that best correlated with the lumbar and femoral neck T-scores. The RD of the whole bone area of the mandible was the variable that best correlated with and predicted both the femoral neck and the lumbar vertebrae T-scores; further, Pearson's correlation coefficients were 0.5/0.6 (p value = 0.037/0.009). The sensitivity, specificity, and accuracy based on the logistic regression were 50%, 88.9%, and 78.4%, respectively, for the femoral neck, and 46.2%, 91.3%, and 75%, respectively, for the lumbar vertebrae. Lumbar vertebrae and femoral neck osteoporosis can be predicted with high accuracy from the RD value of the body of the mandible by using a CBCT viewer program.

  17. Computational methods for constructing protein structure models from 3D electron microscopy maps.

    Science.gov (United States)

    Esquivel-Rodríguez, Juan; Kihara, Daisuke

    2013-10-01

    Protein structure determination by cryo-electron microscopy (EM) has made significant progress in the past decades. Resolutions of EM maps have been improving as evidenced by recently reported structures that are solved at high resolutions close to 3Å. Computational methods play a key role in interpreting EM data. Among many computational procedures applied to an EM map to obtain protein structure information, in this article we focus on reviewing computational methods that model protein three-dimensional (3D) structures from a 3D EM density map that is constructed from two-dimensional (2D) maps. The computational methods we discuss range from de novo methods, which identify structural elements in an EM map, to structure fitting methods, where known high resolution structures are fit into a low-resolution EM map. A list of available computational tools is also provided. Copyright © 2013 Elsevier Inc. All rights reserved.

  18. Computational Prediction of Atomic Structures of Helical Membrane Proteins Aided by EM Maps

    Science.gov (United States)

    Kovacs, Julio A.; Yeager, Mark; Abagyan, Ruben

    2007-01-01

    Integral membrane proteins pose a major challenge for protein-structure prediction because only ≈100 high-resolution structures are available currently, thereby impeding the development of rules or empirical potentials to predict the packing of transmembrane α-helices. However, when an intermediate-resolution electron microscopy (EM) map is available, it can be used to provide restraints which, in combination with a suitable computational protocol, make structure prediction feasible. In this work we present such a protocol, which proceeds in three stages: 1), generation of an ensemble of α-helices by flexible fitting into each of the density rods in the low-resolution EM map, spanning a range of rotational angles around the main helical axes and translational shifts along the density rods; 2), fast optimization of side chains and scoring of the resulting conformations; and 3), refinement of the lowest-scoring conformations with internal coordinate mechanics, by optimizing the van der Waals, electrostatics, hydrogen bonding, torsional, and solvation energy contributions. In addition, our method implements a penalty term through a so-called tethering map, derived from the EM map, which restrains the positions of the α-helices. The protocol was validated on three test cases: GpA, KcsA, and MscL. PMID:17496035

  19. Autism Spectrum Disorder in a Girl with a De Novo X;19 Balanced Translocation

    Science.gov (United States)

    Baruffi, Marcelo Razera; de Souza, Deise Helena; Bicudo da Silva, Rosana Aparecida; Ramos, Ester Silveira; Moretti-Ferreira, Danilo

    2012-01-01

    Balanced X-autosome translocations are rare, and female carriers are a clinically heterogeneous group of patients, with phenotypically normal women, history of recurrent miscarriage, gonadal dysfunction, X-linked disorders or congenital abnormalities, and/or developmental delay. We investigated a patient with a de novo X;19 translocation. The six-year-old girl has been evaluated due to hyperactivity, social interaction impairment, stereotypic and repetitive use of language with echolalia, failure to follow parents/caretakers orders, inconsolable outbursts, and persistent preoccupation with parts of objects. The girl has normal cognitive function. Her measurements are within normal range, and no other abnormalities were found during physical, neurological, or dysmorphological examinations. Conventional cytogenetic analysis showed a de novo balanced translocation, with the karyotype 46,X,t(X;19)(p21.2;q13.4). Replication banding showed a clear preference for inactivation of the normal X chromosome. The translocation was confirmed by FISH and Spectral Karyotyping (SKY). Although abnormal phenotypes associated with de novo balanced chromosomal rearrangements may be the result of disruption of a gene at one of the breakpoints, submicroscopic deletion or duplication, or a position effect, X; autosomal translocations are associated with additional unique risk factors including X-linked disorders, functional autosomal monosomy, or functional X chromosome disomy resulting from the complex X-inactivation process. PMID:23074688

  20. Autism Spectrum Disorder in a Girl with a De Novo X;19 Balanced Translocation

    Directory of Open Access Journals (Sweden)

    Marcelo Razera Baruffi

    2012-01-01

    Full Text Available Balanced X-autosome translocations are rare, and female carriers are a clinically heterogeneous group of patients, with phenotypically normal women, history of recurrent miscarriage, gonadal dysfunction, X-linked disorders or congenital abnormalities, and/or developmental delay. We investigated a patient with a de novo X;19 translocation. The six-year-old girl has been evaluated due to hyperactivity, social interaction impairment, stereotypic and repetitive use of language with echolalia, failure to follow parents/caretakers orders, inconsolable outbursts, and persistent preoccupation with parts of objects. The girl has normal cognitive function. Her measurements are within normal range, and no other abnormalities were found during physical, neurological, or dysmorphological examinations. Conventional cytogenetic analysis showed a de novo balanced translocation, with the karyotype 46,X,t(X;19)(p21.2;q13.4). Replication banding showed a clear preference for inactivation of the normal X chromosome. The translocation was confirmed by FISH and Spectral Karyotyping (SKY). Although abnormal phenotypes associated with de novo balanced chromosomal rearrangements may be the result of disruption of a gene at one of the breakpoints, submicroscopic deletion or duplication, or a position effect, X; autosomal translocations are associated with additional unique risk factors including X-linked disorders, functional autosomal monosomy, or functional X chromosome disomy resulting from the complex X-inactivation process.

  1. A network integration approach for drug-target interaction prediction and computational drug repositioning from heterogeneous information.

    Science.gov (United States)

    Luo, Yunan; Zhao, Xinbin; Zhou, Jingtian; Yang, Jinglin; Zhang, Yanqing; Kuang, Wenhua; Peng, Jian; Chen, Ligong; Zeng, Jianyang

    2017-09-18

    The emergence of large-scale genomic, chemical and pharmacological data provides new opportunities for drug discovery and repositioning. In this work, we develop a computational pipeline, called DTINet, to predict novel drug-target interactions from a constructed heterogeneous network, which integrates diverse drug-related information. DTINet focuses on learning a low-dimensional vector representation of features, which accurately explains the topological properties of individual nodes in the heterogeneous network, and then makes prediction based on these representations via a vector space projection scheme. DTINet achieves substantial performance improvement over other state-of-the-art methods for drug-target interaction prediction. Moreover, we experimentally validate the novel interactions between three drugs and the cyclooxygenase proteins predicted by DTINet, and demonstrate the new potential applications of these identified cyclooxygenase inhibitors in preventing inflammatory diseases. These results indicate that DTINet can provide a practically useful tool for integrating heterogeneous information to predict new drug-target interactions and repurpose existing drugs. Network-based data integration for drug-target prediction is a promising avenue for drug repositioning, but performance is wanting. Here, the authors introduce DTINet, whose performance is enhanced in the face of noisy, incomplete and high-dimensional biological data by learning low-dimensional vector representations.
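
    As a much-simplified illustration of the underlying idea, the sketch below embeds drugs and targets in a low-dimensional space from a single interaction matrix via truncated SVD and scores unobserved pairs by projection. This is not the DTINet algorithm, which integrates multiple heterogeneous networks with a dedicated learning scheme; the data here are random.

```python
# Greatly simplified illustration of embedding-based drug-target prediction:
# low-dimensional representations from a known interaction matrix, then scoring of
# unobserved pairs. NOT the DTINet algorithm; data are random placeholders.
import numpy as np

rng = np.random.default_rng(5)
n_drugs, n_targets, k = 30, 40, 8
# Binary drug-target interaction matrix (1 = known interaction), randomly generated
DT = (rng.random((n_drugs, n_targets)) < 0.05).astype(float)

# Low-dimensional representations via truncated SVD of the interaction matrix
U, s, Vt = np.linalg.svd(DT, full_matrices=False)
drug_vecs = U[:, :k] * np.sqrt(s[:k])
target_vecs = Vt[:k, :].T * np.sqrt(s[:k])

# Score all pairs; high-scoring pairs without a known interaction become candidates
scores = drug_vecs @ target_vecs.T
candidates = np.argwhere((scores > scores.mean() + 2 * scores.std()) & (DT == 0))
print("candidate novel interactions (drug_idx, target_idx):", candidates[:5])
```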

  2. Positron emission tomography/computed tomography surveillance in patients with Hodgkin lymphoma in first remission has a low positive predictive value and high costs.

    Science.gov (United States)

    El-Galaly, Tarec Christoffer; Mylam, Karen Juul; Brown, Peter; Specht, Lena; Christiansen, Ilse; Munksgaard, Lars; Johnsen, Hans Erik; Loft, Annika; Bukh, Anne; Iyer, Victor; Nielsen, Anne Lerberg; Hutchings, Martin

    2012-06-01

    The value of performing post-therapy routine surveillance imaging in patients with Hodgkin lymphoma is controversial. This study evaluates the utility of positron emission tomography/computed tomography using 2-[18F]fluoro-2-deoxyglucose for this purpose and in situations with suspected lymphoma relapse. We conducted a multicenter retrospective study. Patients with newly diagnosed Hodgkin lymphoma achieving at least a partial remission on first-line therapy were eligible if they received positron emission tomography/computed tomography surveillance during follow-up. Two types of imaging surveillance were analyzed: "routine" when patients showed no signs of relapse at referral to positron emission tomography/computed tomography, and "clinically indicated" when recurrence was suspected. A total of 211 routine and 88 clinically indicated positron emission tomography/computed tomography studies were performed in 161 patients. In ten of 22 patients with recurrence of Hodgkin lymphoma, routine imaging surveillance was the primary tool for the diagnosis of the relapse. Extranodal disease, interim positron emission tomography-positive lesions and positron emission tomography activity at response evaluation were all associated with a positron emission tomography/computed tomography-diagnosed preclinical relapse. The true positive rates of routine and clinically indicated imaging were 5% and 13%, respectively (P = 0.02). The overall positive predictive value and negative predictive value of positron emission tomography/computed tomography were 28% and 100%, respectively. The estimated cost per routine imaging diagnosed relapse was US$ 50,778. Negative positron emission tomography/computed tomography reliably rules out a relapse. The high false positive rate is, however, an important limitation and a confirmatory biopsy is mandatory for the diagnosis of a relapse. With no proven survival benefit for patients with a pre-clinically diagnosed relapse, the high costs and low
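
    A cost-per-detected-relapse figure of this kind is the total spend on routine surveillance scans divided by the number of relapses those scans revealed. In the sketch below the scan and relapse counts come from the abstract, while the per-scan cost is an assumed placeholder rather than the study's actual tariff.

```python
# How a "cost per routine-imaging-diagnosed relapse" figure is obtained.
# Scan and relapse counts are from the abstract; the per-scan cost is assumed.
routine_scans = 211
relapses_found_by_routine_imaging = 10
assumed_cost_per_scan_usd = 2400            # hypothetical tariff, not from the paper

total_cost = routine_scans * assumed_cost_per_scan_usd
cost_per_detected_relapse = total_cost / relapses_found_by_routine_imaging
print(f"US$ {cost_per_detected_relapse:,.0f} per relapse detected by routine imaging")
```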

  3. Predicting renal graft failure by sCD30 levels and de novo HLA antibodies at 1 year post-transplantation.

    Science.gov (United States)

    Wang, Dong; Wu, Guojun; Chen, Jinhua; Yu, Ziqiang; Wu, Weizhen; Yang, Shunliang; Tan, Jianming

    2012-06-01

    HLA antibodies and sCD30 levels were detected in serum sampled from 620 renal graft recipients at 1 year post-transplantation, who were followed up for 5 years. Six-year graft and patient survival rates were 81.6% and 91.0%. HLA antibodies were detected in 45 recipients (7.3%), of whom 14 had class I antibodies, 26 had class II, and 5 had both class I and II. Much more graft loss was recorded in recipients with HLA antibodies than in those without antibodies (60% vs. 15.1%). Higher sCD30 levels were recorded in recipients suffering graft loss than in the others (73.9±48.8 U/mL vs. 37.3±14.6 U/mL), and recipients with low sCD30 levels showed better graft survival; the effect of sCD30 on graft survival was not only independent but also additive. Therefore, post-transplantation monitoring of HLA antibodies and sCD30 levels is necessary, and recipients with an elevated sCD30 level and/or de novo HLA antibody should receive closer attention in order to achieve better graft survival. Copyright © 2012 Elsevier B.V. All rights reserved.

  4. A Psicologia no novo contexto mundial Psychology in the new world context

    Directory of Open Access Journals (Sweden)

    Carla Faria Leitão

    2003-12-01

    Full Text Available Profound changes in the contemporary world have created a new context for scientific work, characterised by the deconstruction of old theories and the construction of a new network of knowledge. In this article, we analyse a few theories recently developed in the social sciences and in philosophy that make up this network: post-modernist theories, reflexive modernisation theories and Information Technology Revolution theory. In this way, we aim to provide psychologists with insights produced in other fields of knowledge that can serve as a starting point for the analysis of the subjective changes introduced by the new global scenario. We argue that psychology still observes contemporary man on the basis of traditional categories, ignoring the fact that profound social changes generate equally profound psychological impacts that can hardly be captured with old frames of reference. We conclude that a deeper knowledge of the radical transformations under way in today's world can help psychologists revise their old certainties about man and venture new perspectives on the equally new human phenomena.

  5. Genomic prediction using subsampling.

    Science.gov (United States)

    Xavier, Alencar; Xu, Shizhong; Muir, William; Rainey, Katy Martin

    2017-03-24

    Genome-wide assisted selection is a critical tool for the genetic improvement of plants and animals. Whole-genome regression models in Bayesian framework represent the main family of prediction methods. Fitting such models with a large number of observations involves a prohibitive computational burden. We propose the use of subsampling bootstrap Markov chain in genomic prediction. Such method consists of fitting whole-genome regression models by subsampling observations in each round of a Markov Chain Monte Carlo. We evaluated the effect of subsampling bootstrap on prediction and computational parameters. Across datasets, we observed an optimal subsampling proportion of observations around 50% with replacement, and around 33% without replacement. Subsampling provided a substantial decrease in computation time, reducing the time to fit the model by half. On average, losses on predictive properties imposed by subsampling were negligible, usually below 1%. For each dataset, an optimal subsampling point that improves prediction properties was observed, but the improvements were also negligible. Combining subsampling with Gibbs sampling is an interesting ensemble algorithm. The investigation indicates that the subsampling bootstrap Markov chain algorithm substantially reduces computational burden associated with model fitting, and it may slightly enhance prediction properties.
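
    As an illustration of the subsampling bootstrap Markov chain idea described above, the following Python sketch runs a single-site Gibbs sampler for a ridge-type whole-genome regression in which each MCMC round updates marker effects from a random ~50% bootstrap subsample of the observations. The genotypes, variance priors and update scheme are simplified assumptions, not the authors' implementation.

```python
# Minimal sketch of a subsampling-bootstrap Gibbs sampler for whole-genome
# regression (illustrative only). Each MCMC round updates marker effects from
# a random ~50% bootstrap subsample of the observations.
import numpy as np

rng = np.random.default_rng(1)
n, p = 400, 1000                                    # individuals, markers
Z = rng.integers(0, 3, size=(n, p)).astype(float)   # genotypes coded 0/1/2
true_b = rng.normal(0, 0.05, p)
y = Z @ true_b + rng.normal(0, 1.0, n)              # phenotypes

n_iter, burn_in, keep = 150, 50, 0.5                # keep = subsample proportion
b = np.zeros(p)
var_e, var_b = 1.0, 0.01                            # residual and marker variances
b_sum = np.zeros(p)

for it in range(n_iter):
    idx = rng.choice(n, size=int(keep * n), replace=True)   # bootstrap subsample
    Zs, ys = Z[idx], y[idx]
    e = ys - Zs @ b
    for j in range(p):                              # single-site Gibbs updates
        zj = Zs[:, j]
        e += zj * b[j]                              # remove marker j from residual
        cj = zj @ zj + var_e / var_b
        b[j] = rng.normal((zj @ e) / cj, np.sqrt(var_e / cj))
        e -= zj * b[j]                              # put marker j back
    var_e = (e @ e + 1.0) / rng.chisquare(len(idx) + 2)   # crude scaled-inv-chi2 draw
    if it >= burn_in:
        b_sum += b

b_hat = b_sum / (n_iter - burn_in)
print("correlation of predictions with true genetic values:",
      np.corrcoef(Z @ b_hat, Z @ true_b)[0, 1].round(3))
```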

  6. CNN-PROMOTER, NEW CONSENSUS PROMOTER PREDICTION PROGRAM BASED ON NEURAL NETWORKS CNN-PROMOTER, NUEVO PROGRAMA PARA LA PREDICCIÓN DE PROMOTORES BASADO EN REDES NEURONALES CNN-PROMOTER, NOVO PROGRAMA PARA A PREDIÇÃO DE PROMOTORES BASEADO EM REDES NEURONAIS

    Directory of Open Access Journals (Sweden)

    Óscar Bedoya

    2011-06-01

    Full Text Available A new promoter prediction program called CNN-Promoter is presented. CNN-Promoter allows DNA sequences to be submitted and predicts them as promoter or non-promoter. Several methods have been developed to predict the promoter regions of genomes in eukaryotic organisms, including algorithms based on Markov models, decision trees, and statistical methods. Although many programs have been proposed, there is still a need to improve the sensitivity and specificity values. In this paper, a new program is proposed; it is based on the consensus strategy of combining experts to make a better prediction, with the experts developed as neural networks. During the training process, the sensitivity and specificity were 100%, and during the test process the model reached a sensitivity of 74.5% and a specificity of 82.7%.
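
    A minimal sketch of the consensus-of-experts strategy described above: several small neural networks are trained on one-hot encoded DNA windows and their predicted probabilities are averaged. The synthetic TATA-box data, network sizes and the scikit-learn classifier are illustrative assumptions, not the CNN-Promoter implementation.

```python
# Sketch of a consensus-of-experts promoter classifier (illustrative only):
# several small neural networks are trained on one-hot encoded DNA windows
# and their votes are averaged to label a sequence as promoter / non-promoter.
import numpy as np
from sklearn.neural_network import MLPClassifier

def one_hot(seq):
    table = {"A": 0, "C": 1, "G": 2, "T": 3}
    x = np.zeros((len(seq), 4))
    for i, base in enumerate(seq.upper()):
        x[i, table.get(base, 0)] = 1.0
    return x.ravel()

rng = np.random.default_rng(0)
bases = np.array(list("ACGT"))

def random_seq(promoter, length=50):
    # Synthetic data: "promoters" are enriched for a TATA-like motif.
    s = "".join(rng.choice(bases, length))
    return s[:20] + "TATAAA" + s[26:] if promoter else s

seqs = [random_seq(i % 2 == 1) for i in range(400)]
y = np.array([i % 2 for i in range(400)])
X = np.array([one_hot(s) for s in seqs])

experts = [MLPClassifier(hidden_layer_sizes=(h,), max_iter=500, random_state=h)
           for h in (4, 8, 16)]
for net in experts:
    net.fit(X[:300], y[:300])

# Consensus: average the predicted promoter probabilities of all experts.
proba = np.mean([net.predict_proba(X[300:])[:, 1] for net in experts], axis=0)
pred = (proba >= 0.5).astype(int)
tp = np.sum((pred == 1) & (y[300:] == 1)); fn = np.sum((pred == 0) & (y[300:] == 1))
tn = np.sum((pred == 0) & (y[300:] == 0)); fp = np.sum((pred == 1) & (y[300:] == 0))
print("sensitivity:", tp / (tp + fn), "specificity:", tn / (tn + fp))
```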

  7. Computational tools for genome-wide miRNA prediction and study

    KAUST Repository

    Malas, T.B.

    2012-11-02

    MicroRNAs (miRNAs) are single-stranded non-coding RNAs, usually 22 nucleotides in length, that play an important post-transcriptional regulation role in many organisms. MicroRNAs bind through a seed sequence to the 3'-untranslated region (UTR) of the target messenger RNA (mRNA), inducing degradation or inhibition of translation and resulting in a reduction in the protein level. This regulatory mechanism is central to many biological processes, and its perturbation could lead to diseases such as cancer. Given the biological importance of miRNAs, there is a great need to identify and study their targets and functions. However, miRNAs are very difficult to clone in the lab, and this has hindered the identification of novel miRNAs. Next-generation sequencing coupled with new computational tools has recently evolved to help researchers efficiently identify large numbers of novel miRNAs. In this review, we describe recent miRNA prediction tools and discuss their priorities, advantages and disadvantages. Malas and Ravasi.
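
    Many of the reviewed tools score candidate pre-miRNA hairpins by structural features. The toy Python function below computes a few such features (base pairs, loop size, GC content) from a sequence and a hand-supplied dot-bracket structure; in a real pipeline the structure would come from a folding program, and the feature set here is purely illustrative.

```python
# Toy feature extraction for a candidate pre-miRNA hairpin (illustrative).
# Real pipelines derive the dot-bracket structure from a folding program;
# here both sequence and structure are hard-coded for self-containment.
def hairpin_features(seq, structure):
    seq = seq.upper().replace("T", "U")
    paired = structure.count("(")                 # number of base pairs
    loop = 0
    if "(" in structure and ")" in structure:
        loop = structure.index(")") - structure.rindex("(") - 1   # loop between arms
    gc = (seq.count("G") + seq.count("C")) / len(seq)
    return {
        "length": len(seq),
        "gc_content": round(gc, 3),
        "base_pairs": paired,
        "loop_size": loop,
        "pairing_fraction": round(2 * paired / len(seq), 3),
    }

candidate = "UGAGGUAGUAGGUUGUAUAGUUUUAGGGUCACACCCACCACUGGGAGAUAACUAUACAAUCUACUGUCUUUCC"
dotbracket = "(" * 28 + "...." + ")" * 28 + "." * (len(candidate) - 60)
print(hairpin_features(candidate, dotbracket))
```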

  8. Computational tools for genome-wide miRNA prediction and study

    KAUST Repository

    Malas, T.B.; Ravasi, Timothy

    2012-01-01

    MicroRNAs (miRNAs) are single-stranded non-coding RNAs, usually 22 nucleotides in length, that play an important post-transcriptional regulation role in many organisms. MicroRNAs bind through a seed sequence to the 3'-untranslated region (UTR) of the target messenger RNA (mRNA), inducing degradation or inhibition of translation and resulting in a reduction in the protein level. This regulatory mechanism is central to many biological processes, and its perturbation could lead to diseases such as cancer. Given the biological importance of miRNAs, there is a great need to identify and study their targets and functions. However, miRNAs are very difficult to clone in the lab, and this has hindered the identification of novel miRNAs. Next-generation sequencing coupled with new computational tools has recently evolved to help researchers efficiently identify large numbers of novel miRNAs. In this review, we describe recent miRNA prediction tools and discuss their priorities, advantages and disadvantages. Malas and Ravasi.

  9. A simple method suitable to study de novo root organogenesis

    Directory of Open Access Journals (Sweden)

    Xiaodong eChen

    2014-05-01

    Full Text Available De novo root organogenesis is the process in which adventitious roots regenerate from detached or wounded plant tissues or organs. In tissue culture, appropriate types and concentrations of plant hormones in the medium are critical for inducing adventitious roots. However, in natural conditions, regeneration from detached organs is likely to rely on endogenous hormones. To investigate the actions of endogenous hormones and the molecular mechanisms guiding de novo root organogenesis, we developed a simple method to imitate natural conditions for adventitious root formation by culturing Arabidopsis thaliana leaf explants on B5 medium without additive hormones. Here we show that the ability of the leaf explants to regenerate roots depends on the age of the leaf and on certain nutrients in the medium. Based on these observations, we provide examples of how this method can be used in different situations, and how it can be optimized. This simple method could be used to investigate the effects of various physiological and molecular changes on the regeneration of adventitious roots. It is also useful for tracing cell lineage during the regeneration process by differential interference contrast observation of β-glucuronidase staining, and by live imaging of proteins labeled with fluorescent tags.

  10. Lethal Ultra-Early Subarachnoid Hemorrhage Due to Rupture of De Novo Aneurysm 5 Months After Primary Aneurysmatic Subarachnoid Hemorrhage.

    Science.gov (United States)

    Walter, Johannes; Unterberg, Andreas W; Zweckberger, Klaus

    2018-05-01

    Approximately 1% of all patients surviving rupture of a cerebral aneurysm suffer from a second aneurysmatic subarachnoid hemorrhage later in their lives, 61% of which are caused by rupture of a de novo aneurysm. The latency between bleedings is usually many years, and younger patients tend to achieve better outcomes after a second subarachnoid hemorrhage. We report an unusual case of lethal ultra-early rupture of a de novo aneurysm of the anterior communicating artery only 5 months after the initial subarachnoid hemorrhage and complete coiling in a young, healthy male patient. Despite complete aneurysm obliteration, young age, and good recovery, patients may suffer secondary subarachnoid hemorrhages from de novo aneurysms within only a few months of the initial bleeding. Early control magnetic resonance angiography might hence be advisable. Copyright © 2018 Elsevier Inc. All rights reserved.

  11. Whole-genome sequencing in autism identifies hot spots for de novo germline mutation

    DEFF Research Database (Denmark)

    Michaelson, Jacob J.; Shi, Yujian; Gujral, Madhusudan

    2012-01-01

    De novo mutation plays an important role in autism spectrum disorders (ASDs). Notably, pathogenic copy number variants (CNVs) are characterized by high mutation rates. We hypothesize that hypermutability is a property of ASD genes and may also include nucleotide-substitution hot spots. We...

  12. SPEEDI: a computer code system for the real-time prediction of radiation dose to the public due to an accidental release

    International Nuclear Information System (INIS)

    Imai, Kazuhiko; Chino, Masamichi; Ishikawa, Hirohiko

    1985-10-01

    SPEEDI, a computer code system for predicting environmental doses from radioactive materials accidentally released from a nuclear plant, has been developed to assist the organizations responsible for emergency planning. For realistic simulation, two models have been developed: one which statistically predicts the basic wind data and then calculates the three-dimensional mass-consistent wind field by interpolating these predicted data, and one which calculates the diffusion of released materials using a combination of random-walk and PICK methods. These calculations are carried out in conversational mode with a computer so that the system can be used with ease in an emergency. SPEEDI also has versatile files, which make it easy to control the complicated flows of calculation. In order to attain a short computation time, a large-scale computer with a performance of 25 MIPS and a vector processor with a maximum of 250 MFLOPS are used for the model calculations, so that quick responses are obtained. Simplified models are also prepared for calculation on a minicomputer of the kind widely used by local governments and research institutes, although the same precision of calculation as with the above models cannot be expected. The present report outlines the structure and functions of SPEEDI, the methods for prediction of the wind field, the models for calculating the concentration of released materials in air and on the ground, and the doses to the public. Some of the diffusion models have been compared with field experiments carried out as part of the SPEEDI development program. The report also discusses the reliability of the diffusion models on the basis of the comparison results, and shows that they can reasonably simulate the diffusion in the internal boundary layer which commonly occurs near coastal regions. (J.P.N.)
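
    The diffusion part of the system described above combines advection by a predicted wind field with a random-walk treatment of turbulence. The short Python sketch below illustrates that general Lagrangian particle idea with a uniform wind and constant eddy diffusivities; all constants are invented for the illustration and are not SPEEDI parameters.

```python
# Minimal random-walk (Lagrangian particle) dispersion sketch in the spirit of
# the SPEEDI diffusion model; the uniform wind, eddy diffusivities and release
# height below are illustrative constants, not SPEEDI parameters.
import numpy as np

rng = np.random.default_rng(0)
n_particles, n_steps, dt = 5000, 120, 30.0         # 1 h of transport, 30 s steps
u, v = 3.0, 0.5                                     # mean wind components (m/s)
k_h, k_z = 10.0, 1.0                                # horizontal/vertical diffusivity (m^2/s)

pos = np.zeros((n_particles, 3))                    # all particles released at origin
pos[:, 2] = 50.0                                    # stack height (m)

for _ in range(n_steps):
    # advection by the mean wind plus a random-walk diffusion step
    pos[:, 0] += u * dt + rng.normal(0, np.sqrt(2 * k_h * dt), n_particles)
    pos[:, 1] += v * dt + rng.normal(0, np.sqrt(2 * k_h * dt), n_particles)
    pos[:, 2] += rng.normal(0, np.sqrt(2 * k_z * dt), n_particles)
    pos[:, 2] = np.abs(pos[:, 2])                   # reflect particles at the ground

# Ground-level concentration proxy: particles below 20 m, binned on a coarse grid
near_ground = pos[pos[:, 2] < 20.0]
hist, xedges, yedges = np.histogram2d(near_ground[:, 0], near_ground[:, 1],
                                      bins=20, range=[[0, 20000], [-5000, 5000]])
print("peak cell count (relative concentration):", hist.max())
```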

  13. Computer Simulation in Predicting Biochemical Processes and Energy Balance at WWTPs

    Science.gov (United States)

    Drewnowski, Jakub; Zaborowska, Ewa; Hernandez De Vega, Carmen

    2018-02-01

    Nowadays, the use of mathematical models and computer simulation allows many different technological solutions to be analysed and various scenarios to be tested in a short time and on a low budget, in order to simulate the behaviour of the real system under typical conditions and help find the best solution in the design or operation process. The aim of the study was to evaluate different concepts of biochemical process and energy balance modelling using the simulation platform GPS-x and the comprehensive model Mantis2. The paper presents an example of the calibration and validation processes in the biological reactor, as well as scenarios showing the influence of operational parameters on the WWTP energy balance. The results of batch tests and a full-scale campaign obtained in former work were used to predict biochemical and operational parameters in a newly developed plant model. The model was extended with sludge treatment devices, including an anaerobic digester. Primary sludge removal efficiency was found to be a significant factor determining biogas production and, in turn, renewable energy production in cogeneration. Water and wastewater utilities, which run and control WWTPs, are interested in optimizing the process in order to protect the environment, save money and decrease pollutant emissions to water and air. In this context, computer simulation can be the easiest and a very useful tool to improve efficiency without interfering with the actual process performance.
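
    To make the energy-balance reasoning concrete, the following back-of-the-envelope Python sketch propagates primary sludge removal efficiency to digester biogas and cogenerated electricity. Every coefficient (degradable fraction, biogas yield, CHP efficiency, aeration demand) is an assumed round number for illustration and is not taken from the GPS-x/Mantis2 model.

```python
# Back-of-the-envelope WWTP energy balance (illustrative coefficients only;
# not the GPS-x / Mantis2 model): more efficient primary sludge removal sends
# more degradable solids to the digester, raising biogas and cogenerated power.
def energy_balance(flow_m3_d, cod_mg_l, primary_removal):
    cod_load = flow_m3_d * cod_mg_l / 1000.0            # kg COD/day entering the plant
    to_digester = cod_load * primary_removal * 0.5      # assumed degradable fraction
    biogas_m3 = to_digester * 0.35                      # assumed m3 biogas per kg COD
    electricity_kwh = biogas_m3 * 6.0 * 0.35            # 6 kWh/m3 biogas, 35% CHP efficiency
    aeration_kwh = cod_load * (1 - primary_removal) * 0.9   # assumed kWh per kg COD oxidized
    return biogas_m3, electricity_kwh, electricity_kwh - aeration_kwh

for removal in (0.3, 0.5, 0.7):
    biogas, produced, net = energy_balance(flow_m3_d=20000, cod_mg_l=600,
                                           primary_removal=removal)
    print(f"primary removal {removal:.0%}: biogas {biogas:.0f} m3/d, "
          f"net electricity {net:.0f} kWh/d")
```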

  14. Computer Simulation in Predicting Biochemical Processes and Energy Balance at WWTPs

    Directory of Open Access Journals (Sweden)

    Drewnowski Jakub

    2018-01-01

    Full Text Available Nowadays, the use of mathematical models and computer simulation allows many different technological solutions to be analysed and various scenarios to be tested in a short time and on a low budget, in order to simulate the behaviour of the real system under typical conditions and help find the best solution in the design or operation process. The aim of the study was to evaluate different concepts of biochemical process and energy balance modelling using the simulation platform GPS-x and the comprehensive model Mantis2. The paper presents an example of the calibration and validation processes in the biological reactor, as well as scenarios showing the influence of operational parameters on the WWTP energy balance. The results of batch tests and a full-scale campaign obtained in former work were used to predict biochemical and operational parameters in a newly developed plant model. The model was extended with sludge treatment devices, including an anaerobic digester. Primary sludge removal efficiency was found to be a significant factor determining biogas production and, in turn, renewable energy production in cogeneration. Water and wastewater utilities, which run and control WWTPs, are interested in optimizing the process in order to protect the environment, save money and decrease pollutant emissions to water and air. In this context, computer simulation can be the easiest and a very useful tool to improve efficiency without interfering with the actual process performance.

  15. A neural network based computational model to predict the output power of different types of photovoltaic cells.

    Science.gov (United States)

    Xiao, WenBo; Nazario, Gina; Wu, HuaMing; Zhang, HuaMing; Cheng, Feng

    2017-01-01

    In this article, we introduced an artificial neural network (ANN) based computational model to predict the output power of three types of photovoltaic cells: mono-crystalline (mono-), multi-crystalline (multi-), and amorphous (amor-) crystalline silicon. The prediction results are very close to the experimental data and were also influenced by the number of hidden neurons. Ranked from least to most influenced by external conditions, the generated power output is: multi-, mono-, and amor- crystalline silicon cells. In addition, the dependence of the power prediction on the number of hidden neurons was studied. For the multi- and amorphous crystalline cells, three or four hidden-layer units resulted in a high correlation coefficient and low MSEs. For the mono-crystalline cell, the best results were achieved with eight hidden-layer units.
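
    A small sketch of the kind of experiment described above: a feed-forward network (here scikit-learn's MLPRegressor) is trained to predict output power from irradiance and module temperature, and the number of hidden neurons is scanned. The data are synthetic and the network settings are assumptions, not the authors' model.

```python
# Sketch of a small feed-forward network predicting photovoltaic output power
# from irradiance and module temperature while scanning the number of hidden
# neurons (synthetic data; not the authors' measurements or network).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
irradiance = rng.uniform(100, 1000, 600)             # W/m^2
temperature = rng.uniform(10, 60, 600)               # module temperature, deg C
power = 0.15 * irradiance * (1 - 0.004 * (temperature - 25)) + rng.normal(0, 5, 600)

X = np.column_stack([irradiance, temperature])
X = (X - X.mean(axis=0)) / X.std(axis=0)             # simple feature scaling
X_train, y_train, X_test, y_test = X[:450], power[:450], X[450:], power[450:]

for hidden in (2, 3, 4, 8):
    model = MLPRegressor(hidden_layer_sizes=(hidden,), max_iter=5000, random_state=0)
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    r = np.corrcoef(pred, y_test)[0, 1]
    print(f"{hidden} hidden neurons: r = {r:.3f}, "
          f"MSE = {mean_squared_error(y_test, pred):.1f}")
```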

  16. A neural network based computational model to predict the output power of different types of photovoltaic cells.

    Directory of Open Access Journals (Sweden)

    WenBo Xiao

    Full Text Available In this article, we introduced an artificial neural network (ANN) based computational model to predict the output power of three types of photovoltaic cells: mono-crystalline (mono-), multi-crystalline (multi-), and amorphous (amor-) crystalline silicon. The prediction results are very close to the experimental data and were also influenced by the number of hidden neurons. Ranked from least to most influenced by external conditions, the generated power output is: multi-, mono-, and amor- crystalline silicon cells. In addition, the dependence of the power prediction on the number of hidden neurons was studied. For the multi- and amorphous crystalline cells, three or four hidden-layer units resulted in a high correlation coefficient and low MSEs. For the mono-crystalline cell, the best results were achieved with eight hidden-layer units.

  17. De Novo Assembly of the Donkey White Blood Cell Transcriptome and a Comparative Analysis of Phenotype-Associated Genes between Donkeys and Horses.

    Science.gov (United States)

    Xie, Feng-Yun; Feng, Yu-Long; Wang, Hong-Hui; Ma, Yun-Feng; Yang, Yang; Wang, Yin-Chao; Shen, Wei; Pan, Qing-Jie; Yin, Shen; Sun, Yu-Jiang; Ma, Jun-Yu

    2015-01-01

    Prior to the mechanization of agriculture and labor-intensive tasks, humans used donkeys (Equus africanus asinus) for farm work and packing. However, as mechanization increased, donkeys have been increasingly raised for meat, milk, and fur in China. To maintain the development of the donkey industry, breeding programs should focus on traits related to these new uses. Compared to conventional marker-assisted breeding plans, genome- and transcriptome-based selection methods are more efficient and effective. To analyze the coding genes of the donkey genome, we assembled the transcriptome of donkey white blood cells de novo. Using transcriptomic deep-sequencing data, we identified 264,714 distinct donkey unigenes and predicted 38,949 protein fragments. We annotated the donkey unigenes by BLAST searches against the non-redundant (NR) protein database. We also compared the donkey protein sequences with those of the horse (E. caballus) and wild horse (E. przewalskii), and linked the donkey protein fragments with mammalian phenotypes. As the outer ear size of donkeys and horses are obviously different, we compared the outer ear size-associated proteins in donkeys and horses. We identified three ear size-associated proteins, HIC1, PRKRA, and KMT2A, with sequence differences among the donkey, horse, and wild horse loci. Since the donkey genome sequence has not been released, the de novo assembled donkey transcriptome is helpful for preliminary investigations of donkey cultivars and for genetic improvement.

  18. Prediction of detonation and JWL eos parameters of energetic materials using EXPLO5 computer code

    CSIR Research Space (South Africa)

    Peter, Xolani

    2016-09-01

    Full Text Available Nowadays many numerical methods and programs are being used for carrying out thermodynamic calculations of the detonation parameters of condensed explosives, for example BKW Fortran (Mader, 1967), Ruby (Cowperthwaite and Zwisler, 1974) and TIGER...

  19. Origins of De Novo Genes in Human and Chimpanzee.

    Science.gov (United States)

    Ruiz-Orera, Jorge; Hernandez-Rodriguez, Jessica; Chiva, Cristina; Sabidó, Eduard; Kondova, Ivanela; Bontrop, Ronald; Marqués-Bonet, Tomàs; Albà, M Mar

    2015-12-01

    The birth of new genes is an important motor of evolutionary innovation. Whereas many new genes arise by gene duplication, others originate at genomic regions that did not contain any genes or gene copies. Some of these newly expressed genes may acquire coding or non-coding functions and be preserved by natural selection. However, the prevalence and underlying mechanisms of de novo gene emergence remain unclear. In order to obtain a comprehensive view of this process, we have performed in-depth sequencing of the transcriptomes of four mammalian species--human, chimpanzee, macaque, and mouse--and subsequently compared the assembled transcripts and the corresponding syntenic genomic regions. This has resulted in the identification of over five thousand new multiexonic transcriptional events in human and/or chimpanzee that are not observed in the other species. Using comparative genomics, we show that the expression of these transcripts is associated with the gain of regulatory motifs upstream of the transcription start site (TSS) and of U1 snRNP sites downstream of the TSS. In general, these transcripts show little evidence of purifying selection, suggesting that many of them are not functional. However, we find signatures of selection in a subset of de novo genes which have evidence of protein translation. Taken together, the data support a model in which frequently-occurring new transcriptional events in the genome provide the raw material for the evolution of new proteins.

  20. Cavitation during the protein misfolding cyclic amplification (PMCA) method – The trigger for de novo prion generation?

    International Nuclear Information System (INIS)

    Haigh, Cathryn L.; Drew, Simon C.

    2015-01-01

    The protein misfolding cyclic amplification (PMCA) technique has become a widely-adopted method for amplifying minute amounts of the infectious conformer of the prion protein (PrP). PMCA involves repeated cycles of 20 kHz sonication and incubation, during which the infectious conformer seeds the conversion of normally folded protein by a templating interaction. Recently, it has proved possible to create an infectious PrP conformer without the need for an infectious seed, by including RNA and the phospholipid POPG as essential cofactors during PMCA. The mechanism underpinning this de novo prion formation remains unknown. In this study, we first establish by spin trapping methods that cavitation bubbles formed during PMCA provide a radical-rich environment. Using a substrate preparation comparable to that employed in studies of de novo prion formation, we demonstrate by immuno-spin trapping that PrP- and RNA-centered radicals are generated during sonication, in addition to PrP-RNA cross-links. We further show that serial PMCA produces protease-resistant PrP that is oxidatively modified. We suggest a unique confluence of structural (membrane-mimetic hydrophobic/hydrophilic bubble interface) and chemical (ROS) effects underlie the phenomenon of de novo prion formation by PMCA, and that these effects have meaningful biological counterparts of possible relevance to spontaneous prion formation in vivo. - Highlights: • Sonication during PMCA generates free radicals at the surface of cavitation bubbles. • PrP-centered and RNA-centered radicals are formed in addition to PrP-RNA adducts. • De novo prions may result from ROS and structural constraints during cavitation

  1. Cavitation during the protein misfolding cyclic amplification (PMCA) method – The trigger for de novo prion generation?

    Energy Technology Data Exchange (ETDEWEB)

    Haigh, Cathryn L., E-mail: chaigh@unimelb.edu.au [Department of Pathology, The University of Melbourne, Victoria 3010 (Australia); Drew, Simon C., E-mail: sdrew@unimelb.edu.au [Florey Department of Neuroscience and Mental Health, The University of Melbourne, Victoria 3010 (Australia)

    2015-06-05

    The protein misfolding cyclic amplification (PMCA) technique has become a widely-adopted method for amplifying minute amounts of the infectious conformer of the prion protein (PrP). PMCA involves repeated cycles of 20 kHz sonication and incubation, during which the infectious conformer seeds the conversion of normally folded protein by a templating interaction. Recently, it has proved possible to create an infectious PrP conformer without the need for an infectious seed, by including RNA and the phospholipid POPG as essential cofactors during PMCA. The mechanism underpinning this de novo prion formation remains unknown. In this study, we first establish by spin trapping methods that cavitation bubbles formed during PMCA provide a radical-rich environment. Using a substrate preparation comparable to that employed in studies of de novo prion formation, we demonstrate by immuno-spin trapping that PrP- and RNA-centered radicals are generated during sonication, in addition to PrP-RNA cross-links. We further show that serial PMCA produces protease-resistant PrP that is oxidatively modified. We suggest a unique confluence of structural (membrane-mimetic hydrophobic/hydrophilic bubble interface) and chemical (ROS) effects underlie the phenomenon of de novo prion formation by PMCA, and that these effects have meaningful biological counterparts of possible relevance to spontaneous prion formation in vivo. - Highlights: • Sonication during PMCA generates free radicals at the surface of cavitation bubbles. • PrP-centered and RNA-centered radicals are formed in addition to PrP-RNA adducts. • De novo prions may result from ROS and structural constraints during cavitation.

  2. Predicting Transport of 3,5,6-Trichloro-2-Pyridinol Into Saliva Using a Combination Experimental and Computational Approach

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Jordan Ned; Carver, Zana A.; Weber, Thomas J.; Timchalk, Charles

    2017-04-11

    A combination experimental and computational approach was developed to predict chemical transport into saliva. A serous-acinar chemical transport assay was established to measure chemical transport under non-physiological (standard cell culture medium) and physiological (surrogate plasma and saliva medium) conditions using 3,5,6-trichloro-2-pyridinol (TCPy), a metabolite of the pesticide chlorpyrifos. High levels of TCPy protein binding were observed in cell culture medium and rat plasma, resulting in different TCPy transport behaviors in the two experimental conditions. In the non-physiological transport experiment, TCPy reached equilibrium at equivalent concentrations in the apical and basolateral chambers. At higher TCPy doses, increased unbound TCPy was observed, and TCPy concentrations in the apical and basolateral chambers reached equilibrium faster than at lower doses, suggesting only unbound TCPy is able to cross the cellular monolayer. In the physiological experiment, TCPy transport was slower than under non-physiological conditions, and equilibrium was achieved at different concentrations in the apical and basolateral chambers at a ratio (0.034) comparable to what was previously measured in rats dosed with TCPy (saliva:blood ratio: 0.049). A cellular transport computational model was developed based on TCPy protein binding kinetics and accurately simulated all transport experiments using different permeability coefficients for the two experimental conditions (1.4 vs 0.4 cm/hr for non-physiological and physiological experiments, respectively). The computational model was integrated into a physiologically based pharmacokinetic (PBPK) model and accurately predicted TCPy concentrations in saliva of rats dosed with TCPy. Overall, this study demonstrates an approach to predict chemical transport in saliva, potentially increasing the utility of salivary biomonitoring in the future.
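
    The cellular transport model described above rests on the idea that only the unbound fraction of the chemical crosses the monolayer, at a rate set by a permeability coefficient. The Python sketch below integrates that two-compartment picture with illustrative chamber volumes and free fractions; only the two permeability coefficients (1.4 and 0.4 cm/h) are taken from the abstract.

```python
# Two-compartment transport sketch in the spirit of the cellular model:
# only unbound chemical crosses the monolayer at a rate P*A. The binding
# fractions, monolayer area and chamber volumes below are illustrative values.
def simulate(permeability_cm_h, free_fraction, hours=24.0, dt=0.01):
    area = 1.1             # cm^2, monolayer area (assumed)
    v_basolateral = 1.5    # mL (assumed)
    v_apical = 0.5         # mL (assumed)
    c_b, c_a = 100.0, 0.0  # initial concentrations (ug/mL), dose on basolateral side
    for _ in range(int(hours / dt)):
        # flux driven by the unbound concentration difference across the monolayer
        flux = permeability_cm_h * area * free_fraction * (c_b - c_a)   # ug/h
        c_b -= flux * dt / v_basolateral
        c_a += flux * dt / v_apical
    return c_a, c_b

for label, perm, fu in (("non-physiological", 1.4, 0.6), ("physiological", 0.4, 0.1)):
    apical, basolateral = simulate(perm, fu)
    print(f"{label}: apical/basolateral ratio after 24 h = {apical / basolateral:.2f}")
```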

  3. Prediction of intramuscular fat content and shear force in Texel lamb loins using combinations of different X-ray computed tomography (CT) scanning techniques.

    Science.gov (United States)

    Clelland, N; Bunger, L; McLean, K A; Knott, S; Matthews, K R; Lambe, N R

    2018-06-01

    Computed tomography (CT) parameters, including spiral computed tomography scanning (SCTS) parameters, intramuscular fat (IMF) and mechanically measured shear force were derived from two previously published studies. Purebred Texel lambs (n = 377) of both sexes, females (n = 206) and intact males (n = 171), were used to investigate the prediction of IMF and shear force in the loin. Two- and three-dimensional CT density information was available. Accuracies in the prediction of shear force and IMF ranged from R² = 0.02 to R² = 0.13 and from R² = 0.51 to R² = 0.71, respectively, using combinations of SCTS and CT scan information. The prediction of mechanical shear force could not be achieved at an acceptable level of accuracy employing SCTS information. However, the prediction of IMF in the loin employing information from SCTS and additional information from standard CT scans was successful, providing evidence that the prediction of IMF and related meat eating quality (MEQ) traits for Texel lambs in vivo can be achieved. Copyright © 2018 Elsevier Ltd. All rights reserved.
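
    As a generic illustration of how such prediction accuracies (R²) are obtained, the sketch below fits an ordinary least-squares model of intramuscular fat on two CT-derived density variables and reports a cross-validated R². The variables and data are synthetic stand-ins, not the study's spiral-CT parameters.

```python
# Sketch of predicting intramuscular fat (IMF) from CT-derived density
# variables with ordinary least squares and a cross-validated R^2
# (synthetic data; the real study combined spiral and standard CT parameters).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 377
ct_density = rng.normal(60, 8, n)          # mean muscle density (arbitrary units)
fat_pixels = rng.normal(0.12, 0.03, n)     # fraction of low-density pixels
imf = 6.0 - 0.05 * ct_density + 20.0 * fat_pixels + rng.normal(0, 0.6, n)

X = np.column_stack([ct_density, fat_pixels])
r2 = cross_val_score(LinearRegression(), X, imf, cv=5, scoring="r2")
print("cross-validated R^2:", r2.mean().round(2))
```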

  4. Osteoporosis prediction from the mandible using cone-beam computed tomography

    Science.gov (United States)

    Al Haffar, Iyad; Khattab, Razan

    2014-01-01

    Purpose This study aimed to evaluate the use of dental cone-beam computed tomography (CBCT) in the diagnosis of osteoporosis among menopausal and postmenopausal women by using only a CBCT viewer program. Materials and Methods Thirty-eight menopausal and postmenopausal women who underwent dual-energy X-ray absorptiometry (DXA) examination for hip and lumbar vertebrae were scanned using CBCT (field of view: 13 cm×15 cm; voxel size: 0.25 mm). Slices from the body of the mandible as well as the ramus were selected, and CBCT-derived variables, such as radiographic density (RD) expressed as gray values, were calculated. Pearson's correlation, one-way analysis of variance (ANOVA), and accuracy (sensitivity and specificity) evaluation based on linear and logistic regression were performed to choose the variable that best correlated with the lumbar and femoral neck T-scores. Results RD of the whole bone area of the mandible was the variable that best correlated with and predicted both the femoral neck and the lumbar vertebrae T-scores; further, Pearson's correlation coefficients were 0.5/0.6 (p value=0.037/0.009). The sensitivity, specificity, and accuracy based on the logistic regression were 50%, 88.9%, and 78.4%, respectively, for the femoral neck, and 46.2%, 91.3%, and 75%, respectively, for the lumbar vertebrae. Conclusion Lumbar vertebrae and femoral neck osteoporosis can be predicted with high accuracy from the RD value of the body of the mandible by using a CBCT viewer program. PMID:25473633
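
    A minimal sketch of the classification step described above: logistic regression of osteoporosis status (DXA T-score at or below -2.5) on mandibular radiographic density, followed by sensitivity, specificity and accuracy. The simulated RD values and their relation to the T-score are assumptions for illustration only.

```python
# Sketch of classifying osteoporosis (DXA T-score <= -2.5) from mandibular
# radiographic density (RD) with logistic regression and reporting
# sensitivity/specificity/accuracy (synthetic values, not the study data).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200
t_score = rng.normal(-1.5, 1.2, n)                      # femoral neck T-scores
rd = 800 + 120 * t_score + rng.normal(0, 150, n)        # mandibular RD (gray values)
osteoporosis = (t_score <= -2.5).astype(int)

clf = LogisticRegression(max_iter=1000).fit(rd.reshape(-1, 1), osteoporosis)
pred = clf.predict(rd.reshape(-1, 1))
tp = np.sum((pred == 1) & (osteoporosis == 1)); fn = np.sum((pred == 0) & (osteoporosis == 1))
tn = np.sum((pred == 0) & (osteoporosis == 0)); fp = np.sum((pred == 1) & (osteoporosis == 0))
print("sensitivity:", round(tp / (tp + fn), 2),
      "specificity:", round(tn / (tn + fp), 2),
      "accuracy:", round((tp + tn) / n, 2))
```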

  5. CompNanoTox2015: novel perspectives from a European conference on computational nanotoxicology on predictive nanotoxicology

    Energy Technology Data Exchange (ETDEWEB)

    Bañares, Miguel A. [Instituto de Catalisis y Petroleoquimica, ICP, CSIC, Madrid, Spain,; Haase, Andrea [German Federal Institute for Risk Assessment (BfR), Department of Chemical and Product Safety, Berlin, Germany,; Tran, Lang [Statistics and Toxicology Section, Institute of Occupational Medicine, Edinburgh, UK,; Lobaskin, Vladimir [School of Physics, University College Dublin, Dublin, Ireland,; Oberdörster, Günter [Department of Environmental Medicine, University of Rochester, Rochester, NY, USA,; Rallo, Robert [Departament d’Enginyeria Informatica i Matematiques, Universitat Rovira i Virgili, Tarragona, Spain,; Advanced Computing, Mathematics, and Data Division, Pacific Northwest National Laboratory, Richland, WA, USA,; Leszczynski, Jerzy [Interdisciplinary Nanotoxicity Center, Jackson State University, Jackson, MS, USA,; Hoet, Peter [Public Health, Unit Lung Toxicology, K. U. Leuven, Leuven, Belgium,; Korenstein, Rafi [Faculty of Medicine, Tel-Aviv University, Tel Aviv, Israel,; Hardy, Barry [Technology Park Basel, Douglas Connect GmbH, Basel, Switzerland,; Puzyn, Tomasz [Laboratory of Environmental Chemometrics, Faculty of Chemistry, University of Gdansk, Gdansk, Poland

    2017-08-09

    A first European Conference on Computational Nanotoxicology, CompNanoTox, was held in November 2015 in Benahavís, Spain, with the objective of disseminating and integrating results from the European modeling and database projects (NanoPUZZLES, ModENPTox, PreNanoTox, MembraneNanoPart, MODERN, eNanoMapper and EU COST TD1204 MODENA) as well as creating synergies within the European NanoSafety Cluster. This conference was supported by the COST Action TD1204 MODENA on developing computational methods for toxicological risk assessment of engineered nanoparticles and provided a unique opportunity for cross-fertilization among complementary disciplines. The efforts to develop and validate computational models crucially depend on high quality experimental data and relevant assays, which will be the basis for identifying relevant descriptors. The ambitious overarching goal of this conference was to promote predictive nanotoxicology, which can only be achieved by a close collaboration between computational scientists (e.g. database experts, modeling experts for structure, (eco)toxicological effects, performance and interaction of nanomaterials) and experimentalists from different areas (in particular toxicologists, biologists, chemists and material scientists, among others). The main outcome and new perspectives of this conference are summarized here.

  6. ESTADO E NOVOS AGENTES SOCIAIS NA RECONSTRUÇÃO DO ESPAÇO

    Directory of Open Access Journals (Sweden)

    Cláudio Barbosa da Costa

    2006-07-01

    Full Text Available The main objective of this study is to systematize new questions concerning the complexity of the actions of social agents in the construction of contemporary space, actions that have been interfering significantly in the way the State carries out its process of intervention in the territory. The analytical axes of the study are the new level of accumulation that globalization constitutes, the major international economic agents, and the new references of the so-called civilizing logic, in which social subjects emerge who, in seeking to break with the dominant idea of development and growth, have been building new ways of dealing with nature, with the territory and with work, establishing alterity through the struggle for their legitimacy. The theoretical understanding of the new social movements allowed a better grasp of the historical role of these agents in the contemporary construction of space and of their relations with the ecological vector and sustainable development. Finally, we carry out a historical analysis of the interventions of the Brazilian State in the construction of the national territory, seeking to identify the complexity of the agents of civil society, and the possibilities and limits of the constitution of a geopolitical project through a new social pact which, while safeguarding differences, takes into account the territorialities of the social sectors historically excluded from the process of

  7. NOVOS PRODUTOS SOB O ENFOQUE DO SISTEMA DE CONSUMO E A ORIENTAÇÃO PARA O MERCADO

    OpenAIRE

    Mauro Calixta Tavares; Valéria Braga Pinto

    2009-01-01

    Market orientation has been discussed in terms of its possible implications for structure, processes, people, new product development, innovation and strategic flexibility, among other possibilities. The focus of this chapter is on its relationship with new products from the perspective of the consumption system. Based on an exploratory study, the students of an MBA program, divided into groups, carried out a project involving the consumption of various products and services from the perspe...

  8. Trial-by-Trial Modulation of Associative Memory Formation by Reward Prediction Error and Reward Anticipation as Revealed by a Biologically Plausible Computational Model.

    Science.gov (United States)

    Aberg, Kristoffer C; Müller, Julia; Schwartz, Sophie

    2017-01-01

    Anticipation and delivery of rewards improves memory formation, but little effort has been made to disentangle their respective contributions to memory enhancement. Moreover, it has been suggested that the effects of reward on memory are mediated by dopaminergic influences on hippocampal plasticity. Yet, evidence linking memory improvements to actual reward computations reflected in the activity of the dopaminergic system, i.e., prediction errors and expected values, is scarce and inconclusive. For example, different previous studies reported that the magnitude of prediction errors during a reinforcement learning task was a positive, negative, or non-significant predictor of successfully encoding simultaneously presented images. Individual sensitivities to reward and punishment have been found to influence the activation of the dopaminergic reward system and could therefore help explain these seemingly discrepant results. Here, we used a novel associative memory task combined with computational modeling and showed independent effects of reward-delivery and reward-anticipation on memory. Strikingly, the computational approach revealed positive influences from both reward delivery, as mediated by prediction error magnitude, and reward anticipation, as mediated by magnitude of expected value, even in the absence of behavioral effects when analyzed using standard methods, i.e., by collapsing memory performance across trials within conditions. We additionally measured trait estimates of reward and punishment sensitivity and found that individuals with increased reward (vs. punishment) sensitivity had better memory for associations encoded during positive (vs. negative) prediction errors when tested after 20 min, but a negative trend when tested after 24 h. In conclusion, modeling trial-by-trial fluctuations in the magnitude of reward, as we did here for prediction errors and expected value computations, provides a comprehensive and biologically plausible description of
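
    The modelling approach described above derives trial-by-trial expected values and reward prediction errors from a reinforcement-learning update and then relates them to memory encoding. The Python sketch below illustrates that logic with a simple Rescorla-Wagner learner and a toy encoding rule; the task structure, learning rate and encoding function are invented for illustration and are not the authors' model.

```python
# Sketch of extracting trial-by-trial expected values and reward prediction
# errors with a Rescorla-Wagner update and relating them to later memory
# (synthetic behavior; not the authors' task or model).
import numpy as np

rng = np.random.default_rng(0)
n_trials, alpha = 200, 0.2
reward_prob = np.where(np.arange(n_trials) % 2 == 0, 0.8, 0.2)   # two interleaved cues
value = np.zeros(2)
expected, rpe = np.zeros(n_trials), np.zeros(n_trials)

for t in range(n_trials):
    cue = t % 2
    reward = float(rng.random() < reward_prob[t])
    expected[t] = value[cue]
    rpe[t] = reward - value[cue]                  # reward prediction error
    value[cue] += alpha * rpe[t]                  # Rescorla-Wagner update

# Toy "memory" outcome: encoding success depends on |RPE| and expected value.
encoding = (0.3 + 0.4 * np.abs(rpe) + 0.2 * expected > rng.random(n_trials)).astype(int)

# Relate the model-derived regressors to encoding success (point-biserial correlations)
print("corr(|RPE|, remembered):", np.corrcoef(np.abs(rpe), encoding)[0, 1].round(2))
print("corr(expected value, remembered):", np.corrcoef(expected, encoding)[0, 1].round(2))
```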

  9. Autosomal dominant cutis laxa with progeroid features due to a novel, de novo mutation in ALDH18A1.

    Science.gov (United States)

    Bhola, Priya T; Hartley, Taila; Bareke, Eric; Boycott, Kym M; Nikkel, Sarah M; Dyment, David A

    2017-06-01

    De novo dominant mutations in the aldehyde dehydrogenase 18 family member A1 (ALDH18A1) gene have recently been shown to cause autosomal dominant cutis laxa with progeroid features (MIM 616603). To date, all de novo dominant mutations have been found in a single highly conserved amino acid residue at position p.Arg138. We report an 8-year-old male with a clinical diagnosis of autosomal dominant cutis laxa (ADCL) with progeroid features and a novel de novo missense mutation in ALDH18A1 (NM_002860.3: c.377G>A (p.Arg126His)). This is the first report of an individual with ALDH18A1-ADCL due to a substitution at a residue other than p.Arg138. Knowledge of the complete spectrum of dominant-acting mutations that cause this rare syndrome will have implications for molecular diagnosis and genetic counselling of these families.

  10. Differences in Recurrence Rate and De Novo Incontinence after Endoscopic Treatment of Vesicourethral Stenosis and Bladder Neck Stenosis

    Directory of Open Access Journals (Sweden)

    Jennifer Kranz

    2017-08-01

    Full Text Available Objectives: The objective of this study was to compare the recurrence rate and de novo incontinence after endoscopic treatment of vesicourethral stenosis (VUS) after radical prostatectomy (RP) and of bladder neck stenosis (BNS) after transurethral resection of the prostate (TURP). Methods: Retrospective analysis of patients treated endoscopically for VUS after RP or for BNS after TURP at three German tertiary care centers between March 2009 and June 2016. Investigated endpoints were recurrence rate and de novo incontinence. Chi-squared tests and t-tests were used to model the differences between groups. Results: A total of 147 patients underwent endoscopic therapy for VUS (59.2%) or BNS (40.8%). Mean age was 68.3 years (range 44–86), mean follow-up 27.1 months (1–98). Mean time to recurrence after initial therapy was 23.9 months (1–156); mean time to recurrence after prior endoscopic therapy for VUS or BNS was 12.0 months (1–159). Patients treated for VUS had significantly more often undergone radiotherapy prior to endoscopic treatment (33.3 vs. 13.3%; p = 0.006), and the recurrence rate was significantly higher (59.8 vs. 41.7%; p = 0.031). The overall success rate of TUR for VUS was 40.2%; the success rate of TUR for BNS was 58.3%. TUR for BNS is significantly more successful (p = 0.031). The mean number of TURs for BNS vs. TUR for VUS in successful cases was 1.5 vs. 1.8, which was not significantly different. The rate of de novo incontinence was significantly higher in patients treated for VUS (13.8 vs. 1.7%; p = 0.011). After excluding those patients with radiotherapy prior to endoscopic treatment, the recurrence rate did not differ significantly between both groups (60.3% for VUS vs. 44.2% for BNS; p = 0.091), whereas the rate of de novo incontinence (13.8% for VUS vs. 0% for BNS; p = 0.005) remained significantly higher in patients treated for VUS. Conclusion: Most patients with BNS are successfully treated endoscopically. In patients with

  11. Novos biomateriais poliméricos para implantes ósseos

    OpenAIRE

    Ferreira, Sara Cristina Silva

    2016-01-01

    Damage to bone tissue represents a major health problem throughout the world. One of the main components used in tissue engineering is scaffold materials, 3D structures formed by porous matrices intended to support the development of new tissue. Many of the porous polymeric matrices (foams) used in medical applications are made of polyurethanes, which are produced from isocyanates (toxic products) and polyols. Thus, there arises the n...

  12. Recent advances, and unresolved issues, in the application of computational modelling to the prediction of the biological effects of nanomaterials

    International Nuclear Information System (INIS)

    Winkler, David A.

    2016-01-01

    Nanomaterials research is one of the fastest growing contemporary research areas. The unprecedented properties of these materials have meant that they are being incorporated into products very quickly. Regulatory agencies are concerned they cannot assess the potential hazards of these materials adequately, as data on the biological properties of nanomaterials are still relatively limited and expensive to acquire. Computational modelling methods have much to offer in helping understand the mechanisms by which toxicity may occur, and in predicting the likelihood of adverse biological impacts of materials not yet tested experimentally. This paper reviews the progress these methods, particularly those QSAR-based, have made in understanding and predicting potentially adverse biological effects of nanomaterials, and also the limitations and pitfalls of these methods. - Highlights: • Nanomaterials regulators need good information to make good decisions. • Nanomaterials and their interactions with biology are very complex. • Computational methods use existing data to predict properties of new nanomaterials. • Statistical, data driven modelling methods have been successfully applied to this task. • Much more must be learnt before robust toolkits will be widely usable by regulators.
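
    A generic QSAR-style workflow of the kind reviewed above can be sketched in a few lines: physicochemical descriptors are regressed against a toxicity endpoint with cross-validation. The descriptors, the toy endpoint and the random-forest model below are illustrative assumptions, not a validated nanotoxicity model.

```python
# Generic QSAR-style sketch for nanomaterials: physicochemical descriptors
# are regressed against a toxicity endpoint with cross-validation
# (synthetic descriptors and endpoint; purely illustrative).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 120
size_nm = rng.uniform(5, 100, n)
zeta_mv = rng.uniform(-50, 50, n)
surface_area = rng.uniform(20, 300, n)
# Toy endpoint: smaller, more positively charged particles are "more toxic".
toxicity = 5 - 0.03 * size_nm + 0.02 * zeta_mv + 0.002 * surface_area + rng.normal(0, 0.3, n)

X = np.column_stack([size_nm, zeta_mv, surface_area])
scores = cross_val_score(RandomForestRegressor(n_estimators=200, random_state=0),
                         X, toxicity, cv=5, scoring="r2")
print("cross-validated R^2:", scores.mean().round(2))
```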

  13. De Novo Assembly and Characterization of Sophora japonica Transcriptome Using RNA-seq

    Directory of Open Access Journals (Sweden)

    Liucun Zhu

    2014-01-01

    Full Text Available Sophora japonica Linn (Chinese Scholar Tree is a shrub species belonging to the subfamily Faboideae of the pea family Fabaceae. In this study, RNA sequencing of S. japonica transcriptome was performed to produce large expression datasets for functional genomic analysis. Approximate 86.1 million high-quality clean reads were generated and assembled de novo into 143010 unique transcripts and 57614 unigenes. The average length of unigenes was 901 bps with an N50 of 545 bps. Four public databases, including the NCBI nonredundant protein (NR, Swiss-Prot, Kyoto Encyclopedia of Genes and Genomes (KEGG, and the Cluster of Orthologous Groups (COG, were used to annotate unigenes through NCBI BLAST procedure. A total of 27541 of 57614 unigenes (47.8% were annotated for gene descriptions, conserved protein domains, or gene ontology. Moreover, an interaction network of unigenes in S. japonica was predicted based on known protein-protein interactions of putative orthologs of well-studied plant genomes. The transcriptome data of S. japonica reported here represents first genome-scale investigation of gene expressions in Faboideae plants. We expect that our study will provide a useful resource for further studies on gene expression, genomics, functional genomics, and protein-protein interaction in S. japonica.

  14. De novo transcriptome assembly of two different peach cultivars grown in Korea

    Directory of Open Access Journals (Sweden)

    Yeonhwa Jo

    2015-12-01

    Full Text Available Peach (Prunus persica is one of the most popular stone fruits worldwide. Next generation sequencing (NGS has facilitated genome and transcriptome analyses of several stone fruit trees. In this study, we conducted de novo transcriptome analyses of two peach cultivars grown in Korea. Leaves of two cultivars, referred to as Jangtaek and Mibaek, were harvested and used for library preparation. The two prepared libraries were paired-end sequenced by the HiSeq2000 system. We obtained 8.14 GB and 9.62 GB sequence data from Jangtaek and Mibaek (NCBI accession numbers: SRS1056585 and SRS1056587, respectively. The Trinity program was used to assemble two transcriptomes de novo, resulting in 110,477 (Jangtaek and 136,196 (Mibaek transcripts. TransDecoder identified possible coding regions in assembled transcripts. The identified proteins were subjected to BLASTP search against NCBI's non-redundant database for functional annotation. This study provides transcriptome data for two peach cultivars, which might be useful for genetic marker development and comparative transcriptome analyses.

  15. De novo assembly of the Indo-Pacific humpback dolphin leucocyte transcriptome to identify putative genes involved in the aquatic adaptation and immune response.

    Directory of Open Access Journals (Sweden)

    Duan Gui

    Full Text Available BACKGROUND: The Indo-Pacific humpback dolphin (Sousa chinensis), a marine mammal species inhabiting the waters of Southeast Asia, South Africa and Australia, has attracted much attention because of the dramatic decline in population size in the past decades, which raises concern about extinction. So far, this species is poorly characterized at the molecular level due to the little sequence information available in public databases. Recent advances in large-scale RNA sequencing provide an efficient approach to generate abundant sequences for functional genomic analyses in species with un-sequenced genomes. PRINCIPAL FINDINGS: We performed a de novo assembly of the Indo-Pacific humpback dolphin leucocyte transcriptome by Illumina sequencing. 108,751 high quality sequences from 47,840,388 paired-end reads were generated, and 48,868 and 46,587 unigenes were functionally annotated by BLAST search against the NCBI non-redundant and Swiss-Prot protein databases (E-value < 10^-5), respectively. In total, 16,467 unigenes were clustered into 25 functional categories by searching against the COG database, and BLAST2GO search assigned 37,976 unigenes to 61 GO terms. In addition, 36,345 unigenes were grouped into 258 KEGG pathways. We also identified 9,906 simple sequence repeats and 3,681 putative single nucleotide polymorphisms as potential molecular markers in our assembled sequences. A large number of unigenes were predicted to be involved in immune response, and many genes were predicted to be relevant to adaptive evolution and cetacean-specific traits. CONCLUSION: This study represents the first transcriptome analysis of the Indo-Pacific humpback dolphin, an endangered species. The de novo transcriptome analysis of the unique transcripts will provide valuable sequence information for the discovery of new genes, characterization of gene expression, investigation of various pathways and adaptive evolution, as well as identification of genetic markers.
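
    One of the marker-discovery steps mentioned above, scanning assembled transcripts for simple sequence repeats (SSRs), can be illustrated with a short regular-expression-based scanner. The minimum repeat counts below are common illustrative thresholds, not the settings used in the study, and nested hits for overlapping unit sizes are not filtered.

```python
# Minimal simple-sequence-repeat (SSR) scanner of the kind used to nominate
# molecular markers from assembled transcripts (thresholds are illustrative).
import re

def find_ssrs(seq, min_repeats=(0, 0, 6, 5, 4, 4, 4)):
    """Return (motif, repeat_count, start) for di- to hexanucleotide repeats."""
    hits = []
    for unit in range(2, 7):
        pattern = re.compile(r"(([ACGT]{%d})\2{%d,})" % (unit, min_repeats[unit] - 1))
        for match in pattern.finditer(seq.upper()):
            whole, motif = match.group(1), match.group(2)
            if len(set(motif)) > 1:                    # skip homopolymer-like motifs
                hits.append((motif, len(whole) // unit, match.start()))
    return hits

transcript = "ATGGC" + "AG" * 9 + "TTACG" + "ATC" * 7 + "GGA" + "ACGT" * 5 + "TTGA"
print(find_ssrs(transcript))
```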

  16. De novo hepatic steatosis drives atherogenic risk in liver transplantation recipients.

    Science.gov (United States)

    Idowu, Michael O; Chhatrala, Ravi; Siddiqui, M Bilal; Driscoll, Carolyn; Stravitz, R Todd; Sanyal, Arun J; Bhati, Chandra; Sargeant, Carol; Luketic, Velimir A; Sterling, Richard K; Contos, Melissa; Matherly, Scott; Puri, Puneet; Siddiqui, M Shadab

    2015-11-01

    Nonalcoholic fatty liver disease is associated with cardiovascular disease (CVD) in the general population. Despite a high prevalence of de novo hepatic steatosis after liver transplantation (LT), there are no data exploring the association between hepatic steatosis after LT and atherogenic risk. The aim of the study was to explore the impact of hepatic steatosis on serum atherogenic markers in liver transplantation recipients (LTRs). Biomarkers of CVD risk were compared in 89 LTRs with no known history of dyslipidemia, ischemic heart disease, or graft cirrhosis. To avoid potential confounders, LTRs on oral hypoglycemic agents, exogenous insulin, corticosteroids, or lipid-lowering therapy were excluded. Only patients for whom histological assessment was available after LT were included in the study. Thirty-five LTRs had de novo hepatic steatosis after LT, whereas 54 did not. Both cohorts were similar with regards to age, sex, ethnicity, and follow-up from LT. Additionally, the traditional lipid profile was similar between the 2 cohorts. LTRs with hepatic steatosis had higher serum concentrations of small-dense low-density lipoprotein cholesterol (sdLDL-C; 34.8 ± 16.9 versus 22.7 ± 11.2 mg/dL) and higher serum insulin concentrations (27.8 ± 41.8 versus 11.7 ± 7.8 uU/mL). Steatosis grade was directly related to sdLDL-C, sdLDL-P, insulin, VLDL-P, and VLDL-size. In multivariate analysis, the association between steatosis grade and sdLDL-C (β = 0.03; P = 0.029), VLDL-size (β = 0.316; P = 0.04), and low-density lipoprotein particle size (β = -0.27; P = 0.05) was independent of sex, body mass index, age, diabetes mellitus, time from transplant, and indication for LT. In conclusion, de novo hepatic steatosis after LT is associated with atherogenic lipoproteins and independent of traditional CVD risk factors. © 2015 American Association for the Study of Liver Diseases.

  17. COBRA: A Computational Brewing Application for Predicting the Molecular Composition of Organic Aerosols

    Energy Technology Data Exchange (ETDEWEB)

    Fooshee, David R.; Nguyen, Tran B.; Nizkorodov, Sergey A.; Laskin, Julia; Laskin, Alexander; Baldi, Pierre

    2012-05-08

    Atmospheric organic aerosols (OA) represent a significant fraction of airborne particulate matter and can impact climate, visibility, and human health. These mixtures are difficult to characterize experimentally due to the enormous complexity and dynamic nature of their chemical composition. We introduce a novel Computational Brewing Application (COBRA) and apply it to modeling oligomerization chemistry stemming from condensation and addition reactions of monomers pertinent to secondary organic aerosol (SOA) formed by photooxidation of isoprene. COBRA uses two lists as input: a list of chemical structures comprising the molecular starting pool, and a list of rules defining potential reactions between molecules. Reactions are performed iteratively, with products of all previous iterations serving as reactants for the next one. The simulation generated thousands of molecular structures in the mass range of 120-500 Da, and correctly predicted ~70% of the individual SOA constituents observed by high-resolution mass spectrometry (HR-MS). Selected predicted structures were confirmed with tandem mass spectrometry. Esterification and hemiacetal formation reactions were shown to play the most significant role in oligomer formation, whereas aldol condensation was shown to be insignificant. COBRA is not limited to atmospheric aerosol chemistry, but is broadly applicable to the prediction of reaction products in other complex mixtures for which reasonable reaction mechanisms and seed molecules can be supplied by experimental or theoretical methods.
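
    The iterative "brewing" idea described above can be illustrated with a toy version that tracks molecules only by exact mass and applies a single condensation rule (combine two masses and lose water) for a few iterations within a mass window. The seed masses and window below are illustrative and are not the isoprene-SOA monomer list used by COBRA.

```python
# Iterative "brewing" sketch in the spirit of COBRA: seed monomers (tracked by
# exact mass) are combined under a condensation rule (mass_a + mass_b - H2O)
# for several iterations, keeping products inside a target mass window.
H2O = 18.0106

def brew(seed_masses, n_iterations=3, mass_window=(120.0, 500.0)):
    pool = set(round(m, 4) for m in seed_masses)
    for _ in range(n_iterations):
        products = set()
        for a in pool:
            for b in pool:
                m = round(a + b - H2O, 4)            # condensation (e.g. ester/hemiacetal)
                if mass_window[0] <= m <= mass_window[1]:
                    products.add(m)
        pool |= products                             # products feed the next iteration
    return sorted(m for m in pool if mass_window[0] <= m <= mass_window[1])

# Illustrative monomer masses (roughly small polyols / acids), not the real seed list
seeds = [90.0317, 104.0473, 118.0266, 136.0372]
oligomers = brew(seeds)
print(len(oligomers), "predicted oligomer masses, e.g.", oligomers[:5])
```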

  18. Protein Sorting Prediction

    DEFF Research Database (Denmark)

    Nielsen, Henrik

    2017-01-01

    Many computational methods are available for predicting protein sorting in bacteria. When comparing them, it is important to know that they can be grouped into three fundamentally different approaches: signal-based, global-property-based and homology-based prediction. In this chapter, the strengths and drawbacks of each of these approaches are described through many examples of methods that predict secretion, integration into membranes, or subcellular locations in general. The aim of this chapter is to provide a user-level introduction to the field with a minimum of computational theory.
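
    As a caricature of the signal-based approach mentioned above, the sketch below flags a possible signal peptide from average Kyte-Doolittle hydropathy in a sliding window over the N-terminus. The window size and threshold are arbitrary illustrative choices; real predictors use far richer models.

```python
# Toy "signal-based" sorting predictor: average Kyte-Doolittle hydropathy in a
# sliding window over the N-terminus flags a possible signal peptide. The
# threshold is illustrative; real predictors use much richer models.
KD = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5, "E": -3.5,
      "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9, "M": 1.9, "F": 2.8,
      "P": -1.6, "S": -0.8, "T": -0.7, "W": -0.9, "Y": -1.3, "V": 4.2}

def has_putative_signal_peptide(protein, window=8, n_terminal=30, threshold=2.0):
    region = protein[:n_terminal].upper()
    scores = [sum(KD.get(aa, 0.0) for aa in region[i:i + window]) / window
              for i in range(len(region) - window + 1)]
    return max(scores, default=0.0) >= threshold

secreted_like = "MKKLLLAALLLVVSAAQAQDKTEEQLRAQ"      # hydrophobic core near the N-terminus
cytosolic_like = "MSDEEKRQRLEELRKEAEDNKDSTEGSK"
print("secreted-like:", has_putative_signal_peptide(secreted_like))
print("cytosolic-like:", has_putative_signal_peptide(cytosolic_like))
```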

  19. Computational models for residual creep life prediction of power plant components

    International Nuclear Information System (INIS)

    Grewal, G.S.; Singh, A.K.; Ramamoortry, M.

    2006-01-01

    All high temperature - high pressure power plant components are prone to irreversible visco-plastic deformation by the phenomenon of creep. The steady state creep response as well as the total creep life of a material is related to the operational component temperature through, respectively, the exponential and inverse exponential relationships. Minor increases in the component temperature can thus have serious consequences as far as the creep life and dimensional stability of a plant component are concerned. In high temperature steam tubing in power plants, one mechanism by which a significant temperature rise can occur is the growth of a thermally insulating oxide film on the steam side surface. In the present paper, an elegantly simple and computationally efficient technique is presented for predicting the residual creep life of steel components subjected to continual steam side oxide film growth. Similarly, fabrication of high temperature power plant components involves extensive use of welding as the fabrication process of choice. Naturally, issues related to the creep life of weldments have to be seriously addressed for safe and continual operation of the welded plant component. Unfortunately, a typical weldment in an engineering structure is a zone of complex microstructural gradation comprising a number of distinct sub-zones with distinct meso-scale and micro-scale morphology of the phases and (even) chemistry, and its creep life prediction presents considerable challenges. The present paper presents a stochastic algorithm, which can be used for developing experimental creep-cavitation intensity versus residual life correlations for welded structures. Apart from estimates of the residual life in a mean field sense, the model can be used for predicting the reliability of the plant component in a rigorous probabilistic setting. (author)
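
    The oxide-growth mechanism described above can be illustrated with a simple life-fraction calculation: an insulating steam-side oxide grows parabolically, raises the metal temperature, and shortens the rupture life given by an Arrhenius-type law. All constants in the Python sketch are invented for illustration and are not the paper's models.

```python
# Sketch of residual creep-life estimation for a steam tube whose metal
# temperature rises as an insulating steam-side oxide grows. The parabolic
# oxide-growth constant, temperature rise per micron and Arrhenius-type
# rupture-life law below are illustrative assumptions, not the paper's values.
import numpy as np

def creep_life_hours(temp_k):
    # inverse-exponential dependence of rupture life on temperature (illustrative)
    return 1e-8 * np.exp(25000.0 / temp_k)

def residual_life(base_temp_k=823.0, k_oxide=5e-3, dT_per_um=0.5, dt_hours=1000.0):
    """Accumulate life fractions (Robinson's rule) until the damage sum reaches 1."""
    consumed, hours = 0.0, 0.0
    while consumed < 1.0 and hours < 5e5:
        oxide_um = np.sqrt(k_oxide * hours)             # parabolic oxide growth
        temp = base_temp_k + dT_per_um * oxide_um       # metal temperature rise
        consumed += dt_hours / creep_life_hours(temp)   # damage fraction for this block
        hours += dt_hours
    return hours

print(f"estimated life at constant temperature: {creep_life_hours(823.0):,.0f} h")
print(f"estimated life with oxide-driven temperature rise: {residual_life():,.0f} h")
```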

  20. Ética, política e direito brasileiro: reflexões para um novo senso

    Directory of Open Access Journals (Sweden)

    Adriano Monteiro Madruga

    2010-12-01

    Full Text Available The aim of this article is to question the emphatic sense currently given to ethics in the Brazilian political scene and its reflections in law, since historically politics and law have always been centered on the interests of great personages and events (Hegel); to counter this perception, the history of Brazil is considered as tragedy repeated as farce (Marx) together with the sense of Pasárgada (Santos). On this basis, ethics is re-thought as political dialogue (Plato) and as the universality of laws (Cicero) in present-day Brazilian law, oriented toward citizenship through vanguard movements reflected in new social mirrors for a new common sense (Santos).