WorldWideScience

Sample records for large-scale phenotypic screening

  1. Large-scale image-based profiling of single-cell phenotypes in arrayed CRISPR-Cas9 gene perturbation screens.

    Science.gov (United States)

    de Groot, Reinoud; Lüthi, Joel; Lindsay, Helen; Holtackers, René; Pelkmans, Lucas

    2018-01-23

    High-content imaging using automated microscopy and computer vision allows multivariate profiling of single-cell phenotypes. Here, we present methods for the application of the CRISPR-Cas9 system in large-scale, image-based, gene perturbation experiments. We show that CRISPR-Cas9-mediated gene perturbation can be achieved in human tissue culture cells in a timeframe that is compatible with image-based phenotyping. We developed a pipeline to construct a large-scale arrayed library of 2,281 sequence-verified CRISPR-Cas9 targeting plasmids and profiled this library for genes affecting cellular morphology and the subcellular localization of components of the nuclear pore complex (NPC). We conceived a machine-learning method that harnesses genetic heterogeneity to score gene perturbations and identify phenotypically perturbed cells for in-depth characterization of gene perturbation effects. This approach enables genome-scale image-based multivariate gene perturbation profiling using CRISPR-Cas9. © 2018 The Authors. Published under the terms of the CC BY 4.0 license.

  2. Active Learning Strategies for Phenotypic Profiling of High-Content Screens.

    Science.gov (United States)

    Smith, Kevin; Horvath, Peter

    2014-06-01

    High-content screening is a powerful method to discover new drugs and carry out basic biological research. Increasingly, high-content screens have come to rely on supervised machine learning (SML) to perform automatic phenotypic classification as an essential step of the analysis. However, this comes at a cost, namely, the labeled examples required to train the predictive model. Classification performance increases with the number of labeled examples, and because labeling examples demands time from an expert, the training process represents a significant time investment. Active learning strategies attempt to overcome this bottleneck by presenting the most relevant examples to the annotator, thereby achieving high accuracy while minimizing the cost of obtaining labeled data. In this article, we investigate the impact of active learning on single-cell-based phenotype recognition, using data from three large-scale RNA interference high-content screens representing diverse phenotypic profiling problems. We consider several combinations of active learning strategies and popular SML methods. Our results show that active learning significantly reduces the time cost and can be used to reveal the same phenotypic targets identified using SML. We also identify combinations of active learning strategies and SML methods which perform better than others on the phenotypic profiling problems we studied. © 2014 Society for Laboratory Automation and Screening.
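A minimal sketch of the strategy this abstract evaluates, an uncertainty-sampling loop around a supervised classifier, assuming scikit-learn and a toy two-cluster dataset standing in for real single-cell features (the batch size, round count, and data are illustrative, not taken from the study):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def uncertainty_sampling(model, X_unlabeled, batch_size):
    """Pick the unlabeled samples the model is least confident about
    (least-confident strategy: smallest maximum class probability)."""
    proba = model.predict_proba(X_unlabeled)
    uncertainty = 1.0 - proba.max(axis=1)
    return np.argsort(uncertainty)[-batch_size:]

rng = np.random.default_rng(0)
# Two Gaussian "phenotype" clusters as stand-ins for single-cell features.
X = np.vstack([rng.normal(-1, 1, (200, 2)), rng.normal(1, 1, (200, 2))])
y = np.array([0] * 200 + [1] * 200)

labeled = list(range(0, 400, 80))        # tiny initial labeled set
unlabeled = [i for i in range(400) if i not in labeled]

for _ in range(5):                       # five simulated annotation rounds
    model = LogisticRegression().fit(X[labeled], y[labeled])
    picks = uncertainty_sampling(model, X[unlabeled], batch_size=10)
    newly = [unlabeled[i] for i in picks]
    labeled += newly                     # the "annotator" supplies labels
    unlabeled = [i for i in unlabeled if i not in newly]
```

In each round the annotator labels only the ten most ambiguous cells rather than a random sample, which is the cost-saving the abstract quantifies.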

  3. A deep learning and novelty detection framework for rapid phenotyping in high-content screening

    Science.gov (United States)

    Sommer, Christoph; Hoefler, Rudolf; Samwer, Matthias; Gerlich, Daniel W.

    2017-01-01

    Supervised machine learning is a powerful and widely used method for analyzing high-content screening data. Despite its accuracy, efficiency, and versatility, supervised machine learning has drawbacks, most notably its dependence on a priori knowledge of expected phenotypes and time-consuming classifier training. We provide a solution to these limitations with CellCognition Explorer, a generic novelty detection and deep learning framework. Application to several large-scale screening data sets on nuclear and mitotic cell morphologies demonstrates that CellCognition Explorer enables discovery of rare phenotypes without user training, which has broad implications for improved assay development in high-content screening. PMID:28954863

  4. Ultra-high frequency ultrasound biomicroscopy and high throughput cardiovascular phenotyping in a large scale mouse mutagenesis screen

    Science.gov (United States)

    Liu, Xiaoqin; Francis, Richard; Tobita, Kimimasa; Kim, Andy; Leatherbury, Linda; Lo, Cecilia W.

    2013-02-01

    Ultrasound biomicroscopy (UBM) is ideally suited for phenotyping fetal mice for congenital heart disease (CHD), as imaging can be carried out noninvasively to provide both hemodynamic and structural information essential for CHD diagnosis. Using UBM (Vevo 2100; 40 MHz) in conjunction with a clinical ultrasound system (Acuson Sequoia C512; 15 MHz), we developed a two-step screening protocol to scan thousands of fetuses derived from ENU-mutagenized pedigrees. A wide spectrum of CHD was detected by UBM and subsequently confirmed by necropsy and histopathological examination with episcopic fluorescence image capture. The CHD observed included outflow anomalies, left/right heart obstructive lesions, septal/valvular defects, and cardiac situs anomalies. Various extracardiac defects were also found, such as polydactyly, craniofacial defects, exencephaly, and omphalocele-cleft palate, most of which were associated with cardiac defects. Our analyses showed that UBM was better at assessing cardiac structure and blood flow profiles, while conventional ultrasound allowed higher-throughput, lower-resolution screening. Our study shows that integrating conventional clinical ultrasound imaging with UBM for fetal mouse cardiovascular phenotyping can maximize the detection and recovery of CHD mutants.

  5. Large-scale DNA Barcode Library Generation for Biomolecule Identification in High-throughput Screens.

    Science.gov (United States)

    Lyons, Eli; Sheridan, Paul; Tremmel, Georg; Miyano, Satoru; Sugano, Sumio

    2017-10-24

    High-throughput screens allow for the identification of specific biomolecules with characteristics of interest. In barcoded screens, DNA barcodes are linked to target biomolecules in a manner allowing for the target molecules making up a library to be identified by sequencing the DNA barcodes using Next Generation Sequencing. To be useful in experimental settings, the DNA barcodes in a library must satisfy certain constraints related to GC content, homopolymer length, Hamming distance, and blacklisted subsequences. Here we report a novel framework to quickly generate large-scale libraries of DNA barcodes for use in high-throughput screens. We show that our framework dramatically reduces the computation time required to generate large-scale DNA barcode libraries, compared with a naïve approach to DNA barcode library generation. As a proof of concept, we demonstrate that our framework is able to generate a library consisting of one million DNA barcodes for use in a fragment antibody phage display screening experiment. We also report generating a general purpose one billion DNA barcode library, the largest such library yet reported in the literature. Our results demonstrate the value of our novel large-scale DNA barcode library generation framework for use in high-throughput screening applications.
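The constraints listed in this abstract (GC content, homopolymer length, Hamming distance, blacklisted subsequences) can be checked with a brute-force filter. The sketch below is the naïve enumeration approach the authors improve upon, not their fast framework; the barcode length, constraint values, and blacklist motif are hypothetical:

```python
from itertools import product

def gc_content(seq):
    return (seq.count("G") + seq.count("C")) / len(seq)

def max_homopolymer(seq):
    # Length of the longest run of identical consecutive bases.
    best = run = 1
    for a, b in zip(seq, seq[1:]):
        run = run + 1 if a == b else 1
        best = max(best, run)
    return best

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def generate_barcodes(length=6, n=10, min_dist=2,
                      gc_range=(0.4, 0.6), max_run=2,
                      blacklist=("ACCA",)):  # hypothetical primer-like motif
    accepted = []
    for cand in ("".join(p) for p in product("ACGT", repeat=length)):
        if not gc_range[0] <= gc_content(cand) <= gc_range[1]:
            continue
        if max_homopolymer(cand) > max_run:
            continue
        if any(bad in cand for bad in blacklist):
            continue
        # Keep only candidates far enough from every accepted barcode.
        if all(hamming(cand, b) >= min_dist for b in accepted):
            accepted.append(cand)
            if len(accepted) == n:
                break
    return accepted

codes = generate_barcodes()
```

The pairwise Hamming check makes this quadratic in library size, which is why a smarter construction is needed at the million-to-billion scale the paper targets.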

  6. Fragment-based screening in tandem with phenotypic screening provides novel antiparasitic hits.

    Science.gov (United States)

    Blaazer, Antoni R; Orrling, Kristina M; Shanmugham, Anitha; Jansen, Chimed; Maes, Louis; Edink, Ewald; Sterk, Geert Jan; Siderius, Marco; England, Paul; Bailey, David; de Esch, Iwan J P; Leurs, Rob

    2015-01-01

    Methods to discover biologically active small molecules include target-based and phenotypic screening approaches. One of the main difficulties in drug discovery is elucidating and exploiting the relationship between drug activity at the protein target and disease modification, a phenotypic endpoint. Fragment-based drug discovery is a target-based approach that typically involves the screening of a relatively small number of fragment-like (molecular weight <300) molecules that efficiently cover chemical space. Here, we report a fragment screening on TbrPDEB1, an essential cyclic nucleotide phosphodiesterase (PDE) from Trypanosoma brucei, and human PDE4D, an off-target, in a workflow in which fragment hits and a series of close analogs are subsequently screened for antiparasitic activity in a phenotypic panel. The phenotypic panel contained T. brucei, Trypanosoma cruzi, Leishmania infantum, and Plasmodium falciparum, the causative agents of human African trypanosomiasis (sleeping sickness), Chagas disease, leishmaniasis, and malaria, respectively, as well as MRC-5 human lung cells. This hybrid screening workflow has resulted in the discovery of various benzhydryl ethers with antiprotozoal activity and low toxicity, representing interesting starting points for further antiparasitic optimization. © 2014 Society for Laboratory Automation and Screening.

  7. A Fully Automated High-Throughput Flow Cytometry Screening System Enabling Phenotypic Drug Discovery.

    Science.gov (United States)

    Joslin, John; Gilligan, James; Anderson, Paul; Garcia, Catherine; Sharif, Orzala; Hampton, Janice; Cohen, Steven; King, Miranda; Zhou, Bin; Jiang, Shumei; Trussell, Christopher; Dunn, Robert; Fathman, John W; Snead, Jennifer L; Boitano, Anthony E; Nguyen, Tommy; Conner, Michael; Cooke, Mike; Harris, Jennifer; Ainscow, Ed; Zhou, Yingyao; Shaw, Chris; Sipes, Dan; Mainquist, James; Lesley, Scott

    2018-05-01

    The goal of high-throughput screening is to enable screening of compound libraries in an automated manner to identify quality starting points for optimization. This often involves screening a large diversity of compounds in an assay that preserves a connection to the disease pathology. Phenotypic screening is a powerful tool for drug identification, in that assays can be run without prior understanding of the target and with primary cells that closely mimic the therapeutic setting. Advanced automation and high-content imaging have enabled many complex assays, but these are still relatively slow and low throughput. To address this limitation, we have developed an automated workflow that is dedicated to processing complex phenotypic assays for flow cytometry. The system can achieve a throughput of 50,000 wells per day, resulting in a fully automated platform that enables robust phenotypic drug discovery. Over the past 5 years, this screening system has been used for a variety of drug discovery programs, across many disease areas, with many molecules advancing quickly into preclinical development and into the clinic. This report will highlight a diversity of approaches that automated flow cytometry has enabled for phenotypic drug discovery.

  8. The health system and population health implications of large-scale diabetes screening in India: a microsimulation model of alternative approaches.

    Directory of Open Access Journals (Sweden)

    Sanjay Basu

    2015-05-01

    Like a growing number of rapidly developing countries, India has begun to develop a system for large-scale community-based screening for diabetes. We sought to identify the implications of using alternative screening instruments to detect people with undiagnosed type 2 diabetes among diverse populations across India. We developed and validated a microsimulation model that incorporated data from 58 studies from across the country into a nationally representative sample of Indians aged 25-65 years. We estimated the diagnostic and health system implications of three major survey-based screening instruments and random glucometer-based screening. Of the 567 million Indians eligible for screening, depending on which of the four screening approaches is utilized, between 158 and 306 million would be expected to screen as "high risk" for type 2 diabetes and be referred for confirmatory testing. Between 26 million and 37 million of these people would be expected to meet international diagnostic criteria for diabetes, but between 126 million and 273 million would be "false positives." The ratio of false positives to true positives varied from 3.9 (when using random glucose screening) to 8.2 (when using a survey-based screening instrument) in our model. The cost per case found would be expected to range from US$5.28 (when using random glucose screening) to US$17.06 (when using a survey-based screening instrument), representing a total cost of between US$169 million and US$567 million. The major limitation of our analysis is its dependence on published cohort studies that are unlikely to fully capture the poorest and most rural areas of the country. Because these areas are thought to have the lowest diabetes prevalence, this may result in overestimation of the efficacy and health benefits of screening. Large-scale community-based screening is anticipated to produce a large number of false-positive results, particularly if using currently available survey-based screening instruments.
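The ratio and cost figures in this abstract follow from simple arithmetic on the screen-positive and true-positive counts. The sketch below uses illustrative round numbers in the spirit of the random-glucose scenario, not the model's actual outputs:

```python
def screening_summary(n_screen_positive, n_true_positive, cost_per_case_usd):
    """Derive false positives, FP:TP ratio, and total program cost
    from headline screening counts (all inputs are illustrative)."""
    n_false_positive = n_screen_positive - n_true_positive
    fp_tp_ratio = n_false_positive / n_true_positive
    total_cost = n_true_positive * cost_per_case_usd
    return n_false_positive, fp_tp_ratio, total_cost

# Hypothetical glucometer-style scenario: 160M referred, 32M true cases,
# US$5.28 per case found (the per-case cost quoted in the abstract).
fp, ratio, cost = screening_summary(160_000_000, 32_000_000, 5.28)
```

Even a low per-case cost multiplies into a nine-figure program cost once tens of millions of true cases are found, which is the trade-off the abstract quantifies.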

  9. Screening and large-scale expression of membrane proteins in mammalian cells for structural studies.

    Science.gov (United States)

    Goehring, April; Lee, Chia-Hsueh; Wang, Kevin H; Michel, Jennifer Carlisle; Claxton, Derek P; Baconguis, Isabelle; Althoff, Thorsten; Fischer, Suzanne; Garcia, K Christopher; Gouaux, Eric

    2014-11-01

    Structural, biochemical and biophysical studies of eukaryotic membrane proteins are often hampered by difficulties in overexpression of the candidate molecule. Baculovirus transduction of mammalian cells (BacMam), although a powerful method to heterologously express membrane proteins, can be cumbersome for screening and expression of multiple constructs. We therefore developed plasmid Eric Gouaux (pEG) BacMam, a vector optimized for use in screening assays, as well as for efficient production of baculovirus and robust expression of the target protein. In this protocol, we show how to use small-scale transient transfection and fluorescence-detection size-exclusion chromatography (FSEC) experiments using a GFP-His8-tagged candidate protein to screen for monodispersity and expression level. Once promising candidates are identified, we describe how to generate baculovirus, transduce HEK293S GnTI(-) (N-acetylglucosaminyltransferase I-negative) cells in suspension culture and overexpress the candidate protein. We have used these methods to prepare pure samples of chicken acid-sensing ion channel 1a (cASIC1) and Caenorhabditis elegans glutamate-gated chloride channel (GluCl) for X-ray crystallography, demonstrating how to rapidly and efficiently screen hundreds of constructs and accomplish large-scale expression in 4-6 weeks.

  10. The Chado Natural Diversity module: a new generic database schema for large-scale phenotyping and genotyping data.

    Science.gov (United States)

    Jung, Sook; Menda, Naama; Redmond, Seth; Buels, Robert M; Friesen, Maren; Bendana, Yuri; Sanderson, Lacey-Anne; Lapp, Hilmar; Lee, Taein; MacCallum, Bob; Bett, Kirstin E; Cain, Scott; Clements, Dave; Mueller, Lukas A; Main, Dorrie

    2011-01-01

    Linking phenotypic with genotypic diversity has become a major requirement for basic and applied genome-centric biological research. To meet this need, a comprehensive database backend for efficiently storing, querying and analyzing large experimental data sets is necessary. Chado, a generic, modular, community-based database schema is widely used in the biological community to store information associated with genome sequence data. To meet the need to also accommodate large-scale phenotyping and genotyping projects, a new Chado module called Natural Diversity has been developed. The module strictly adheres to the Chado remit of being generic and ontology driven. The flexibility of the new module is demonstrated in its capacity to store any type of experiment that either uses or generates specimens or stock organisms. Experiments may be grouped or structured hierarchically, whereas any kind of biological entity can be stored as the observed unit, from a specimen to be used in genotyping or phenotyping experiments, to a group of species collected in the field that will undergo further lab analysis. We describe details of the Natural Diversity module, including the design approach, the relational schema and use cases implemented in several databases.

  11. How Phenotypic Screening Influenced Drug Discovery: Lessons from Five Years of Practice.

    Science.gov (United States)

    Haasen, Dorothea; Schopfer, Ulrich; Antczak, Christophe; Guy, Chantale; Fuchs, Florian; Selzer, Paul

    Since 2011, phenotypic screening has been a trend in the pharmaceutical industry as well as in academia. This renaissance was triggered by analyses that suggested that phenotypic screening is a superior strategy to discover first-in-class drugs. Despite these promises and considerable investments, pharmaceutical research organizations have encountered considerable challenges with the approach. Few success stories have emerged in the past 5 years and companies are questioning their investment in this area. In this contribution, we outline what we have learned about success factors and challenges of phenotypic screening. We then describe how our efforts in phenotypic screening have influenced our approach to drug discovery in general. We predict that concepts from phenotypic screening will be incorporated into target-based approaches and will thus remain influential beyond the current trend.

  12. Chemically intuited, large-scale screening of MOFs by machine learning techniques

    Science.gov (United States)

    Borboudakis, Giorgos; Stergiannakos, Taxiarchis; Frysali, Maria; Klontzas, Emmanuel; Tsamardinos, Ioannis; Froudakis, George E.

    2017-10-01

    A novel computational methodology for large-scale screening of MOFs is applied to gas storage with the use of machine learning technologies. This approach is a promising trade-off between the accuracy of ab initio methods and the speed of classical approaches, strategically combined with chemical intuition. The results demonstrate that the chemical properties of MOFs are indeed predictable (stochastically, not deterministically) using machine learning methods and automated analysis protocols, with the accuracy of predictions increasing with sample size. Our initial results indicate that this methodology is promising not only for gas storage in MOFs but also for many other materials science projects.
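The observation that prediction accuracy increases with sample size can be illustrated with a generic supervised-learning sketch. The synthetic "descriptors", the target function, and the random-forest model below are illustrative assumptions, not the authors' methodology:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
# Synthetic stand-ins for MOF descriptors (e.g. pore metrics) and a
# target property that depends on them nonlinearly, plus small noise.
X = rng.uniform(0, 1, (2000, 4))
y = 3 * X[:, 0] + np.sin(4 * X[:, 1]) + X[:, 2] * X[:, 3] \
    + rng.normal(0, 0.05, 2000)

X_test, y_test = X[1500:], y[1500:]   # held-out evaluation set

results = {}
for n in (50, 1000):                  # small vs. large training sample
    model = RandomForestRegressor(n_estimators=100, random_state=0)
    model.fit(X[:n], y[:n])
    results[n] = float(np.mean(np.abs(model.predict(X_test) - y_test)))
```

With a fixed test set, the mean absolute error of the model trained on 1,000 samples comes out below that of the 50-sample model, mirroring the sample-size effect the abstract reports.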

  13. EMPReSS: European mouse phenotyping resource for standardized screens.

    Science.gov (United States)

    Green, Eain C J; Gkoutos, Georgios V; Lad, Heena V; Blake, Andrew; Weekes, Joseph; Hancock, John M

    2005-06-15

    Standardized phenotyping protocols are essential for the characterization of phenotypes so that results are comparable between different laboratories and phenotypic data can be related to ontological descriptions in an automated manner. We describe a web-based resource for the visualization, searching and downloading of standard operating procedures and other documents, the European Mouse Phenotyping Resource for Standardized Screens (EMPReSS). Direct access: http://www.empress.har.mrc.ac.uk. Contact: e.green@har.mrc.ac.uk.

  14. Pilot study of large-scale production of mutant pigs by ENU mutagenesis.

    Science.gov (United States)

    Hai, Tang; Cao, Chunwei; Shang, Haitao; Guo, Weiwei; Mu, Yanshuang; Yang, Shulin; Zhang, Ying; Zheng, Qiantao; Zhang, Tao; Wang, Xianlong; Liu, Yu; Kong, Qingran; Li, Kui; Wang, Dayu; Qi, Meng; Hong, Qianlong; Zhang, Rui; Wang, Xiupeng; Jia, Qitao; Wang, Xiao; Qin, Guosong; Li, Yongshun; Luo, Ailing; Jin, Weiwu; Yao, Jing; Huang, Jiaojiao; Zhang, Hongyong; Li, Menghua; Xie, Xiangmo; Zheng, Xuejuan; Guo, Kenan; Wang, Qinghua; Zhang, Shibin; Li, Liang; Xie, Fei; Zhang, Yu; Weng, Xiaogang; Yin, Zhi; Hu, Kui; Cong, Yimei; Zheng, Peng; Zou, Hailong; Xin, Leilei; Xia, Jihan; Ruan, Jinxue; Li, Hegang; Zhao, Weiming; Yuan, Jing; Liu, Zizhan; Gu, Weiwang; Li, Ming; Wang, Yong; Wang, Hongmei; Yang, Shiming; Liu, Zhonghua; Wei, Hong; Zhao, Jianguo; Zhou, Qi; Meng, Anming

    2017-06-22

    N-ethyl-N-nitrosourea (ENU) mutagenesis is a powerful tool to generate mutants on a large scale efficiently and to discover genes with novel functions at the whole-genome level in Caenorhabditis elegans, flies, zebrafish and mice, but it has never been tried in large model animals. We describe a successful systematic three-generation ENU mutagenesis screen in pigs with the establishment of the Chinese Swine Mutagenesis Consortium. A total of 6,770 G1 and 6,800 G3 pigs were screened, and 36 dominant and 91 recessive novel pig families with various phenotypes were established. The causative mutations in 10 mutant families were further mapped. As examples, the mutation of SOX10 (R109W) in pig causes inner ear malfunction and mimics human Mondini dysplasia, and upregulated expression of FBXO32 is associated with congenital splay legs. This study demonstrates the feasibility of artificial random mutagenesis in pigs and opens an avenue for generating a reservoir of mutants for agricultural production and biomedical research.

  15. Optimization of a Fluorescence-Based Assay for Large-Scale Drug Screening against Babesia and Theileria Parasites.

    Science.gov (United States)

    Rizk, Mohamed Abdo; El-Sayed, Shimaa Abd El-Salam; Terkawi, Mohamed Alaa; Youssef, Mohamed Ahmed; El Said, El Said El Shirbini; Elsayed, Gehad; El-Khodery, Sabry; El-Ashker, Maged; Elsify, Ahmed; Omar, Mosaab; Salama, Akram; Yokoyama, Naoaki; Igarashi, Ikuo

    2015-01-01

    A rapid and accurate assay for evaluating antibabesial drugs on a large scale is required for the discovery of novel chemotherapeutic agents against Babesia parasites. In the current study, we evaluated the usefulness of a fluorescence-based assay for determining the efficacies of antibabesial compounds against bovine and equine hemoparasites in in vitro cultures. Three different hematocrits (HCTs; 2.5%, 5%, and 10%) were used without daily replacement of the medium. The results of a high-throughput screening assay revealed that the best HCT was 2.5% for bovine Babesia parasites and 5% for equine Babesia and Theileria parasites. The IC50 values of diminazene aceturate obtained by fluorescence and microscopy did not differ significantly. Likewise, the IC50 values of luteolin, pyronaridine tetraphosphate, nimbolide, gedunin, and enoxacin did not differ between the two methods. In conclusion, our fluorescence-based assay uses low HCT and does not require daily replacement of culture medium, making it highly suitable for in vitro large-scale drug screening against Babesia and Theileria parasites that infect cattle and horses.
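IC50 values such as those compared in this abstract are typically obtained by fitting a four-parameter logistic (Hill) curve to a dose-response series. The sketch below fits synthetic fluorescence readings; the dose range, parameter values, and noise level are hypothetical, and this is not the authors' analysis pipeline:

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic4(dose, bottom, top, ic50, hill):
    """Four-parameter logistic dose-response model."""
    return bottom + (top - bottom) / (1 + (dose / ic50) ** hill)

# Hypothetical drug dilution series (µM) and a simulated fluorescence
# readout generated from known parameters plus measurement noise.
doses = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])
rng = np.random.default_rng(1)
signal = logistic4(doses, 5.0, 100.0, 0.5, 1.2) + rng.normal(0, 1.0, doses.size)

# Bounded fit keeps ic50 and hill positive so the power term stays defined.
popt, _ = curve_fit(
    logistic4, doses, signal,
    p0=[signal.min(), signal.max(), 1.0, 1.0],
    bounds=([0.0, 0.0, 1e-3, 0.1], [50.0, 200.0, 100.0, 5.0]),
)
fitted_ic50 = popt[2]
```

Running the same fit on fluorescence and on microscopy counts, then comparing the fitted IC50s, is the kind of method agreement the abstract reports for diminazene aceturate and the other compounds.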

  16. Optimization of a Fluorescence-Based Assay for Large-Scale Drug Screening against Babesia and Theileria Parasites.

    Directory of Open Access Journals (Sweden)

    Mohamed Abdo Rizk

    A rapid and accurate assay for evaluating antibabesial drugs on a large scale is required for the discovery of novel chemotherapeutic agents against Babesia parasites. In the current study, we evaluated the usefulness of a fluorescence-based assay for determining the efficacies of antibabesial compounds against bovine and equine hemoparasites in in vitro cultures. Three different hematocrits (HCTs; 2.5%, 5%, and 10%) were used without daily replacement of the medium. The results of a high-throughput screening assay revealed that the best HCT was 2.5% for bovine Babesia parasites and 5% for equine Babesia and Theileria parasites. The IC50 values of diminazene aceturate obtained by fluorescence and microscopy did not differ significantly. Likewise, the IC50 values of luteolin, pyronaridine tetraphosphate, nimbolide, gedunin, and enoxacin did not differ between the two methods. In conclusion, our fluorescence-based assay uses low HCT and does not require daily replacement of culture medium, making it highly suitable for in vitro large-scale drug screening against Babesia and Theileria parasites that infect cattle and horses.

  17. Characterizing Protein Interactions Employing a Genome-Wide siRNA Cellular Phenotyping Screen

    Science.gov (United States)

    Suratanee, Apichat; Schaefer, Martin H.; Betts, Matthew J.; Soons, Zita; Mannsperger, Heiko; Harder, Nathalie; Oswald, Marcus; Gipp, Markus; Ramminger, Ellen; Marcus, Guillermo; Männer, Reinhard; Rohr, Karl; Wanker, Erich; Russell, Robert B.; Andrade-Navarro, Miguel A.; Eils, Roland; König, Rainer

    2014-01-01

    Characterizing the activating and inhibiting effect of protein-protein interactions (PPI) is fundamental to gain insight into the complex signaling system of a human cell. A plethora of methods has been suggested to infer PPI from data on a large scale, but none of them is able to characterize the effect of these interactions. Here, we present a novel computational development that employs mitotic phenotypes of a genome-wide RNAi knockdown screen and enables identifying the activating and inhibiting effects of PPIs. As an example, we applied our technique to a knockdown screen of HeLa cells cultivated at standard conditions. Using a machine learning approach, we obtained high accuracy (82% AUC of the receiver operating characteristics) by cross-validation using 6,870 known activating and inhibiting PPIs as gold standard. We predicted de novo unknown activating and inhibiting effects for 1,954 PPIs in HeLa cells covering the ten major signaling pathways of the Kyoto Encyclopedia of Genes and Genomes, and made these predictions publicly available in a database. We finally demonstrate that the predicted effects can be used to cluster knockdown genes of similar biological processes in coherent subgroups. The characterization of the activating or inhibiting effect of individual PPIs opens up new perspectives for the interpretation of large datasets of PPIs and thus considerably increases the value of PPIs as an integrated resource for studying the detailed function of signaling pathways of the cellular system of interest. PMID:25255318

  18. Transcriptome sequencing of two phenotypic mosaic Eucalyptus trees reveals large scale transcriptome re-modelling.

    Directory of Open Access Journals (Sweden)

    Amanda Padovan

    Phenotypic mosaic trees offer an ideal system for studying differential gene expression. We have investigated two mosaic eucalypt trees from two closely related species (Eucalyptus melliodora and E. sideroxylon), each of which supports two types of leaves: one part of the canopy is resistant to insect herbivory and the remaining leaves are susceptible. Driving this ecological distinction are differences in plant secondary metabolites. We used these phenotypic mosaics to investigate genome-wide patterns of foliar gene expression with the aim of identifying patterns of differential gene expression and the somatic mutation(s) that lead to this phenotypic mosaicism. We sequenced the mRNA pool from leaves of the resistant and susceptible ecotypes from both mosaic eucalypts using the Illumina HiSeq 2000 platform. We found large differences in pathway regulation and gene expression between the ecotypes of each mosaic. The expression of the genes in the MVA and MEP pathways is reflected by variation in leaf chemistry; however, this is not the case for the terpene synthases. Apart from the terpene biosynthetic pathway, there are several other metabolic pathways that are differentially regulated between the two ecotypes, suggesting there is much more phenotypic diversity than has been described. Despite the close relationship between the two species, they show large differences in the global patterns of gene and pathway regulation.

  19. Coordinated phenotype switching with large-scale chromosome flip-flop inversion observed in bacteria.

    Science.gov (United States)

    Cui, Longzhu; Neoh, Hui-min; Iwamoto, Akira; Hiramatsu, Keiichi

    2012-06-19

    Genome inversions are ubiquitous in organisms ranging from prokaryotes to eukaryotes. Typical examples can be identified by comparing the genomes of two or more closely related organisms, where genome inversion footprints are clearly visible. Although the evolutionary implications of this phenomenon are huge, little is known about the function and biological meaning of this process. Here, we report our findings on a bacterium that generates a reversible, large-scale inversion of its chromosome (about half of its total genome) at high frequencies of up to once every four generations. This inversion switches on or off bacterial phenotypes, including colony morphology, antibiotic susceptibility, hemolytic activity, and expression of dozens of genes. Quantitative measurements and mathematical analyses indicate that this reversible switching is stochastic but self-organized so as to maintain two forms of stable cell populations (i.e., small colony variant, normal colony variant) as a bet-hedging strategy. Thus, this heritable and reversible genome fluctuation seems to govern the bacterial life cycle; it has a profound impact on the course and outcomes of bacterial infections.

  20. Crowdsourced geometric morphometrics enable rapid large-scale collection and analysis of phenotypic data

    OpenAIRE

    Chang, Jonathan

    2015-01-01

    1. Advances in genomics and informatics have enabled the production of large phylogenetic trees. However, the ability to collect large phenotypic datasets has not kept pace. 2. Here, we present a method to quickly and accurately gather morphometric data using crowdsourced image-based landmarking. 3. We find that crowdsourced workers perform similarly to experienced morphologists on the same digitization tasks. We also demonstrate the speed and accuracy of our method on seven families of ray-f...

  1. iBeetle-Base: a database for RNAi phenotypes in the red flour beetle Tribolium castaneum.

    Science.gov (United States)

    Dönitz, Jürgen; Schmitt-Engel, Christian; Grossmann, Daniela; Gerischer, Lizzy; Tech, Maike; Schoppmeier, Michael; Klingler, Martin; Bucher, Gregor

    2015-01-01

    The iBeetle-Base (http://ibeetle-base.uni-goettingen.de) makes available annotations of RNAi phenotypes, which were gathered in a large-scale RNAi screen in the red flour beetle Tribolium castaneum (iBeetle screen). In addition, it provides access to sequence information and links for all Tribolium castaneum genes. The iBeetle-Base contains the annotations of phenotypes of several thousand genes knocked down during embryonic and metamorphic epidermis and muscle development, in addition to phenotypes linked to oogenesis and stink gland biology. The phenotypes are described according to the EQM (entity, quality, modifier) system using controlled vocabularies and the Tribolium morphological ontology (TrOn). Furthermore, images linked to the respective annotations are provided. The data are searchable either for specific phenotypes using a complex 'search for morphological defects' or a 'quick search' for gene names and IDs. The red flour beetle Tribolium castaneum has become an important model system for insect functional genetics and is a representative of the most species-rich taxon, the Coleoptera, which comprises several devastating pests. It is used for studying insect-typical development, the evolution of development, and for research on metabolism and pest control. Besides Drosophila, Tribolium is the first insect model organism in which large-scale unbiased screens have been performed. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  2. Socio-Cognitive Phenotypes Differentially Modulate Large-Scale Structural Covariance Networks.

    Science.gov (United States)

    Valk, Sofie L; Bernhardt, Boris C; Böckler, Anne; Trautwein, Fynn-Mathis; Kanske, Philipp; Singer, Tania

    2017-02-01

    Functional neuroimaging studies have suggested the existence of 2 largely distinct social cognition networks, one for theory of mind (taking others' cognitive perspective) and another for empathy (sharing others' affective states). To address whether these networks can also be dissociated at the level of brain structure, we combined behavioral phenotyping across multiple socio-cognitive tasks with 3-Tesla MRI cortical thickness and structural covariance analysis in 270 healthy adults, recruited across 2 sites. Regional thickness mapping only provided partial support for divergent substrates, highlighting that individual differences in empathy relate to left insular-opercular thickness while no correlation between thickness and mentalizing scores was found. Conversely, structural covariance analysis showed clearly divergent network modulations by socio-cognitive and -affective phenotypes. Specifically, individual differences in theory of mind related to structural integration between temporo-parietal and dorsomedial prefrontal regions while empathy modulated the strength of dorsal anterior insula networks. Findings were robust across both recruitment sites, suggesting generalizability. At the level of structural network embedding, our study provides a double dissociation between empathy and mentalizing. Moreover, our findings suggest that structural substrates of higher-order social cognition are reflected rather in interregional networks than in the local anatomical makeup of specific regions per se. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  3. Feasibility of large-scale screening using N-ERC/mesothelin levels in the blood for the early diagnosis of malignant mesothelioma.

    Science.gov (United States)

    Imashimizu, Kohta; Shiomi, Kazu; Maeda, Masahiro; Aoki, Naoko; Igarashi, Kiyoko; Suzuki, Fumio; Koizumi, Mitsuru; Suzuki, Kenji; Hino, Okio

    2011-05-01

    A large-scale screening involving the measurement of N-ERC/mesothelin levels in blood using an ELISA system for the early diagnosis of malignant mesothelioma (MM) was carried out in individuals with a history of employment at construction sites. Approximately 30,000 subjects were screened. Of the 80 subjects with high-risk values, one male patient was diagnosed as having MM based on a PET study and histopathology. This is the first report of the pre-clinical diagnosis of MM based on blood test screening. In addition, plasma levels of N-ERC/mesothelin may be effectively used for monitoring relapse after surgery.

  4. Techniques for Large-Scale Bacterial Genome Manipulation and Characterization of the Mutants with Respect to In Silico Metabolic Reconstructions.

    Science.gov (United States)

    diCenzo, George C; Finan, Turlough M

    2018-01-01

    The rate at which all genes within a bacterial genome can be identified far exceeds the ability to characterize these genes. To assist in associating genes with cellular functions, a large-scale bacterial genome deletion approach can be employed to rapidly screen tens to thousands of genes for desired phenotypes. Here, we provide a detailed protocol for the generation of deletions of large segments of bacterial genomes that relies on the activity of a site-specific recombinase. In this procedure, two recombinase recognition target sequences are introduced into known positions of a bacterial genome through single cross-over plasmid integration. Subsequent expression of the site-specific recombinase mediates recombination between the two target sequences, resulting in the excision of the intervening region and its loss from the genome. We further illustrate how this deletion system can be readily adapted to function as a large-scale in vivo cloning procedure, in which the region excised from the genome is captured as a replicative plasmid. We next provide a procedure for the metabolic analysis of bacterial large-scale genome deletion mutants using the Biolog Phenotype MicroArray™ system. Finally, a pipeline is described, and a sample Matlab script is provided, for the integration of the obtained data with a draft metabolic reconstruction for the refinement of the reactions and gene-protein-reaction relationships in a metabolic reconstruction.
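
    The reconciliation step above, comparing Biolog Phenotype MicroArray growth calls against predictions from a draft metabolic reconstruction, can be sketched as follows. This is a minimal Python sketch of the underlying bookkeeping, not the chapter's sample Matlab script; all substrate names and growth calls are illustrative:

```python
# Hedged sketch: reconcile Biolog Phenotype MicroArray growth calls with
# growth predictions from a draft metabolic reconstruction. Substrate names
# and calls below are illustrative, not taken from the protocol.

def reconcile(observed, predicted):
    """Bin each substrate into the four agreement classes used when
    refining reactions and gene-protein-reaction (GPR) rules.
    Keys: first letter = observed, second = predicted (G growth, N no growth)."""
    classes = {"GG": [], "NN": [], "GN": [], "NG": []}
    for substrate, obs in observed.items():
        pred = predicted.get(substrate)
        if pred is None:
            continue  # substrate not yet mapped to an exchange reaction
        key = ("G" if obs else "N") + ("G" if pred else "N")
        classes[key].append(substrate)
    return classes

observed = {"D-glucose": True, "sucrose": True, "D-xylose": False, "citrate": True}
predicted = {"D-glucose": True, "sucrose": False, "D-xylose": False, "citrate": True}

classes = reconcile(observed, predicted)
# "GN" cases (growth observed but not predicted) flag missing reactions or
# incomplete GPRs; "NG" cases flag reactions that may need pruning.
```

    The "growth observed but not predicted" class points to missing pathways or incomplete GPR rules, while the converse class flags reactions that the refinement step should reconsider.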

  5. Large-scale Topographical Screen for Investigation of Physical Neural-Guidance Cues

    Science.gov (United States)

    Li, Wei; Tang, Qing Yuan; Jadhav, Amol D.; Narang, Ankit; Qian, Wei Xian; Shi, Peng; Pang, Stella W.

    2015-03-01

    A combinatorial approach was used to present primary neurons with a large library of topographical features in the form of micropatterned substrates for high-throughput screening of physical neural-guidance cues that can effectively promote different aspects of neuronal development, including axon and dendritic outgrowth. Notably, the neuronal-guidance capability of specific features was automatically identified using customized image processing software, thus significantly increasing the screening throughput with minimal subjective bias. Our results indicate that anisotropic topographies promote axonal and, in some cases, dendritic extension relative to isotropic topographies, while dendritic branching showed a preference for plain substrates over the microscale features. The results from this work can be readily applied towards engineering novel biomaterials with precise surface topography that can serve as guidance conduits for neuro-regenerative applications. This novel topographical screening strategy combined with the automated processing capability can also be used for high-throughput screening of chemical or genetic regulatory factors in primary neurons.

  6. Identification of genes important for cutaneous function revealed by a large scale reverse genetic screen in the mouse.

    Directory of Open Access Journals (Sweden)

    Tia DiTommaso

    2014-10-01

    Full Text Available The skin is a highly regenerative organ which plays critical roles in protecting the body and sensing its environment. Consequently, morbidity and mortality associated with skin defects represent a significant health issue. To identify genes important in skin development and homeostasis, we have applied a high throughput, multi-parameter phenotype screen to the conditional targeted mutant mice generated by the Wellcome Trust Sanger Institute's Mouse Genetics Project (Sanger-MGP). A total of 562 different mouse lines were subjected to a variety of tests assessing cutaneous expression, macroscopic clinical disease, histological change, hair follicle cycling, and aberrant marker expression. Cutaneous lesions were associated with mutations in 23 different genes. Many of these were not previously associated with skin disease in the organ (Mysm1, Vangl1, Trpc4ap, Nom1, Sparc, Farp2, and Prkab1, while others were ascribed new cutaneous functions on the basis of the screening approach (Krt76, Lrig1, Myo5a, Nsun2, and Nf1. The integration of these skin specific screening protocols into the Sanger-MGP primary phenotyping pipelines marks the largest reported reverse genetic screen undertaken in any organ and defines approaches to maximise the productivity of future projects of this nature, while flagging genes for further characterisation.

  7. Plant phenomics and the need for physiological phenotyping across scales to narrow the genotype-to-phenotype knowledge gap

    DEFF Research Database (Denmark)

    Grosskinsky, Dominik Kilian; Svensgaard, Jesper; Christensen, Svend

    2015-01-01

    Plants are affected by complex genome×environment×management interactions which determine phenotypic plasticity as a result of the variability of genetic components. Whereas great advances have been made in the cost-efficient and high-throughput analyses of genetic information and non-invasive phenotyping, the large-scale analyses of the underlying physiological mechanisms lag behind. The external phenotype is determined by the sum of the complex interactions of metabolic pathways and intracellular regulatory networks that is reflected in an internal, physiological, and biochemical phenotype …, ultimately enabling the in silico assessment of responses under defined environments with advanced crop models. This will allow generation of robust physiological predictors also for complex traits to bridge the knowledge gap between genotype and phenotype for applications in breeding, precision farming …

  8. A Phenotypic Cell-Binding Screen Identifies a Novel Compound Targeting Triple-Negative Breast Cancer.

    Science.gov (United States)

    Chen, Luxi; Long, Chao; Youn, Jonghae; Lee, Jiyong

    2018-06-11

    We describe a "phenotypic cell-binding screen" by which therapeutic candidates targeting cancer cells of a particular phenotype can be isolated without knowledge of drug targets. Chemical library beads are incubated with cancer cells of the phenotype of interest in the presence of cancer cells lacking the phenotype of interest, and then the beads bound to only cancer cells of the phenotype of interest are selected as hits. We have applied this screening strategy in discovering a novel compound (LC129-8) targeting triple-negative breast cancer (TNBC). LC129-8 displayed highly specific binding to TNBC in cancer cell lines and patient-derived tumor tissues. LC129-8 exerted anti-TNBC activity by inducing apoptosis, inhibiting proliferation, reversing epithelial-mesenchymal transition, downregulating cancer stem cell activity, and blocking in vivo tumor growth.

  9. Database for High Throughput Screening Hits (dHITS): a simple tool to retrieve gene specific phenotypes from systematic screens done in yeast.

    Science.gov (United States)

    Chuartzman, Silvia G; Schuldiner, Maya

    2018-03-25

    In the last decade several collections of Saccharomyces cerevisiae yeast strains have been created. In these collections every gene is modified in a similar manner, such as by a deletion or the addition of a protein tag. Such libraries have enabled a diversity of systematic screens, giving rise to large amounts of information regarding gene functions. However, often papers describing such screens focus on a single gene or a small set of genes, and all other loci affecting the phenotype of choice ('hits') are only mentioned in tables that are provided as supplementary material and are often hard to retrieve or search. To help unify and make such data accessible, we have created a Database of High Throughput Screening Hits (dHITS). The dHITS database enables information to be obtained about screens in which genes of interest were found, as well as the other genes that came up in those screens - all in a readily accessible and downloadable format. The ability to query large lists of genes at the same time provides a platform to easily analyse hits obtained from transcriptional analyses or other screens. We hope that this platform will serve as a tool to facilitate investigation of protein functions for the yeast community. © 2018 The Authors Yeast Published by John Wiley & Sons Ltd.

  10. Using iterative cluster merging with improved gap statistics to perform online phenotype discovery in the context of high-throughput RNAi screens

    Directory of Open Access Journals (Sweden)

    Sun Youxian

    2008-06-01

    image-based datasets derived from a wide spectrum of experimental conditions and is suitable for adaptively processing new images which are continuously added to existing datasets. Validations were carried out on different datasets, including a published RNAi screen using Drosophila embryos [Additional files 1, 2], a dataset for cell cycle phase identification using HeLa cells [Additional files 1, 3, 4] and a synthetic dataset using polygons; our method tackled the three aforementioned tasks effectively with an accuracy range of 85%–90%. When our method is implemented in the context of a Drosophila genome-scale RNAi image-based screen of cultured cells aimed at identifying the contribution of individual genes towards the regulation of cell shape, it efficiently discovers meaningful new phenotypes and provides novel biological insight. We also propose a two-step procedure to modify the novelty detection method based on a one-class SVM so that it can be used for online phenotype discovery. Under different conditions, we compared the SVM-based method with our method using various datasets, and our method consistently outperformed the SVM-based method in at least two of three tasks by 2% to 5%. These results demonstrate that our methods can be used to better identify novel phenotypes in image-based datasets from a wide range of conditions and organisms. Conclusion We demonstrate that our method can detect various novel phenotypes effectively in complex datasets. Experimental results also validate that our method performs consistently under different orders of image input, variations in starting conditions including the number and composition of existing phenotypes, and datasets from different screens. In our findings, the proposed method is suitable for online phenotype discovery in diverse high-throughput image-based genetic and chemical screens.
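
    The gap statistic at the heart of the cluster-number selection can be sketched with a minimal NumPy implementation. A plain Lloyd k-means stands in for the authors' clustering step, and the two-blob data and parameters are illustrative; this is not the paper's code:

```python
import numpy as np

def kmeans(X, k, rng, n_iter=50):
    """Plain Lloyd k-means with random initialization (a stand-in for the
    clustering step of the merging procedure)."""
    centers = X[rng.choice(len(X), size=k, replace=False)]
    labels = np.zeros(len(X), dtype=int)
    for _ in range(n_iter):
        labels = np.argmin(((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

def log_wk(X, labels, centers):
    """Log of the pooled within-cluster dispersion W_k."""
    return np.log(((X - centers[labels]) ** 2).sum())

def gap_statistic(X, k_max=5, n_ref=10, seed=0):
    """Gap(k) = E[log W_k(reference)] - log W_k(data), with the reference
    datasets drawn uniformly over the bounding box of X."""
    rng = np.random.default_rng(seed)
    lo, hi = X.min(axis=0), X.max(axis=0)
    gaps = []
    for k in range(1, k_max + 1):
        labels, centers = kmeans(X, k, rng)
        ref_logs = []
        for _ in range(n_ref):
            R = rng.uniform(lo, hi, size=X.shape)
            r_labels, r_centers = kmeans(R, k, rng)
            ref_logs.append(log_wk(R, r_labels, r_centers))
        gaps.append(np.mean(ref_logs) - log_wk(X, labels, centers))
    return np.array(gaps)

# Two well-separated toy "phenotype" clusters in a 2-D feature space.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.2, size=(60, 2)),
               rng.normal(6.0, 0.2, size=(60, 2))])
gaps = gap_statistic(X)
k_hat = int(np.argmax(gaps)) + 1  # estimated number of clusters
```

    For well-separated clusters, Gap(k) rises sharply at the true cluster number and flattens beyond it, which is what makes it usable for deciding when to stop merging or splitting phenotype clusters.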

  11. Tumor phenotype and breast density in distinct categories of interval cancer: results of population-based mammography screening in Spain.

    Science.gov (United States)

    Domingo, Laia; Salas, Dolores; Zubizarreta, Raquel; Baré, Marisa; Sarriugarte, Garbiñe; Barata, Teresa; Ibáñez, Josefa; Blanch, Jordi; Puig-Vives, Montserrat; Fernández, Ana; Castells, Xavier; Sala, Maria

    2014-01-10

    Interval cancers are tumors arising after a negative screening episode and before the next screening invitation. They can be classified into true interval cancers, false-negatives, minimal-sign cancers, and occult tumors based on mammographic findings in screening and diagnostic mammograms. This study aimed to describe tumor-related characteristics and the association of breast density and tumor phenotype within four interval cancer categories. We included 2,245 invasive tumors (1,297 screening-detected and 948 interval cancers) diagnosed from 2000 to 2009 among 645,764 women aged 45 to 69 who underwent biennial screening in Spain. Interval cancers were classified by a semi-informed retrospective review into true interval cancers (n = 455), false-negatives (n = 224), minimal-sign (n = 166), and occult tumors (n = 103). Breast density was evaluated using Boyd's scale and conflated into density categories, the highest being >75%. Tumor-related information was obtained from cancer registries and clinical records. Tumor phenotype was defined as follows: luminal A: ER+/HER2- or PR+/HER2-; luminal B: ER+/HER2+ or PR+/HER2+; HER2: ER-/PR-/HER2+; triple-negative: ER-/PR-/HER2-. The association of tumor phenotype and breast density was assessed using a multinomial logistic regression model. Adjusted odds ratios (OR) and 95% confidence intervals (95% CI) were calculated. All statistical tests were two-sided. Forty-eight percent of interval cancers were true interval cancers and 23.6% false-negatives. True interval cancers were associated with HER2 and triple-negative phenotypes (OR = 1.91 (95% CI:1.22-2.96) and OR = 2.07 (95% CI:1.42-3.01), respectively) and extremely dense breasts (>75%) (OR = 1.67 (95% CI:1.08-2.56)). However, among true interval cancers a higher proportion of triple-negative tumors was observed in predominantly fatty breasts than in denser breasts (28.7%, 21.4%, 11.3% and 14.3%, respectively). Extreme breast density was strongly associated with occult tumors.

  13. Study on the millimeter-wave scale absorber based on the Salisbury screen

    Science.gov (United States)

    Yuan, Liming; Dai, Fei; Xu, Yonggang; Zhang, Yuan

    2018-03-01

    To address the problem of designing a millimeter-wave scale absorber, a Salisbury screen absorber is employed and designed on the basis of its reflection loss (RL). By optimizing parameters including the sheet resistance of the resistive surface layer and the permittivity and thickness of the grounded dielectric layer, the RL of the Salisbury screen absorber can be made to match that of the theoretical scale absorber. An example is given to verify the effectiveness of the method, in which a Salisbury screen absorber is designed by the proposed method and compared with the theoretical scale absorber. In addition, plate models and tri-corner reflector (TCR) models are constructed according to the design result and their scattering properties are simulated in FEKO. Results reveal that the deviation between the designed Salisbury screen absorber and the theoretical scale absorber falls within the tolerance of radar cross-section (RCS) measurement. The work in this paper has important theoretical and practical significance for electromagnetic measurements at large scale ratios.
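
    The classical relation behind a Salisbury screen can be sketched numerically: at normal incidence the absorber is a resistive sheet in parallel with a shorted transmission line formed by the grounded spacer, and reflection nearly vanishes at the quarter-wave frequency when the sheet resistance equals the free-space impedance. The sketch below assumes normal incidence and illustrative frequency and spacer values; it is not the paper's optimization procedure:

```python
import math

ETA0 = 376.730  # free-space wave impedance, ohms
C0 = 3.0e8      # speed of light, m/s

def salisbury_gamma(freq_hz, rs_ohm, eps_r, d_m):
    """Normal-incidence reflection coefficient of a Salisbury screen:
    a resistive sheet (sheet resistance rs_ohm) over a grounded dielectric
    spacer (relative permittivity eps_r, thickness d_m)."""
    eta_d = ETA0 / math.sqrt(eps_r)                       # spacer impedance
    beta = 2.0 * math.pi * freq_hz * math.sqrt(eps_r) / C0
    z_short = complex(0.0, eta_d * math.tan(beta * d_m))  # shorted spacer line
    z_in = (rs_ohm * z_short) / (rs_ohm + z_short)        # sheet in parallel
    return (z_in - ETA0) / (z_in + ETA0)

def reflection_loss_db(gamma):
    """Reflection loss in dB (more negative = better absorption)."""
    return 20.0 * math.log10(abs(gamma))

# Quarter-wave air spacer tuned to 35 GHz with a 376.7-ohm sheet: the input
# impedance then matches free space and the reflection nearly vanishes.
f0 = 35e9
d = C0 / f0 / 4.0
gamma0 = salisbury_gamma(f0, ETA0, 1.0, d)
```

    Away from the design frequency the shorted line detunes the match; at twice the design frequency the spacer is a half-wavelength thick and the screen reflects almost totally, which is the narrow-band behavior a scale-absorber design has to account for.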

  14. A large-scale RNA interference screen identifies genes that regulate autophagy at different stages.

    Science.gov (United States)

    Guo, Sujuan; Pridham, Kevin J; Virbasius, Ching-Man; He, Bin; Zhang, Liqing; Varmark, Hanne; Green, Michael R; Sheng, Zhi

    2018-02-12

    Dysregulated autophagy is central to the pathogenesis and therapeutic development of cancer. However, how autophagy is regulated in cancer is not well understood and genes that modulate cancer autophagy are not fully defined. To gain more insight into autophagy regulation in cancer, we performed a large-scale RNA interference screen in K562 human chronic myeloid leukemia cells using monodansylcadaverine staining, an autophagy-detecting approach equivalent to immunoblotting of the autophagy marker LC3B or fluorescence microscopy of GFP-LC3B. By coupling monodansylcadaverine staining with fluorescence-activated cell sorting, we successfully isolated autophagic K562 cells, in which we identified 336 short hairpin RNAs. After candidate validation using Cyto-ID fluorescence spectrophotometry, LC3B immunoblotting, and quantitative RT-PCR, 82 genes were identified as autophagy-regulating genes. Twenty of these genes had been reported previously, and the remaining 62 candidates are novel autophagy mediators. Bioinformatic analyses revealed that most candidate genes were involved in molecular pathways regulating autophagy, rather than directly participating in the autophagy process. Further autophagy flux assays revealed that 57 autophagy-regulating genes suppressed autophagy initiation, whereas 21 candidates promoted autophagy maturation. Our RNA interference screen thus identified genes that regulate autophagy at different stages, which helps decode autophagy regulation in cancer and offers novel avenues for developing autophagy-related therapies for cancer.

  15. A broad phenotypic screen identifies novel phenotypes driven by a single mutant allele in Huntington's disease CAG knock-in mice.

    Directory of Open Access Journals (Sweden)

    Sabine M Hölter

    Full Text Available Huntington's disease (HD) is an autosomal dominant neurodegenerative disorder caused by the expansion of a CAG trinucleotide repeat in the HTT gene encoding huntingtin. The disease has an insidious course, typically progressing over 10-15 years until death. Currently there is no effective disease-modifying therapy. To better understand the HD pathogenic process we have developed genetic HTT CAG knock-in mouse models that accurately recapitulate the HD mutation in man. Here, we describe results of a broad, standardized phenotypic screen in 10- to 46-week-old heterozygous HdhQ111 knock-in mice, probing a wide range of physiological systems. The results of this screen revealed a number of behavioral abnormalities in HdhQ111/+ mice that include hypoactivity, decreased anxiety, motor learning and coordination deficits, and impaired olfactory discrimination. The screen also provided evidence supporting subtle cardiovascular, lung, and plasma metabolite alterations. Importantly, our results reveal that a single mutant HTT allele in the mouse is sufficient to elicit multiple phenotypic abnormalities, consistent with a dominant disease process in patients. These data provide a starting point for further investigation of several organ systems in HD, for the dissection of underlying pathogenic mechanisms and for the identification of reliable phenotypic endpoints for therapeutic testing.

  16. Identification of small molecule inhibitors of Pseudomonas aeruginosa exoenzyme S using a yeast phenotypic screen.

    Directory of Open Access Journals (Sweden)

    Anthony Arnoldo

    2008-02-01

    Full Text Available Pseudomonas aeruginosa is an opportunistic human pathogen that is a key factor in the mortality of cystic fibrosis patients, and infection represents an increased threat for human health worldwide. Because resistance of Pseudomonas aeruginosa to antibiotics is increasing, new inhibitors of pharmacologically validated targets of this bacterium are needed. Here we demonstrate that a cell-based yeast phenotypic assay, combined with a large-scale inhibitor screen, identified small molecule inhibitors that can suppress the toxicity caused by heterologous expression of selected Pseudomonas aeruginosa ORFs. We identified the first small molecule inhibitor of Exoenzyme S (ExoS), a toxin involved in Type III secretion. We show that this inhibitor, exosin, modulates ExoS ADP-ribosyltransferase activity in vitro, suggesting the inhibition is direct. Moreover, exosin and two of its analogues display a significant protective effect against Pseudomonas infection in vivo. Furthermore, because the assay was performed in yeast, we were able to demonstrate that several yeast homologues of the known human ExoS targets are likely ADP-ribosylated by the toxin. For example, using an in vitro enzymatic assay, we demonstrate that yeast Ras2p is directly modified by ExoS. Lastly, by surveying a collection of yeast deletion mutants, we identified Bmh1p, a yeast homologue of the human FAS, as an ExoS cofactor, revealing that portions of the bacterial toxin mode of action are conserved from yeast to human. Taken together, our integrated cell-based, chemical-genetic approach demonstrates that such screens can augment traditional drug screening approaches and facilitate the discovery of new compounds against a broad range of human pathogens.

  17. HTS-DB: an online resource to publish and query data from functional genomics high-throughput siRNA screening projects.

    Science.gov (United States)

    Saunders, Rebecca E; Instrell, Rachael; Rispoli, Rossella; Jiang, Ming; Howell, Michael

    2013-01-01

    High-throughput screening (HTS) uses technologies such as RNA interference to generate loss-of-function phenotypes on a genomic scale. As these technologies become more popular, many research institutes have established core facilities of expertise to deal with the challenges of large-scale HTS experiments. As the efforts of core facility screening projects come to fruition, focus has shifted towards managing the results of these experiments and making them available in a useful format that can be further mined for phenotypic discovery. The HTS-DB database provides a public view of data from screening projects undertaken by the HTS core facility at the CRUK London Research Institute. All projects and screens are described with comprehensive assay protocols, and datasets are provided with complete descriptions of analysis techniques. This format allows users to browse and search data from large-scale studies in an informative and intuitive way. It also provides a repository for additional measurements obtained from screens that were not the focus of the project, such as cell viability, and groups these data so that it can provide a gene-centric summary across several different cell lines and conditions. All datasets from our screens that can be made available can be viewed interactively and mined for further hit lists. We believe that in this format, the database provides researchers with rapid access to results of large-scale experiments that might facilitate their understanding of genes/compounds identified in their own research. DATABASE URL: http://hts.cancerresearchuk.org/db/public.

  18. Digital radiography with a large-scale electronic flat-panel detector vs screen-film radiography: observer preference in clinical skeletal diagnostics

    International Nuclear Information System (INIS)

    Hamers, S.; Freyschmidt, J.; Neitzel, U.

    2001-01-01

    The imaging performance of a recently developed digital flat-panel detector system was compared with conventional screen-film imaging in an observer preference study. In total, 34 image pairs of various regions of the skeleton were obtained in 24 patients; 30 image pairs were included in the study. The conventional images were acquired with 250- and 400-speed screen-film combinations, using the standard technique of our department. Within hours, the digital images were obtained using identical exposure parameters. The digital system employed a large-area (43 x 43 cm) flat-panel detector based on amorphous silicon (Trixell Pixium 4600), integrated in a Bucky table. Six radiologists independently evaluated the image pairs with respect to image latitude, soft tissue rendition, rendition of the periosteal and endosteal border of cortical bone, rendition of cancellous bone and the visibility of potentially present pathological changes, using a subjective five-point scale. The digital images were rated significantly (p=0.001) better than the screen-film images with respect to soft tissue rendition and image latitude. Also the rendition of the cancellous bone and the periosteal and endosteal border of the cortical bone was rated significantly (p=0.05) better for the flat-panel detector. The visibility of pathological lesions was equivalent; only large-area sclerotic lesions (n=2) were seen superiorly on screen-film images. The new digital flat-panel detector based on amorphous silicon appears to be at least equivalent to conventional screen-film combinations for skeletal examinations, and in most respects even superior. (orig.)

  19. GenomeRNAi: a database for cell-based RNAi phenotypes.

    Science.gov (United States)

    Horn, Thomas; Arziman, Zeynep; Berger, Juerg; Boutros, Michael

    2007-01-01

    RNA interference (RNAi) has emerged as a powerful tool to generate loss-of-function phenotypes in a variety of organisms. Combined with the sequence information of almost completely annotated genomes, RNAi technologies have opened new avenues to conduct systematic genetic screens for every annotated gene in the genome. As increasingly large datasets of RNAi-induced phenotypes become available, an important challenge remains the systematic integration and annotation of functional information. Genome-wide RNAi screens have been performed in both Caenorhabditis elegans and Drosophila for a variety of phenotypes, and several RNAi libraries have become available to assess phenotypes for almost every gene in the genome. These screens were performed using different types of assays, from visible phenotypes to focused transcriptional readouts, and provide a rich data source for functional annotation across different species. The GenomeRNAi database provides access to published RNAi phenotypes obtained from cell-based screens and maps them to their genomic locus, including possible non-specific regions. The database also gives access to sequence information of RNAi probes used in various screens. It can be searched by phenotype, by gene, by RNAi probe or by sequence and is accessible at http://rnai.dkfz.de.

  20. Large Scale Screening of Low Cost Ferritic Steel Designs For Advanced Ultra Supercritical Boiler Using First Principles Methods

    Energy Technology Data Exchange (ETDEWEB)

    Ouyang, Lizhi [Tennessee State Univ. Nashville, TN (United States)

    2016-11-29

    Advanced ultra-supercritical (AUSC) boilers require materials that can operate in a corrosive environment at temperatures and pressures as high as 760°C (1400°F) and 5,000 psi, respectively, while maintaining good ductility at low temperature. We developed automated simulation software tools to enable fast, large-scale screening studies of candidate designs. While direct evaluation of creep rupture strength and ductility is currently not feasible, properties such as energy, elastic constants, surface energy, interface energy, and stacking-fault energy can be used to assess relative ductility and creep strength. We implemented software to automate the complex calculations and minimize human input in the tedious screening studies, which involve model structure generation, settings for first-principles calculations, and results analysis and reporting. The software developed in the project, and the library of computed mechanical properties of phases found in ferritic steels (many of them complex solid solutions estimated for the first time), will help the development of low-cost ferritic steels for AUSC.
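
    One elastic-constant-based criterion for ranking relative ductility, of the kind the abstract alludes to, is Pugh's ratio of Voigt-averaged bulk to shear modulus, with B/G above roughly 1.75 taken as an empirical indicator of ductile behavior. The sketch below uses illustrative cubic elastic constants (roughly bcc-Fe-like), not results from this project:

```python
# Hedged sketch: screen candidate phases for relative ductility using
# Pugh's ratio B/G computed from Voigt-averaged elastic moduli.
# Elastic constants below are illustrative, not computed results.

def voigt_moduli_cubic(c11, c12, c44):
    """Voigt-average bulk and shear moduli for a cubic crystal
    (inputs and outputs in GPa)."""
    bulk = (c11 + 2.0 * c12) / 3.0
    shear = (c11 - c12 + 3.0 * c44) / 5.0
    return bulk, shear

def pugh_ratio(c11, c12, c44):
    """B/G: larger values empirically correlate with ductility."""
    bulk, shear = voigt_moduli_cubic(c11, c12, c44)
    return bulk / shear

candidates = {
    "phase_A": (243.0, 138.0, 122.0),  # softer, roughly bcc-Fe-like constants
    "phase_B": (500.0, 160.0, 180.0),  # stiffer, hypothetical brittle phase
}
# Rank candidate phases from most to least ductile by Pugh's ratio.
ranked = sorted(candidates, key=lambda p: pugh_ratio(*candidates[p]), reverse=True)
```

    In a screening pipeline this kind of cheap descriptor lets thousands of first-principles results be triaged before any expensive creep or ductility modeling is attempted.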

  1. Phenotypic Screening Identifies Modulators of Amyloid Precursor Protein Processing in Human Stem Cell Models of Alzheimer’s Disease

    Directory of Open Access Journals (Sweden)

    Philip W. Brownjohn

    2017-04-01

    Full Text Available Summary: Human stem cell models have the potential to provide platforms for phenotypic screens to identify candidate treatments and cellular pathways involved in the pathogenesis of neurodegenerative disorders. Amyloid precursor protein (APP) processing and the accumulation of APP-derived amyloid β (Aβ) peptides are key processes in Alzheimer's disease (AD). We designed a phenotypic small-molecule screen to identify modulators of APP processing in trisomy 21/Down syndrome neurons, a complex genetic model of AD. We identified the avermectins, commonly used as anthelmintics, as compounds that increase the relative production of short Aβ peptides at the expense of longer, potentially more toxic peptides. Further studies demonstrated that this effect is not due to an interaction with the core γ-secretase responsible for Aβ production. This study demonstrates the feasibility of phenotypic drug screening in human stem cell models of Alzheimer-type dementia, and points to possibilities for indirectly modulating APP processing, independently of γ-secretase modulation. In this article, Livesey and colleagues perform a phenotypic drug screen in a human stem cell model of Alzheimer's disease. The anthelminthic avermectins are identified as a family of compounds that increase the production of short Aβ peptides over longer, more toxic Aβ forms. The effect is analogous to existing γ-secretase modulators, but is independent of the core γ-secretase complex. Keywords: neural stem cells, Alzheimer's disease, phenotypic screening, iPSCs, human neurons, dementia, Down syndrome, amyloid beta, ivermectin, selamectin

  2. Advanced Cell Classifier: User-Friendly Machine-Learning-Based Software for Discovering Phenotypes in High-Content Imaging Data.

    Science.gov (United States)

    Piccinini, Filippo; Balassa, Tamas; Szkalisity, Abel; Molnar, Csaba; Paavolainen, Lassi; Kujala, Kaisa; Buzas, Krisztina; Sarazova, Marie; Pietiainen, Vilja; Kutay, Ulrike; Smith, Kevin; Horvath, Peter

    2017-06-28

    High-content, imaging-based screens now routinely generate data on a scale that precludes manual verification and interrogation. Software applying machine learning has become an essential tool to automate analysis, but these methods require annotated examples to learn from. Efficiently exploring large datasets to find relevant examples remains a challenging bottleneck. Here, we present Advanced Cell Classifier (ACC), a graphical software package for phenotypic analysis that addresses these difficulties. ACC applies machine-learning and image-analysis methods to high-content data generated by large-scale, cell-based experiments. It features methods to mine microscopic image data, discover new phenotypes, and improve recognition performance. We demonstrate that these features substantially expedite the training process, successfully uncover rare phenotypes, and improve the accuracy of the analysis. ACC is extensively documented, designed to be user-friendly for researchers without machine-learning expertise, and distributed as a free open-source tool at www.cellclassifier.org. Copyright © 2017 Elsevier Inc. All rights reserved.
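    Tools like ACC reduce annotation effort by pointing the expert at the most informative unlabeled cells. A minimal sketch of that kind of uncertainty-driven example selection is shown below; the feature values, the nearest-centroid classifier, and the margin-based selection rule are all illustrative assumptions, not ACC's actual implementation:

```python
import numpy as np

def nearest_centroid_distances(centroids, cells):
    """Distance of each cell (row) to each class centroid (column)."""
    return np.linalg.norm(cells[:, None, :] - centroids[None, :, :], axis=2)

def most_uncertain(centroids, unlabeled, k=2):
    """Rank unlabeled cells by the margin between the two closest class
    centroids: a small margin means an ambiguous phenotype, so that cell
    is the most informative one to show the annotator next."""
    d = nearest_centroid_distances(centroids, unlabeled)
    d_sorted = np.sort(d, axis=1)
    margin = d_sorted[:, 1] - d_sorted[:, 0]
    return np.argsort(margin)[:k]

# Two toy phenotype classes in a 2-D feature space (e.g. cell area, intensity)
centroids = np.array([[0.0, 0.0], [10.0, 10.0]])
unlabeled = np.array([[5.1, 4.9],    # ambiguous, near the decision boundary
                      [0.2, 0.1],    # clearly class 0
                      [9.8, 10.1]])  # clearly class 1
print(most_uncertain(centroids, unlabeled, k=1))  # the ambiguous cell first
```

In a real screen the centroids would be replaced by whatever classifier is being trained, and the loop repeats: annotate the selected cells, retrain, select again.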

  3. Large-scale functional RNAi screen in C. elegans identifies genes that regulate the dysfunction of mutant polyglutamine neurons.

    Science.gov (United States)

    Lejeune, François-Xavier; Mesrob, Lilia; Parmentier, Frédéric; Bicep, Cedric; Vazquez-Manrique, Rafael P; Parker, J Alex; Vert, Jean-Philippe; Tourette, Cendrine; Neri, Christian

    2012-03-13

    A central goal in Huntington's disease (HD) research is to identify and prioritize candidate targets for neuroprotective intervention, which requires genome-scale information on the modifiers of early-stage neuron injury in HD. Here, we performed a large-scale RNA interference screen in C. elegans strains that express N-terminal huntingtin (htt) in touch receptor neurons. These neurons control the response to light touch. Their function is strongly impaired by expanded polyglutamines (128Q), as shown by the nearly complete loss of touch response in adult animals, providing an in vivo model in which to manipulate the early phases of expanded-polyQ neurotoxicity. In total, 6034 genes were examined, revealing 662 gene inactivations that either reduce or aggravate the defective touch response in 128Q animals. Several genes were previously implicated in HD or neurodegenerative disease, suggesting that this screen has effectively identified candidate targets for HD. Network-based analysis emphasized a subset of high-confidence modifier genes in pathways of interest in HD, including metabolic, neurodevelopmental and pro-survival pathways. Finally, 49 modifiers of 128Q-neuron dysfunction that are dysregulated in the striatum of either R6/2 or CHL2 HD mice, or both, were identified. Collectively, these results highlight the relevance of this screen to HD pathogenesis and provide novel information on potential therapeutic targets for neuroprotection in HD. © 2012 Lejeune et al; licensee BioMed Central Ltd.

  4. Novel gene function revealed by mouse mutagenesis screens for models of age-related disease.

    Science.gov (United States)

    Potter, Paul K; Bowl, Michael R; Jeyarajan, Prashanthini; Wisby, Laura; Blease, Andrew; Goldsworthy, Michelle E; Simon, Michelle M; Greenaway, Simon; Michel, Vincent; Barnard, Alun; Aguilar, Carlos; Agnew, Thomas; Banks, Gareth; Blake, Andrew; Chessum, Lauren; Dorning, Joanne; Falcone, Sara; Goosey, Laurence; Harris, Shelley; Haynes, Andy; Heise, Ines; Hillier, Rosie; Hough, Tertius; Hoslin, Angela; Hutchison, Marie; King, Ruairidh; Kumar, Saumya; Lad, Heena V; Law, Gemma; MacLaren, Robert E; Morse, Susan; Nicol, Thomas; Parker, Andrew; Pickford, Karen; Sethi, Siddharth; Starbuck, Becky; Stelma, Femke; Cheeseman, Michael; Cross, Sally H; Foster, Russell G; Jackson, Ian J; Peirson, Stuart N; Thakker, Rajesh V; Vincent, Tonia; Scudamore, Cheryl; Wells, Sara; El-Amraoui, Aziz; Petit, Christine; Acevedo-Arozena, Abraham; Nolan, Patrick M; Cox, Roger; Mallon, Anne-Marie; Brown, Steve D M

    2016-08-18

    Determining the genetic bases of age-related disease remains a major challenge requiring a spectrum of approaches, from human and clinical genetics to the utilization of model organism studies. Here we report a large-scale genetic screen in mice employing a phenotype-driven discovery platform to identify mutations resulting in age-related disease, both late-onset and progressive. We utilized N-ethyl-N-nitrosourea mutagenesis to generate pedigrees of mutagenized mice that were subjected to recurrent screens for mutant phenotypes as the mice aged. In total, we identify 105 distinct mutant lines from 157 pedigrees analysed, of which 27 are late-onset phenotypes across a range of physiological systems. Using whole-genome sequencing we uncover the underlying genes for 44 of these mutant phenotypes, including 12 late-onset phenotypes. These genes reveal a number of novel pathways involved in age-related disease. We illustrate our findings with the recovery and characterization of a novel mouse model of age-related hearing loss.

  5. Screen-Space Normal Distribution Function Caching for Consistent Multi-Resolution Rendering of Large Particle Data

    KAUST Repository

    Ibrahim, Mohamed

    2017-08-28

    Molecular dynamics (MD) simulations are crucial to investigating important processes in physics and thermodynamics. The simulated atoms are usually visualized as hard spheres with Phong shading, where individual particles and their local density can be perceived well in close-up views. However, for large-scale simulations with 10 million particles or more, the visualization of large fields-of-view usually suffers from strong aliasing artifacts, because the mismatch between data size and output resolution leads to severe under-sampling of the geometry. Excessive super-sampling can alleviate this problem, but is prohibitively expensive. This paper presents a novel visualization method for large-scale particle data that addresses aliasing while enabling interactive high-quality rendering. We introduce the novel concept of screen-space normal distribution functions (S-NDFs) for particle data. S-NDFs represent the distribution of surface normals that map to a given pixel in screen space, which enables high-quality re-lighting without re-rendering particles. In order to facilitate interactive zooming, we cache S-NDFs in a screen-space mipmap (S-MIP). Together, these two concepts enable interactive, scale-consistent re-lighting and shading changes, as well as zooming, without having to re-sample the particle data. We show how our method facilitates the interactive exploration of real-world large-scale MD simulation data in different scenarios.
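    The S-NDF idea, caching per screen pixel a distribution of the surface normals that project there and then shading from the cached distribution, can be sketched in a drastically simplified 2D form. The bin layout and the diffuse shading model below are my own assumptions for illustration; the paper's S-NDFs are full 3D distributions cached on the GPU:

```python
import numpy as np

def bin_normals(normals, n_bins=8):
    """Quantize 2-D unit normals that landed in one pixel into a coarse
    azimuthal histogram: a crude per-pixel NDF cache."""
    az = np.arctan2(normals[:, 1], normals[:, 0])          # azimuth in [-pi, pi]
    idx = ((az + np.pi) / (2 * np.pi) * n_bins).astype(int) % n_bins
    return np.bincount(idx, minlength=n_bins) / len(normals)

def relight(hist, light_dir, n_bins=8):
    """Diffuse shading integrated over the cached NDF instead of
    re-shading every particle: sum over bins of weight * max(0, n . l)."""
    centers = -np.pi + (np.arange(n_bins) + 0.5) * 2 * np.pi / n_bins
    bin_dirs = np.stack([np.cos(centers), np.sin(centers)], axis=1)
    ndotl = np.clip(bin_dirs @ light_dir, 0.0, None)
    return float(hist @ ndotl)

# 100 particles whose normals all face +x, lit from +x and then from -x
hist = bin_normals(np.tile([[1.0, 0.0]], (100, 1)))
print(relight(hist, np.array([1.0, 0.0])),   # bright: normals face the light
      relight(hist, np.array([-1.0, 0.0])))  # dark: light is behind
```

The point of the cache is visible even in this toy: once `hist` is built, changing the light direction never touches the 100 particles again.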
  7. Large scale electrolysers

    International Nuclear Information System (INIS)

    B Bello; M Junker

    2006-01-01

    Hydrogen production by water electrolysis represents nearly 4% of world hydrogen production. Future development of hydrogen vehicles will require large quantities of hydrogen, so large-scale hydrogen production plants will be needed. In this context, the development of low-cost, large-scale electrolysers that can run on 'clean power' seems necessary. ALPHEA HYDROGEN, a European network and centre of expertise on hydrogen and fuel cells, performed a study for its members in 2005 to evaluate the potential of large-scale electrolysers for future hydrogen production. The different electrolysis technologies were compared, and a state-of-the-art survey of currently available electrolysis modules was compiled. Large-scale electrolysis plants installed around the world were reviewed, and the main projects related to large-scale electrolysis were listed. The economics of large-scale electrolysers were discussed, and the influence of energy prices on the cost of hydrogen produced by large-scale electrolysis was evaluated. (authors)

  8. Identifiability in N-mixture models: a large-scale screening test with bird data.

    Science.gov (United States)

    Kéry, Marc

    2018-02-01

    Binomial N-mixture models have proven very useful in ecology, conservation, and monitoring: they allow estimation and modeling of abundance separately from detection probability using simple counts. Recently, doubts about parameter identifiability have been voiced. I conducted a large-scale screening test with 137 bird data sets from 2,037 sites. I found virtually no identifiability problems for Poisson and zero-inflated Poisson (ZIP) binomial N-mixture models, but negative-binomial (NB) models had problems in 25% of all data sets. The corresponding multinomial N-mixture models had no problems. Parameter estimates under Poisson and ZIP binomial and multinomial N-mixture models were extremely similar. Identifiability problems became a little more frequent with smaller sample sizes (267 and 50 sites), but were unaffected by whether the models did or did not include covariates. Hence, binomial N-mixture model parameters with Poisson and ZIP mixtures typically appeared identifiable. In contrast, NB mixtures were often unidentifiable, which is worrying since these were often selected by Akaike's information criterion. Identifiability of binomial N-mixture models should always be checked. If problems are found, simpler models, integrated models that combine different observation models or the use of external information via informative priors or penalized likelihoods, may help. © 2017 by the Ecological Society of America.
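    The binomial N-mixture likelihood at the heart of this screening test marginalizes over the unobserved true abundance N at each site. A minimal sketch of that likelihood and a coarse grid-search fit is given below, truncating the infinite sum over N at K; the data, grid, and truncation point are illustrative, not the paper's setup:

```python
import math

def nmix_loglik(counts, lam, p, K=50):
    """Log-likelihood of a Poisson binomial N-mixture model.
    counts: list of per-site lists of repeated counts y_it.
    For each site i:  P(y_i) = sum_N Pois(N; lam) * prod_t Binom(y_it; N, p)."""
    ll = 0.0
    for site in counts:
        site_lik = 0.0
        for N in range(max(site), K + 1):           # N must cover the max count
            pois = math.exp(-lam) * lam**N / math.factorial(N)
            binom = 1.0
            for y in site:
                binom *= math.comb(N, y) * p**y * (1 - p)**(N - y)
            site_lik += pois * binom
        ll += math.log(site_lik)
    return ll

# Toy fit: coarse grid search over (lambda, p) for four sites of three visits
counts = [[2, 3, 2], [1, 2, 2], [4, 3, 5], [0, 1, 1]]
grid = [(lam / 2, p / 10) for lam in range(1, 21) for p in range(1, 10)]
best = max(grid, key=lambda th: nmix_loglik(counts, th[0], th[1]))
print(best)  # maximum-likelihood (lambda, p) on the grid
```

Identifiability problems of the kind the paper reports show up in exactly this surface: for the negative-binomial mixture the likelihood can stay nearly flat as lambda grows and p shrinks together.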

  9. Phenotypic screening of hepatocyte nuclear factor (HNF) 4-γ receptor knockout mice

    International Nuclear Information System (INIS)

    Gerdin, Anna Karin; Surve, Vikas V.; Joensson, Marie; Bjursell, Mikael; Bjoerkman, Maria; Edenro, Anne; Schuelke, Meint; Saad, Alaa; Bjurstroem, Sivert; Lundgren, Elisabeth Jensen; Snaith, Michael; Fransson-Steen, Ronny; Toernell, Jan; Berg, Anna-Lena; Bohlooly-Y, Mohammad

    2006-01-01

    Using the mouse as a model organism in pharmaceutical research presents unique advantages: its physiology in many ways resembles human physiology, it has a relatively short generation time and low breeding and maintenance costs, and it is available in a wide variety of inbred strains. The ability to genetically modify mouse embryonic stem cells to generate mouse models that better mimic human disease is another advantage. In the present study, a comprehensive phenotypic screening protocol is applied to elucidate the phenotype of a novel mouse knockout model of hepatocyte nuclear factor (HNF) 4-γ. HNF4-γ is expressed in the kidneys, gut, pancreas, and testis. The first level of the screen is aimed at general health, morphologic appearance, normal cage behaviour, and gross neurological functions. The second level looks at metabolic characteristics and lung function. The third level investigates behaviour in more depth, and the fourth level consists of a thorough pathological characterisation together with blood chemistry, haematology, and bone marrow analysis. When compared with littermate wild-type mice (HNF4-γ +/+ ), the HNF4-γ knockout (HNF4-γ -/- ) mice had lower energy expenditure and locomotor activity during night time, resulting in a higher body weight despite reduced intake of food and water. HNF4-γ -/- mice were less inclined to build nests and spent more time in a passive state during the forced swim test.

  10. High-Throughput Screening Enhances Kidney Organoid Differentiation from Human Pluripotent Stem Cells and Enables Automated Multidimensional Phenotyping.

    Science.gov (United States)

    Czerniecki, Stefan M; Cruz, Nelly M; Harder, Jennifer L; Menon, Rajasree; Annis, James; Otto, Edgar A; Gulieva, Ramila E; Islas, Laura V; Kim, Yong Kyun; Tran, Linh M; Martins, Timothy J; Pippin, Jeffrey W; Fu, Hongxia; Kretzler, Matthias; Shankland, Stuart J; Himmelfarb, Jonathan; Moon, Randall T; Paragas, Neal; Freedman, Benjamin S

    2018-05-15

    Organoids derived from human pluripotent stem cells are a potentially powerful tool for high-throughput screening (HTS), but the complexity of organoid cultures poses a significant challenge for miniaturization and automation. Here, we present a fully automated, HTS-compatible platform for enhanced differentiation and phenotyping of human kidney organoids. The entire 21-day protocol, from plating to differentiation to analysis, can be performed automatically by liquid-handling robots, or alternatively by manual pipetting. High-content imaging analysis reveals both dose-dependent and threshold effects during organoid differentiation. Immunofluorescence and single-cell RNA sequencing identify previously undetected parietal, interstitial, and partially differentiated compartments within organoids and define conditions that greatly expand the vascular endothelium. Chemical modulation of toxicity and disease phenotypes can be quantified for safety and efficacy prediction. Screening in gene-edited organoids in this system reveals an unexpected role for myosin in polycystic kidney disease. Organoids in HTS formats thus establish an attractive platform for multidimensional phenotypic screening. Copyright © 2018 Elsevier Inc. All rights reserved.

  11. Role of optometry school in single day large scale school vision testing

    Science.gov (United States)

    Anuradha, N; Ramani, Krishnakumar

    2015-01-01

    Background: School vision testing aims at the identification and management of refractive errors. Large-scale school vision testing using conventional methods is time-consuming and demands a lot of chair time from eye care professionals. A new strategy involving a school of optometry in single day large scale school vision testing is discussed. Aim: The aim was to describe a new approach for performing vision testing of school children on a large scale in a single day. Materials and Methods: A single day vision testing strategy was implemented wherein 123 members (20 teams comprising optometry students and headed by optometrists) conducted vision testing for children in 51 schools. School vision testing included basic vision screening, refraction, frame measurements, frame choice and referrals for other ocular problems. Results: A total of 12448 children were screened, among whom 420 (3.37%) were identified to have refractive errors: 28 (1.26%) children belonged to the primary, 163 (9.80%) to the middle, 129 (4.67%) to the secondary and 100 (1.73%) to the higher secondary levels of education, respectively. 265 (2.12%) children were referred for further evaluation. Conclusion: Single day large scale school vision testing can be adopted by schools of optometry to reach a higher number of children within a short span. PMID:25709271

  12. Phenotype-Based Screening of Small Molecules to Modify Plant Cell Walls Using BY-2 Cells.

    Science.gov (United States)

    Okubo-Kurihara, Emiko; Matsui, Minami

    2018-01-01

    The plant cell wall is an important and abundant source of biomass with great potential for use as a modern recyclable resource. For effective utilization of this cellulosic biomass, the ability to degrade it efficiently is a key point. With the aim of modifying the cell wall to allow easy decomposition, we used chemical biology to alter its structure. As a first step toward evaluating the effects of chemicals on the cell wall, we employed a phenotype-based approach using high-throughput screening. As the plant cell wall is essential in determining cell morphology, phenotype-based screening is particularly effective in identifying compounds that bring about alterations in the cell wall. For rapid and reproducible screening, the tobacco BY-2 cell line is an excellent system in which to observe cell morphology. In this chapter, we provide a detailed chemical biology methodology for studying cell morphology using tobacco BY-2 cells.

  13. Decision aid on breast cancer screening reduces attendance rate: results of a large-scale, randomized, controlled study by the DECIDEO group

    Science.gov (United States)

    Bourmaud, Aurelie; Soler-Michel, Patricia; Oriol, Mathieu; Regnier, Véronique; Tinquaut, Fabien; Nourissat, Alice; Bremond, Alain; Moumjid, Nora; Chauvin, Franck

    2016-01-01

    Controversies regarding the benefits of breast cancer screening programs have led to the promotion of new strategies taking individual preferences into account, such as decision aids. The aim of this study was to assess the impact of a decision aid leaflet on the participation of women invited to take part in a national breast cancer screening program. This was a randomized, multicentre, controlled trial. Women aged 50 to 74 years were randomly assigned to receive either a decision aid or the usual invitation letter. The primary outcome was the participation rate 12 months after the invitation. 16 000 women were randomized and 15 844 included in the modified intention-to-treat analysis. The participation rate in the intervention group was 40.25% (3174/7885 women) compared with 42.13% (3353/7959) in the control group (p = 0.02). Previous attendance for screening (RR = 6.24; [95%CI: 5.75-6.77]; p < 0.0001) and medium household income (RR = 1.05; [95%CI: 1.01-1.09]; p = 0.0074) were independently associated with attendance for screening. This large-scale study demonstrates that the decision aid reduced the participation rate: it activated the decision-making process of women toward non-attendance at screening. These results show the importance of promoting informed patient choices, especially when those choices cannot be anticipated. PMID:26883201
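    The reported group difference (40.25% vs 42.13%, p = 0.02) can be reproduced with a standard two-proportion z-test using only the Python standard library; this is an illustrative check, the trial's own analysis may have used a different test:

```python
import math

def two_proportion_z_test(x1, n1, x2, n2):
    """Two-sided z-test for the difference of two proportions,
    using the pooled standard error; returns (z, p_value)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF via erf
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Intervention arm: 3174/7885 attended; control arm: 3353/7959
z, p = two_proportion_z_test(3174, 7885, 3353, 7959)
print(round(z, 2), round(p, 3))  # z about -2.4, p about 0.02
```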

  14. Arena3D: visualizing time-driven phenotypic differences in biological systems

    Directory of Open Access Journals (Sweden)

    Secrier Maria

    2012-03-01

    Abstract Background: Elucidating the genotype-phenotype connection is one of the big challenges of modern molecular biology. To fully understand this connection, it is necessary to consider the underlying networks and the time factor. In this context of data deluge and heterogeneous information, visualization plays an essential role in interpreting complex and dynamic topologies. Thus, software that is able to bring network, phenotypic and temporal information together is needed. Arena3D has previously been introduced as a tool that facilitates link discovery between processes. It uses a layered display to separate different levels of information while emphasizing the connections between them. We present novel developments of the tool for the visualization and analysis of dynamic genotype-phenotype landscapes. Results: Version 2.0 introduces novel features that allow handling time course data in a phenotypic context. Gene expression levels or other measures can be loaded and visualized at different time points, and phenotypic comparison is facilitated through clustering and correlation display or highlighting of impacting changes through time. Similarity scoring allows the identification of global patterns in dynamic heterogeneous data. In this paper we demonstrate the utility of the tool on two distinct biological problems of different scales. First, we analyze a medium-scale dataset that looks at perturbation effects of the pluripotency regulator Nanog in murine embryonic stem cells. Dynamic cluster analysis suggests alternative indirect links between Nanog and other proteins in the core stem cell network. Moreover, recurrent correlations from the epigenetic to the translational level are identified. Second, we investigate a large-scale dataset consisting of genome-wide knockdown screens for human genes essential in the mitotic process. Here, a potential new role for the gene lsm14a in cytokinesis is suggested. We also show how phenotypic

  15. Evaluation of a large-scale tuberculosis contact investigation in the Netherlands

    NARCIS (Netherlands)

    Borgen, K.; Koster, B.; Meijer, H.; Kuyvenhoven, V.; van der Sande, M.; Cobelens, F.

    2008-01-01

    The aim of the present study was to evaluate the yield and effectiveness of a large-scale contact investigation around a supermarket employee with infectious tuberculosis. Supermarket customers were screened by tuberculin skin test (TST) and/or radiography, depending on individual characteristics.

  16. Disinformative data in large-scale hydrological modelling

    Directory of Open Access Journals (Sweden)

    A. Kauffeldt

    2013-07-01

    Large-scale hydrological modelling has become an important tool for the study of global and regional water resources, climate impacts, and water-resources management. However, modelling efforts over large spatial domains are fraught with problems of data scarcity, uncertainties and inconsistencies between model forcing and evaluation data. Model-independent methods to screen and analyse data for such problems are needed. This study aimed at identifying data inconsistencies in global datasets using a pre-modelling analysis, inconsistencies that can be disinformative for subsequent modelling. The consistency between (i) basin areas for different hydrographic datasets, and (ii) climate data (precipitation and potential evaporation) and discharge data, was examined in terms of how well basin areas were represented in the flow networks and the possibility of water-balance closure. It was found that (i) most basins could be well represented in both gridded basin delineations and polygon-based ones, but some basins exhibited large area discrepancies between flow-network datasets and archived basin areas, (ii) basins exhibiting too-high runoff coefficients were abundant in areas where precipitation data were likely affected by snow undercatch, and (iii) the occurrence of basins exhibiting losses exceeding the potential-evaporation limit was strongly dependent on the potential-evaporation data, both in terms of numbers and geographical distribution. Some inconsistencies may be resolved by considering sub-grid variability in climate data, surface-dependent potential-evaporation estimates, etc., but further studies are needed to determine the reasons for the inconsistencies found. Our results emphasise the need for pre-modelling data analysis to identify dataset inconsistencies as an important first step in any large-scale study. Applying data-screening methods before modelling should also increase our chances to draw robust conclusions from subsequent modelling.
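    The two water-balance consistency checks described here, a runoff coefficient above 1 and losses exceeding the potential-evaporation limit, amount to a simple pre-modelling filter. A sketch under assumed variable names and long-term annual totals (the study's actual screening works on gridded global datasets):

```python
def screen_basins(basins):
    """Flag basins whose long-term water balance is physically
    inconsistent: discharge exceeding precipitation (runoff
    coefficient > 1), or losses P - Q exceeding potential evaporation.
    basins: dict of name -> (P, Q, PET), all in mm/yr."""
    flags = {}
    for name, (precip, discharge, pet) in basins.items():
        problems = []
        if discharge > precip:                 # runoff coefficient > 1
            problems.append("runoff exceeds precipitation")
        elif precip - discharge > pet:         # losses beyond the PET limit
            problems.append("losses exceed potential evaporation")
        flags[name] = problems
    return flags

basins = {
    "plausible":  (800, 300, 600),   # closes within physical limits
    "undercatch": (400, 500, 500),   # Q > P: likely snow undercatch in P
    "lossy":      (900, 100, 500),   # losses 800 mm/yr > PET 500 mm/yr
}
print(screen_basins(basins))
```

Basins flagged this way would be treated as potentially disinformative and inspected before any model calibration or evaluation uses them.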

  17. Large-scale exploration and analysis of drug combinations.

    Science.gov (United States)

    Li, Peng; Huang, Chao; Fu, Yingxue; Wang, Jinan; Wu, Ziyin; Ru, Jinlong; Zheng, Chunli; Guo, Zihu; Chen, Xuetong; Zhou, Wei; Zhang, Wenjuan; Li, Yan; Chen, Jianxin; Lu, Aiping; Wang, Yonghua

    2015-06-15

    Drug combinations are a promising strategy for combating complex diseases by improving efficacy and reducing corresponding side effects. Currently, a widely studied problem in pharmacology is predicting effective drug combinations, either through empirical screening in the clinic or pure experimental trials. However, the large-scale prediction of drug combinations by systems methods has rarely been considered. We report a systems pharmacology framework to predict drug combinations (PreDCs) using a computational model, termed the probability ensemble approach (PEA), for analysis of both the efficacy and adverse effects of drug combinations. First, a Bayesian network integrated with a similarity algorithm is developed to model the combinations from drug molecular and pharmacological phenotypes, and the predictions are then assessed with both clinical efficacy and adverse effects. It is illustrated that PEA can predict the combination efficacy of drugs spanning different therapeutic classes with high specificity and sensitivity (AUC = 0.90), which was further validated by independent data or new experimental assays. PEA also evaluates the adverse effects (AUC = 0.95) quantitatively and detects the therapeutic indications for drug combinations. Finally, the PreDC database includes 1571 known and 3269 predicted optimal combinations as well as their potential side effects and therapeutic indications. The PreDC database is available at http://sm.nwsuaf.edu.cn/lsp/predc.php. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  18. Substantial improvements in large-scale redocking and screening using the novel HYDE scoring function

    Science.gov (United States)

    Schneider, Nadine; Hindle, Sally; Lange, Gudrun; Klein, Robert; Albrecht, Jürgen; Briem, Hans; Beyer, Kristin; Claußen, Holger; Gastreich, Marcus; Lemmen, Christian; Rarey, Matthias

    2012-06-01

    The HYDE scoring function consistently describes hydrogen bonding, the hydrophobic effect and desolvation. It relies on HYdration and DEsolvation terms which are calibrated using octanol/water partition coefficients of small molecules. We do not use affinity data for calibration, therefore HYDE is generally applicable to all protein targets. HYDE reflects the Gibbs free energy of binding while only considering the essential interactions of protein-ligand complexes. The greatest benefit of HYDE is that it yields a very intuitive atom-based score, which can be mapped onto the ligand and protein atoms. This allows the direct visualization of the score and consequently facilitates analysis of protein-ligand complexes during the lead optimization process. In this study, we validated our new scoring function by applying it in large-scale docking experiments. We could successfully predict the correct binding mode in 93% of complexes in redocking calculations on the Astex diverse set, while our performance in virtual screening experiments using the DUD dataset showed significant enrichment values with a mean AUC of 0.77 across all protein targets with little or no structural defects. As part of these studies, we also carried out a very detailed analysis of the data that revealed interesting pitfalls, which we highlight here and which should be addressed in future benchmark datasets.

  19. Towards Automated Large-Scale 3D Phenotyping of Vineyards under Field Conditions

    Directory of Open Access Journals (Sweden)

    Johann Christian Rose

    2016-12-01

    In viticulture, phenotypic data are traditionally collected directly in the field via visual and manual means by an experienced person. This approach is time-consuming, subjective and prone to human error. In recent years, research has therefore focused strongly on developing automated and non-invasive sensor-based methods to increase data acquisition speed, enhance measurement accuracy and objectivity, and reduce labor costs. While many 2D methods based on image processing have been proposed for field phenotyping, only a few 3D solutions are found in the literature. A track-driven vehicle comprising a camera system, a real-time-kinematic GPS system for positioning, and hardware for vehicle control, image storage and acquisition is used to visually capture a whole vine-row canopy with georeferenced RGB images. In the first post-processing step, these images were used within multi-view-stereo software to reconstruct a textured 3D point cloud of the whole grapevine row. A classification algorithm is then used in the second step to automatically classify the raw point cloud data into the semantic plant components, grape bunches and canopy. In the third step, phenotypic data for the semantic objects are gathered from the classification results, yielding the number of grape bunches, the number of berries and the berry diameter.

  20. High-Throughput Screening Using iPSC-Derived Neuronal Progenitors to Identify Compounds Counteracting Epigenetic Gene Silencing in Fragile X Syndrome.

    Science.gov (United States)

    Kaufmann, Markus; Schuffenhauer, Ansgar; Fruh, Isabelle; Klein, Jessica; Thiemeyer, Anke; Rigo, Pierre; Gomez-Mancilla, Baltazar; Heidinger-Millot, Valerie; Bouwmeester, Tewis; Schopfer, Ulrich; Mueller, Matthias; Fodor, Barna D; Cobos-Correa, Amanda

    2015-10-01

    Fragile X syndrome (FXS) is the most common form of inherited mental retardation, and in most cases it is caused by epigenetic silencing of the Fmr1 gene. Today, no specific therapy exists for FXS, and current treatments are only directed at improving behavioral symptoms. Neuronal progenitors derived from FXS patient induced pluripotent stem cells (iPSCs) represent a unique model to study the disease and to develop assays for large-scale drug discovery screens, since they conserve the silenced Fmr1 gene within the disease context. We have established a high-content imaging assay to run a large-scale phenotypic screen aimed at identifying compounds that reactivate the silenced Fmr1 gene. A set of 50,000 compounds was tested, including modulators of several epigenetic targets. We describe an integrated drug discovery model comprising iPSC generation, culture scale-up, quality control, and screening with a very sensitive high-content imaging assay assisted by single-cell image analysis and multiparametric data analysis based on machine learning algorithms. The screening identified several compounds that induced weak expression of fragile X mental retardation protein (FMRP) and thus sets the basis for further large-scale screens to find candidate drugs or targets tackling the underlying mechanism of FXS with potential for therapeutic intervention. © 2015 Society for Laboratory Automation and Screening.
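    Before hits are called in a high-content screen like this, assay quality is conventionally gauged with the Z'-factor computed from positive and negative control wells. A minimal sketch follows; the control values are hypothetical, and the screen described above went further with single-cell machine-learning analysis rather than well-level statistics alone:

```python
def z_prime(mu_pos, sd_pos, mu_neg, sd_neg):
    """Z'-factor of an assay window:
        Z' = 1 - 3 * (sd_pos + sd_neg) / |mu_pos - mu_neg|
    Values above ~0.5 conventionally indicate an excellent assay."""
    return 1.0 - 3.0 * (sd_pos + sd_neg) / abs(mu_pos - mu_neg)

# Hypothetical FMRP-intensity controls: reactivated vs silenced wells
print(z_prime(mu_pos=100.0, sd_pos=10.0, mu_neg=20.0, sd_neg=5.0))  # 0.4375
```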

  1. A powerful nonparametric method for detecting differentially co-expressed genes: distance correlation screening and edge-count test.

    Science.gov (United States)

    Zhang, Qingyang

    2018-05-16

    Differential co-expression analysis, as a complement to differential expression analysis, offers significant insights into the changes in the molecular mechanisms of different phenotypes. A prevailing approach to detecting differentially co-expressed genes is to compare Pearson's correlation coefficients in two phenotypes. However, due to the limitations of Pearson's correlation measure, this approach lacks the power to detect nonlinear changes in gene co-expression, which are common in gene regulatory networks. In this work, a new nonparametric procedure is proposed to search for differentially co-expressed gene pairs in different phenotypes in large-scale data. Our computational pipeline consists of two main steps, a screening step and a testing step. The screening step reduces the search space by filtering out all the independent gene pairs using the distance correlation measure. In the testing step, we compare the gene co-expression patterns in different phenotypes with a recently developed edge-count test. Both steps are distribution-free and target nonlinear relations. We illustrate the promise of the new approach by analyzing The Cancer Genome Atlas data and the METABRIC data for breast cancer subtypes. Compared with some existing methods, the new method is more powerful in detecting nonlinear types of differential co-expression. The distance correlation screening can greatly improve computational efficiency, facilitating its application to large datasets.
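    The screening statistic here, distance correlation, captures exactly the nonlinear dependence that Pearson's r misses. A minimal numpy implementation of the sample distance correlation (via double-centered distance matrices), contrasted with Pearson on a purely quadratic relation:

```python
import numpy as np

def distance_correlation(x, y):
    """Sample distance correlation of two 1-D samples. It is zero (in the
    population) iff x and y are independent, so it can screen out truly
    independent gene pairs while retaining nonlinearly dependent ones."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    a = np.abs(x[:, None] - x[None, :])                  # pairwise distances
    b = np.abs(y[:, None] - y[None, :])
    A = a - a.mean(0) - a.mean(1)[:, None] + a.mean()    # double centering
    B = b - b.mean(0) - b.mean(1)[:, None] + b.mean()
    dcov2 = (A * B).mean()
    dvar_x, dvar_y = (A * A).mean(), (B * B).mean()
    return float(np.sqrt(dcov2 / np.sqrt(dvar_x * dvar_y)))

# A purely nonlinear relation: Pearson's r is ~0, distance correlation is not
x = np.linspace(-1, 1, 101)
y = x ** 2
print(round(np.corrcoef(x, y)[0, 1], 3), round(distance_correlation(x, y), 3))
```

A gene pair like (x, y) above would be discarded by a Pearson-based filter but survives the distance correlation screen and proceeds to the edge-count test.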

  2. Liver diseases: A major, neglected global public health problem requiring urgent actions and large-scale screening.

    Science.gov (United States)

    Marcellin, Patrick; Kutala, Blaise K

    2018-02-01

    Chronic liver diseases (CLDs) represent an important, and certainly underestimated, global public health problem. CLDs are highly prevalent and silent, and are related to different, sometimes associated, causes. The distribution of the causes of these diseases is slowly changing, and within the next decade the proportion of virus-induced CLDs will certainly decrease significantly while the proportion of nonalcoholic steatohepatitis (NASH) will increase. There is an urgent need for effective global actions, including education, prevention, and early diagnosis, to manage and treat CLDs, thus preventing cirrhosis-related morbidity and mortality. Our role is to increase the awareness of the public, healthcare professionals, and public health authorities to encourage active policies for early management that will decrease the short- and long-term public health burden of these diseases. Because necroinflammation is the key mechanism in the progression of CLDs, it should be detected early. Thus, large-scale screening for CLDs is needed. Alanine aminotransferase (ALT) levels are an easy and inexpensive marker of liver necroinflammation and could be the first-line tool in this process. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  3. Phenotypic Screening Approaches to Develop Aurora Kinase Inhibitors: Drug Discovery Perspectives.

    Science.gov (United States)

    Marugán, Carlos; Torres, Raquel; Lallena, María José

    2015-01-01

    Targeting mitotic regulators as a strategy to fight cancer implies the development of drugs against key proteins, such as Aurora-A and -B. Current drugs, which target mitosis through a general mechanism of action (stabilization/destabilization of microtubules), have several side effects (neutropenia, alopecia, and emesis). Pharmaceutical companies aim at avoiding these unwanted effects by generating improved and selective drugs that increase the quality of life of the patients. However, the development of these drugs is an ambitious task that involves testing thousands of compounds through biochemical and cell-based assays. In addition, molecules usually target complex biological processes, involving several proteins and different molecular pathways, further emphasizing the need for high-throughput screening techniques and multiplexing technologies in order to identify drugs with the desired phenotype. We will briefly describe two multiplexing technologies [high-content imaging (HCI) and flow cytometry] and two key processes for drug discovery research (assay development and validation) following our own published industry quality standards. We will further focus on HCI as a useful tool for phenotypic screening and will provide a concrete example of HCI assay to detect Aurora-A or -B selective inhibitors discriminating the off-target effects related to the inhibition of other cell cycle or non-cell cycle key regulators. Finally, we will describe other assays that can help to characterize the in vitro pharmacology of the inhibitors.

  4. MicroCT-based phenomics in the zebrafish skeleton reveals virtues of deep phenotyping in a distributed organ system.

    Science.gov (United States)

    Hur, Matthew; Gistelinck, Charlotte A; Huber, Philippe; Lee, Jane; Thompson, Marjorie H; Monstad-Rios, Adrian T; Watson, Claire J; McMenamin, Sarah K; Willaert, Andy; Parichy, David M; Coucke, Paul; Kwon, Ronald Y

    2017-09-08

    Phenomics, which ideally involves in-depth phenotyping at the whole-organism scale, may enhance our functional understanding of genetic variation. Here, we demonstrate methods to profile hundreds of phenotypic measures, composed of morphological and densitometric traits, at a large number of sites within the axial skeleton of adult zebrafish. We show the potential for vertebral patterns to confer heightened sensitivity, with similar specificity, in discriminating mutant populations compared to analyzing individual vertebrae in isolation. We identify phenotypes associated with human brittle bone disease and thyroid stimulating hormone receptor hyperactivity. Finally, we develop allometric models and show their potential to aid in the discrimination of mutant phenotypes masked by alterations in growth. Our studies demonstrate virtues of deep phenotyping in a spatially distributed organ system. Analyzing phenotypic patterns may increase productivity in genetic screens, and facilitate the study of genetic variants associated with smaller effect sizes, such as those that underlie complex diseases.
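
Allometric correction of the kind described above is, at its simplest, a power-law regression: fit trait ≈ a·length^b on a reference population in log-log space, then score individuals by their residuals so that growth differences do not masquerade as skeletal phenotypes. A minimal sketch (ours, not the authors' models; the data below are synthetic):

```python
import math

def fit_allometry(lengths, traits):
    """Least-squares fit of log(trait) = log(a) + b * log(length)."""
    xs = [math.log(l) for l in lengths]
    ys = [math.log(t) for t in traits]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    log_a = my - b * mx
    return log_a, b

def allometric_residual(length, trait, log_a, b):
    """Log-scale residual; positive means the trait is larger than
    expected for this body size."""
    return math.log(trait) - (log_a + b * math.log(length))
```

A mutant whose trait is 50% above the allometric expectation at its own body length gets a residual of log(1.5) regardless of how large or small the fish is, which is the point of the correction.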

  5. Imaging techniques for visualizing and phenotyping congenital heart defects in murine models.

    Science.gov (United States)

    Liu, Xiaoqin; Tobita, Kimimasa; Francis, Richard J B; Lo, Cecilia W

    2013-06-01

    The mouse model is ideal for investigating the genetic and developmental etiology of congenital heart disease. However, cardiovascular phenotyping for the precise diagnosis of structural heart defects in mice remains challenging. With rapid advances in imaging techniques, high-throughput phenotyping tools are now available for the diagnosis of structural heart defects. In this review, we discuss the efficacy of four different imaging modalities for congenital heart disease diagnosis in fetal/neonatal mice: noninvasive fetal echocardiography, micro-computed tomography (micro-CT), micro-magnetic resonance imaging (micro-MRI), and episcopic fluorescence image capture (EFIC) histopathology. The experience we have gained in using these imaging modalities in a large-scale mouse mutagenesis screen has validated their efficacy for congenital heart defect diagnosis in the tiny hearts of fetal and newborn mice. These cutting-edge phenotyping tools will be invaluable for furthering our understanding of the developmental etiology of congenital heart disease. Copyright © 2013 Wiley Periodicals, Inc.

  6. [Phenotype-based primary screening for drugs promoting neuronal subtype differentiation in embryonic stem cells with light microscope].

    Science.gov (United States)

    Gao, Yi-ning; Wang, Dan-ying; Pan, Zong-fu; Mei, Yu-qin; Wang, Zhi-qiang; Zhu, Dan-yan; Lou, Yi-jia

    2012-07-01

    To set up a platform for phenotype-based primary screening, with a light microscope, of drug candidates that promote neuronal subtype differentiation in embryonic stem (ES) cells. The hanging-drop culture 4-/4+ method was employed to harvest the cells around the embryoid body (EB) at the differentiation endpoint. Morphological evaluation of neuron-like cells was performed with a light microscope; cells with axons longer than three times the length of the cell body were considered neuron-like. Compounds that promoted neuron-like cells were further evaluated. Icariin (ICA, 10⁻⁶ mol/L) and isobavachin (IBA, 10⁻⁷ mol/L) were selected to screen for differentiation-promoting activity on ES cells. Immunofluorescence staining with specific antibodies (ChAT, GABA) was used to evaluate the neuron subtypes. Cells treated with IBA showed a neuron-like phenotype, whereas cells treated with ICA did not exhibit these morphological changes. ES cells treated with IBA were further confirmed to differentiate into cholinergic and GABAergic neurons. Phenotypic screening with a light microscope for molecules promoting neuronal differentiation is an effective method with the advantages of lower labor, material, and time costs, and it avoids false-positive results derived from immunofluorescence. The method confirms that IBA is able to facilitate differentiation of ES cells into neuronal cells, including cholinergic and GABAergic neurons.

  7. Large Scale Gaussian Processes for Atmospheric Parameter Retrieval and Cloud Screening

    Science.gov (United States)

    Camps-Valls, G.; Gomez-Chova, L.; Mateo, G.; Laparra, V.; Perez-Suay, A.; Munoz-Mari, J.

    2017-12-01

    Current Earth-observation (EO) applications for image classification have to deal with an unprecedented amount of heterogeneous and complex data sources. Spatio-temporally explicit classification methods are a requirement in a variety of Earth system data processing applications. Upcoming missions such as the super-spectral Copernicus Sentinels, EnMAP, and FLEX will soon provide unprecedented data streams. Very-high-resolution (VHR) sensors like WorldView-3 also pose big challenges to data processing. The challenge is not only attached to optical sensors but also to infrared sounders and radar images, which have increased in spectral, spatial, and temporal resolution. Besides, we should not forget the extremely large remote sensing data archives already collected by several past missions, such as ENVISAT, Cosmo-SkyMED, Landsat, SPOT, and Seviri/MSG. These large-scale data problems require enhanced processing techniques that are accurate, robust, and fast. Standard parameter retrieval and classification algorithms cannot cope with this new scenario efficiently. In this work, we review the field of large-scale kernel methods for both atmospheric parameter retrieval and cloud detection using infrared sounding IASI data and optical Seviri/MSG imagery. We propose novel Gaussian Processes (GPs) to train problems with millions of instances and a high number of input features. The algorithms cope with non-linearities efficiently, accommodate multi-output problems, and provide confidence intervals for the predictions. Several strategies to speed up the algorithms are devised: random Fourier features and variational approaches for cloud classification using IASI data and Seviri/MSG, and engineered randomized kernel functions and emulation for temperature, moisture, and ozone atmospheric-profile retrieval from IASI as a proxy for the upcoming MTG-IRS sensor. An excellent compromise between accuracy and scalability is obtained in all applications.
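
One of the speed-up strategies mentioned, random Fourier features, replaces each RBF kernel evaluation with an inner product of low-dimensional randomized features, so kernel methods can scale to millions of instances. A minimal toy sketch (ours, not the authors' implementation; dimensions and bandwidth are illustrative) of the Rahimi-Recht construction:

```python
import math
import random

def sample_rff(d, D, sigma, rng):
    """Draw D spectral frequencies W and phases b approximating the RBF
    kernel k(x, y) = exp(-||x - y||^2 / (2 * sigma^2))."""
    W = [[rng.gauss(0.0, 1.0 / sigma) for _ in range(d)] for _ in range(D)]
    b = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(D)]
    return W, b

def rff_map(x, W, b):
    """Feature map z(x) such that z(x) . z(y) ~ k(x, y)."""
    D = len(b)
    scale = math.sqrt(2.0 / D)
    return [scale * math.cos(sum(wi * xi for wi, xi in zip(w, x)) + bi)
            for w, bi in zip(W, b)]

def rbf(x, y, sigma):
    """Exact RBF kernel, for checking the approximation."""
    sq = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-sq / (2.0 * sigma ** 2))
```

With D on the order of a few thousand features, the inner product of the mapped vectors tracks the exact kernel to within a few percent, and downstream regression or classification becomes linear in the number of samples.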

  8. Large-scale solar purchasing

    International Nuclear Information System (INIS)

    1999-01-01

    The principal objective of the project was to participate in the definition of a new IEA task concerning solar procurement ("the Task") and to assess whether involvement in the task would be in the interest of the UK active solar heating industry. The project also aimed to assess the importance of large scale solar purchasing to UK active solar heating market development and to evaluate the level of interest in large scale solar purchasing amongst potential large scale purchasers (in particular housing associations and housing developers). A further aim of the project was to consider means of stimulating large scale active solar heating purchasing activity within the UK. (author)

  10. Assessing the Accuracy of the Modified Chinese Autism Spectrum Rating Scale and Social Responsiveness Scale for Screening Autism Spectrum Disorder in Chinese Children

    Institute of Scientific and Technical Information of China (English)

    Bingrui Zhou; Hao Zhou; Lijie Wu; Xiaobing Zou; Xuerong Luo; Eric Fombonne; Yi Wang; Weili Yan; Xiu Xu

    2017-01-01

    The reported prevalence of autism spectrum disorder (ASD) has been increasing rapidly in many parts of the world. However, data on its prevalence in China are largely missing. Here, we assessed the suitability of the modified Chinese version of a newly-developed ASD screening tool, the Modified Chinese Autism Spectrum Rating Scales (MC-ASRS), in screening for ASD in Chinese children aged 6-12 years, through comparison with the Social Responsiveness Scale (SRS) that has been widely used for ASD screening. We recruited the parents/caregivers of 1588 typically-developing children and 190 children with ASD aged 6-12 years to complete the MC-ASRS and SRS, and evaluated the validity of both scales in discriminating children with ASD from those developing typically. The results showed that MC-ASRS performed as well as SRS in sensitivity, specificity, and area under the curve (both >0.95) in receiver operating characteristic analysis, with a fair false-negative rate. These results suggest that MC-ASRS is a promising tool for screening for children with ASD in the general Chinese population.
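
The sensitivity, specificity, and area-under-the-curve figures reported above are straightforward to compute from raw screening scores. A small sketch (ours, with made-up scores, not the study's data): AUC via the rank (Mann-Whitney) formulation, plus sensitivity/specificity at a fixed cutoff.

```python
def auc(pos_scores, neg_scores):
    """Probability that a random case outranks a random control;
    ties count as half a win. Equivalent to the area under the ROC curve."""
    wins = sum((p > n) + 0.5 * (p == n) for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

def sens_spec(pos_scores, neg_scores, cutoff):
    """Sensitivity and specificity when scores >= cutoff are called positive."""
    sens = sum(s >= cutoff for s in pos_scores) / len(pos_scores)
    spec = sum(s < cutoff for s in neg_scores) / len(neg_scores)
    return sens, spec
```

An AUC of 1.0 means the two score distributions are perfectly separated; 0.5 means the screen is no better than chance.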

  11. Large-Scale Off-Target Identification Using Fast and Accurate Dual Regularized One-Class Collaborative Filtering and Its Application to Drug Repurposing.

    Directory of Open Access Journals (Sweden)

    Hansaim Lim

    2016-10-01

    Target-based screening is one of the major approaches in drug discovery. Besides the intended target, unexpected drug off-target interactions often occur, and many of them have not been recognized and characterized. Off-target interactions can be responsible for either therapeutic or side effects. Thus, identifying the genome-wide off-targets of lead compounds or existing drugs will be critical for designing effective and safe drugs, and will provide new opportunities for drug repurposing. Although many computational methods have been developed to predict drug-target interactions, they are either less accurate than the method proposed here or computationally too intensive, limiting their capability for large-scale off-target identification. In addition, the performance of most machine-learning-based algorithms has mainly been evaluated on off-target interactions within the same gene family for hundreds of chemicals; it is not clear how these algorithms perform in detecting off-targets across gene families on a proteome scale. Here, we present REMAP, a fast and accurate off-target prediction method based on a dual regularized one-class collaborative filtering algorithm, to explore continuous chemical space, protein space, and their interactome on a large scale. When tested on a reliable, extensive, cross-gene-family benchmark, REMAP outperforms the state-of-the-art methods. Furthermore, REMAP is highly scalable: it can screen a dataset of 200,000 chemicals against 20,000 proteins within 2 hours. Using the reconstructed genome-wide target profile as the fingerprint of a chemical compound, we predicted that seven FDA-approved drugs could be repurposed as novel anti-cancer therapies; the anti-cancer activity of six of them is supported by experimental evidence. Thus, REMAP is a valuable addition to the existing in silico toolbox for drug target identification and drug repurposing.
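
REMAP itself is not reproduced here, but the core idea of one-class collaborative filtering, a low-rank factorization of the drug-target matrix in which unobserved pairs get a small nonzero weight rather than being treated as confirmed negatives, can be sketched in a few lines. The toy matrix, weights, and hyperparameters below are our own illustrative choices, not REMAP's:

```python
import random

def factorize_one_class(R, k=2, c=0.1, lam=0.01, lr=0.05, epochs=800, seed=0):
    """Weighted low-rank factorization of a binary interaction matrix:
    observed pairs (1s) get weight 1.0, unobserved pairs weight c << 1."""
    rng = random.Random(seed)
    n, m = len(R), len(R[0])
    U = [[rng.gauss(0.0, 0.1) for _ in range(k)] for _ in range(n)]
    V = [[rng.gauss(0.0, 0.1) for _ in range(k)] for _ in range(m)]
    for _ in range(epochs):
        for i in range(n):
            for j in range(m):
                w = 1.0 if R[i][j] else c
                err = R[i][j] - sum(U[i][f] * V[j][f] for f in range(k))
                for f in range(k):
                    uif, vjf = U[i][f], V[j][f]
                    U[i][f] += lr * (w * err * vjf - lam * uif)
                    V[j][f] += lr * (w * err * uif - lam * vjf)
    return U, V

def score(U, V, i, j):
    """Reconstructed interaction score for drug i and target j."""
    return sum(U[i][f] * V[j][f] for f in range(len(U[i])))
```

Holding out a true drug-target pair and checking that its reconstructed score beats an unrelated pair is, in miniature, the same logic as the cross-gene-family benchmark described above.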

  13. A Novel Yeast Surface Display Method for Large-Scale Screen Inhibitors of Sortase A.

    Science.gov (United States)

    Wu, Lin; Li, Huijun; Tang, Tianle

    2017-01-24

    …suitable for high-throughput analysis, but the conventional method is much more sensitive. The method described in this paper is expected to enable large-scale screening of sortase A inhibitors, which can be used to decrease the risk of drug-resistance development.

  15. Large-scale systematic analysis of 2D fingerprint methods and parameters to improve virtual screening enrichments.

    Science.gov (United States)

    Sastry, Madhavi; Lowrie, Jeffrey F; Dixon, Steven L; Sherman, Woody

    2010-05-24

    A systematic virtual screening study on 11 pharmaceutically relevant targets has been conducted to investigate the interrelation between 8 two-dimensional (2D) fingerprinting methods, 13 atom-typing schemes, 13 bit scaling rules, and 12 similarity metrics using the new cheminformatics package Canvas. In total, 157,872 virtual screens were performed to assess the ability of each combination of parameters to identify actives in a database screen. In general, fingerprint methods such as MOLPRINT2D, Radial, and Dendritic that encode information about local environment beyond simple linear paths outperformed other fingerprint methods. Atom-typing schemes with more specific information, such as Daylight, Mol2, and Carhart, were generally superior to more generic atom-typing schemes. Enrichment factors across all targets were improved considerably with the best settings, although no single set of parameters performed optimally on all targets. The size of the addressable bit space for the fingerprints was also explored, and it was found to have a substantial impact on enrichments. Small bit spaces, such as 1024, resulted in many collisions and in a significant degradation in enrichments compared to larger bit spaces that avoid collisions.
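
The collision effect described above is easy to demonstrate: hashing features into a small fixed bit space forces unrelated molecules to share bits, inflating their similarity. A toy sketch (ours, not Canvas code; character n-grams stand in for real atom-path features, and the strings are arbitrary):

```python
import zlib

def ngram_features(s, max_len=3):
    """Toy stand-in for linear-path fingerprint features: all n-grams up to max_len."""
    return {s[i:i + l] for i in range(len(s))
            for l in range(1, max_len + 1) if i + l <= len(s)}

def fold(features, nbits):
    """Hash each feature into a fixed bit space; collisions merge distinct features."""
    return {zlib.crc32(f.encode()) % nbits for f in features}

def tanimoto(a, b):
    """Tanimoto (Jaccard) similarity of two bit sets."""
    return len(a & b) / len(a | b) if a | b else 1.0
```

Folding into 16 bits makes two unrelated strings look far more similar than folding into 4096 bits, mirroring the enrichment degradation the study observed with small (e.g., 1024-bit) spaces.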

  16. Human factors guidelines for large-screen displays

    International Nuclear Information System (INIS)

    Collier, Steve

    2005-09-01

    Any control-room project (including upgrades or evolutionary improvements to existing control-rooms) is well advised at the outset first to gather and update related background material for the design. This information-gathering exercise should also take into account experience from similar projects and operating experience. For these reasons, we decided to use our research, and experience in large-screen display design with several clients to update human factors guidance for large-screen displays, to take into account new ergonomics guidelines, operating experience, and work from similar projects. To write the updated guidelines, we drew on much of our experience across several departments at IFE, including research funded by the HRP programme, and experience with individual clients. Guidance here is accordingly focused mainly on recent areas of technical and human innovations in the man-machine interface. One particular area of focus was on the increasing use of large-screen display systems in modern control-rooms, and on how guidelines could be adapted and supplemented for their design. Guidance or reference to recommended sources is also given for control suite arrangement and layout, control-room layout, workstation layout, design of displays and controls, and design of the work environment, especially insofar as these ergonomic issues interact with the effectiveness of modern displays, in particular large screen displays. The work shows that there can be synergy between HRP research and bilateral activities: the one side offers a capability to develop tools and guidelines, while the other side gives an opportunity to test and refine these in practice, to the benefit of both parties. (Author)

  18. Effectiveness of a Web-Based Protocol for the Screening and Phenotyping of Individuals with Tourette Syndrome for Genetic Studies

    Science.gov (United States)

    Egan, Crystelle; Marakovitz, Susan; O’Rourke, Julia; Osiecki, Lisa; Illmann, Cornelia; Barton, Lauren; McLaughlin, Elizabeth; Proujansky, Rachel; Royal, Justin; Cowley, Heather; Rangel-Lugo, Martha; Pauls, David; Scharf, Jeremiah M.; Mathews, Carol A.

    2014-01-01

    Genome-wide association studies (GWAS) and other emerging technologies offer great promise for the identification of genetic risk factors for complex psychiatric disorders, yet such studies are constrained by the need for large sample sizes. Web-based collection offers a relatively untapped resource for increasing participant recruitment. Therefore, we developed and implemented a novel web-based screening and phenotyping protocol for genetic studies of Tourette Syndrome (TS), a childhood-onset neuropsychiatric disorder characterized by motor and vocal tics. Participants were recruited over a 13 month period through the membership of the Tourette Syndrome Association (TSA) (n=28,878). Of the TSA members contacted, 4.3% (1,242) initiated the questionnaire, and 79.5% (987) of these were enrollment eligible. 63.9% (631) of enrolled participants completed the study by submitting phenotypic data and blood specimens. Age was the only variable that predicted study completion; children and young adults were significantly less likely to be study completers than adults 26 and older. Compared to a clinic-based study conducted over the same time period, the web-based method yielded a 60% larger sample. Web-based participants were older and more often female; otherwise, the sample characteristics did not differ significantly. TS diagnoses based on the web-screen demonstrated 100% accuracy compared to those derived from in-depth clinical interviews. Our results suggest that a web-based approach is effective for increasing the sample size for genetic studies of a relatively rare disorder and that our web-based screen is valid for diagnosing TS. Findings from this study should aid in the development of web-based protocols for other disorders. PMID:23090870

  19. Public support for neonatal screening for Pompe disease, a broad-phenotype condition

    Directory of Open Access Journals (Sweden)

    Weinreich Stephanie

    2012-03-01

    screening for Pompe disease, not only among those who have personal experience of the disease but also among the general public in the Netherlands. Optional screening on the basis of informed parental consent is probably unrealistic, underlining the need for new guidelines to help policymakers in their consideration of newborn screening for broad phenotype conditions.

  20. Energy transfers in large-scale and small-scale dynamos

    Science.gov (United States)

    Samtaney, Ravi; Kumar, Rohit; Verma, Mahendra

    2015-11-01

    We present the energy transfers, mainly energy fluxes and shell-to-shell energy transfers, in small-scale dynamo (SSD) and large-scale dynamo (LSD) regimes using numerical simulations of MHD turbulence for Pm = 20 (SSD) and Pm = 0.2 (LSD) on a 1024³ grid. For SSD, we demonstrate that the magnetic energy growth is caused by nonlocal energy transfers from the large-scale or forcing-scale velocity field to the small-scale magnetic field. The peak of these energy transfers moves towards lower wavenumbers as the dynamo evolves, which is the reason for the growth of the magnetic field at large scales. The energy transfers U2U (velocity to velocity) and B2B (magnetic to magnetic) are forward and local. For LSD, we show that the magnetic energy growth takes place via energy transfers from the large-scale velocity field to the large-scale magnetic field. We observe forward U2U and B2B energy fluxes, similar to SSD.

  1. Quality Control Test for Sequence-Phenotype Assignments

    Science.gov (United States)

    Ortiz, Maria Teresa Lara; Rosario, Pablo Benjamín Leon; Luna-Nevarez, Pablo; Gamez, Alba Savin; Martínez-del Campo, Ana; Del Rio, Gabriel

    2015-01-01

    Relating a gene mutation to a phenotype is a common task in disciplines such as protein biochemistry. In this endeavour, it is common to find false relationships arising from mutations introduced by cells; these may be filtered out using a phenotypic assay, yet such phenotypic assays may introduce additional false relationships arising from experimental errors. Here we introduce the use of high-throughput DNA sequencers and statistical analysis aimed at identifying incorrect DNA sequence-phenotype assignments, and we observed that 10–20% of these false assignments are expected in large screenings aimed at identifying critical residues for protein function. We further show that this level of incorrect DNA sequence-phenotype assignments may significantly alter our understanding of the structure-function relationship of proteins. We have made an implementation of our method available at http://bis.ifc.unam.mx/en/software/chispas. PMID:25700273

  2. Managing sensitive phenotypic data and biomaterial in large-scale collaborative psychiatric genetic research projects: practical considerations.

    Science.gov (United States)

    Demiroglu, S Y; Skrowny, D; Quade, M; Schwanke, J; Budde, M; Gullatz, V; Reich-Erkelenz, D; Jakob, J J; Falkai, P; Rienhoff, O; Helbing, K; Heilbronner, U; Schulze, T G

    2012-12-01

    Large-scale collaborative research will be a hallmark of future psychiatric genetic research. Ideally, both academic and non-academic institutions should be able to participate in such collaborations to allow for the establishment of very large samples in a straightforward manner. Any such endeavor requires an easy-to-implement information technology (IT) framework. Here we present the requirements for a centralized framework and describe how they can be met through a modular IT toolbox.

  3. Scaling up high throughput field phenotyping of corn and soy research plots using ground rovers

    Science.gov (United States)

    Peshlov, Boyan; Nakarmi, Akash; Baldwin, Steven; Essner, Scott; French, Jasenka

    2017-05-01

    Crop improvement programs require large and meticulous selection processes that effectively and accurately collect and analyze data, both to generate quality plant products as efficiently as possible and to develop superior cropping and/or crop-improvement methods. Typically, data collection for such testing is performed by field teams using hand-held instruments or manually controlled devices. Although steps are taken to reduce error, data collected in such a manner can be unreliable due to human error and fatigue, which reduces the ability to make accurate selection decisions. Monsanto engineering teams have developed a high-clearance mobile platform (Rover) as a step towards high-throughput, high-accuracy phenotyping at an industrial scale. The rovers are equipped with GPS navigation, multiple cameras and sensors, and on-board computers to acquire data and compute plant-vigor metrics per plot. The supporting IT systems enable automatic path planning, plot identification, image and point-cloud data QA/QC, and near-real-time analysis, with results streamed to enterprise databases for additional statistical analysis and product-advancement decisions. Since the rover program was launched in North America in 2013, the number of research plots we can analyze in a growing season has expanded dramatically. This work describes some of the successes and challenges in scaling up the rover platform for automated phenotyping to enable science at scale.

  4. Using tumor phenotype, histological tumor distribution, and mammographic appearance to explain the survival differences between screen-detected and clinically detected breast cancers.

    Science.gov (United States)

    Chuang, Shu-Lin; Chen, Sam Li-Sheng; Yu, Cheng-Ping; Chang, King-Jen; Yen, Amy Ming-Fang; Chiu, Sherry Yueh-Hsia; Fann, Jean Ching-Yuan; Tabár, László; Duffy, Stephen W; Smith, Robert A; Chen, Hsiu-Hsi

    2014-08-01

    In the era of mass screening for breast cancer with mammography, it has been noted that conventional tumor attributes and mammographic appearance are insufficient to account for the better prognosis of screen-detected tumors. Such prognostication may require additional updated pathological information regarding tumor phenotype (e.g., basal status) and histological tumor distribution (focality). We investigated this hypothesis using a Bayesian approach to analyze breast cancer data from Dalarna County, Sweden. We used data for tumors diagnosed in the Swedish Two-County Trial and early service screening period, 1977-1995, and from the mature service screening period, 1996-1998. In the early period of mammographic screening (1977-1995), the crude hazard ratio (HR) of breast cancer death for screen-detected cases compared with symptomatic ones was 0.22 (95% CI: 0.17-0.29) compared with 0.53 (95% CI: 0.34-0.76) when adjusted for conventional tumor attributes only. Using the data from the mature service screening period, 1996-1998, the HR was 0.23 (95% CI: 0.08-0.44) unadjusted and 0.71 (95% CI: 0.26-1.47) after adjustment for tumor phenotype, mammographic appearance, histological tumor distribution, and conventional tumor attributes. The area under the ROC curve (AUC) for the prediction of breast cancer deaths using these variables without the detection mode was 0.82, only slightly less than that observed when additionally including the detection mode (AUC=0.83). Using Freedman statistics, conventional tumor attributes and mammographic appearances explained 58% (95% CI: 57.5-58.6%) of the difference in breast cancer survival between the screen-detected and the clinically detected breast cancers, whereas the corresponding figure increased to 77% (95% CI: 75.6-77.6%) when information on tumor phenotype and histological tumor distribution was added. The results indicated that conventional tumor attributes and mammographic appearance are not sufficient to be

  5. Large-scale data analytics

    CERN Document Server

    Gkoulalas-Divanis, Aris

    2014-01-01

    Provides cutting-edge research in large-scale data analytics from diverse scientific areas; surveys varied subject areas and reports on individual results of research in the field; shares many tips and insights into large-scale data analytics from authors and editors with long-term experience and specialization in the field.

  6. Large-scale grid management

    International Nuclear Information System (INIS)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-01-01

    The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) a broader focus (Broad Band, Multi Utility, ...) and (2) bigger units with large networks and more customers. Research by SINTEF Energy Research shows, so far, that approaches to large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series.

  7. Psychometric properties of the Vulnerability to Abuse Screening Scale for screening abuse of older adults

    Directory of Open Access Journals (Sweden)

    Raquel Batista Dantas

    ABSTRACT OBJECTIVE To adapt and evaluate the psychometric properties of the Vulnerability to Abuse Screening Scale to identify risk of domestic violence against older adults in Brazil. METHODS The instrument was adapted and validated in a sample of 151 older adults from a geriatric reference center in the municipality of Belo Horizonte, State of Minas Gerais, in 2014. We collected sociodemographic, clinical, and abuse-related information, and verified reliability by reproducibility in a sample of 55 older people, who underwent re-testing of the instrument seven days after the first application. Descriptive and comparative analyses were performed for all variables, with a significance level of 5%. The construct validity was analyzed by the principal components method with a tetrachoric correlation matrix, the reliability of the scale by the weighted Kappa (Kp) statistic, and the internal consistency by the Kuder-Richardson formula 20 (KR-20) estimator. RESULTS The average age of the participants was 72.1 years (SD = 6.96; 95%CI 70.94–73.17), with a maximum of 92 years, and they were predominantly female (76.2%; 95%CI 69.82–83.03). When analyzing the relationship between the scores of the Vulnerability to Abuse Screening Scale, categorized by presence (score > 3) or absence (score < 3) of vulnerability to abuse, and clinical and health conditions, we found statistically significant differences for self-perception of health (p = 0.002), depressive symptoms (p < 0.001), and presence of rheumatism (p = 0.003). There were no statistically significant differences between sexes. The Vulnerability to Abuse Screening Scale showed acceptable validity in the cross-cultural adaptation process, demonstrating dimensionality coherent with the original proposal (four factors). In the internal consistency analysis, the instrument presented good results (KR-20 = 0.69), and reliability via reproducibility was considered excellent for the global scale (Kp = 0
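    The KR-20 estimator reported above (KR-20 = 0.69) has a simple closed form for dichotomous items. A minimal sketch with toy data (not the study's); the function name and example responses are illustrative:

```python
def kr20(responses):
    """Kuder-Richardson formula 20 for dichotomous (0/1) items.

    responses: list of per-subject lists, each with k 0/1 item scores.
    KR-20 = (k/(k-1)) * (1 - sum(p_i * q_i) / var(total scores)).
    """
    n = len(responses)
    k = len(responses[0])
    # Per-item proportion of positive answers (p) times its complement (q).
    pq_sum = 0.0
    for i in range(k):
        p = sum(subj[i] for subj in responses) / n
        pq_sum += p * (1 - p)
    totals = [sum(subj) for subj in responses]
    mean = sum(totals) / n
    var_total = sum((t - mean) ** 2 for t in totals) / n  # population variance
    return (k / (k - 1)) * (1 - pq_sum / var_total)
```

Perfectly consistent response patterns yield KR-20 = 1; values around 0.7, as here, indicate acceptable internal consistency.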

  8. EuroPhenome and EMPReSS: online mouse phenotyping resource.

    Science.gov (United States)

    Mallon, Ann-Marie; Blake, Andrew; Hancock, John M

    2008-01-01

    EuroPhenome (http://www.europhenome.org) and EMPReSS (http://empress.har.mrc.ac.uk/) form an integrated resource to provide access to data and procedures for mouse phenotyping. EMPReSS describes 96 Standard Operating Procedures for mouse phenotyping. EuroPhenome contains data resulting from carrying out EMPReSS protocols on four inbred laboratory mouse strains. As well as web interfaces, both resources support web services to enable integration with other mouse phenotyping and functional genetics resources, and are committed to initiatives to improve integration of mouse phenotype databases. EuroPhenome will be the repository for a recently initiated effort to carry out large-scale phenotyping on a large number of knockout mouse lines (EUMODIC).

  9. Disease modeling and phenotypic drug screening for diabetic cardiomyopathy using human induced pluripotent stem cells.

    Science.gov (United States)

    Drawnel, Faye M; Boccardo, Stefano; Prummer, Michael; Delobel, Frédéric; Graff, Alexandra; Weber, Michael; Gérard, Régine; Badi, Laura; Kam-Thong, Tony; Bu, Lei; Jiang, Xin; Hoflack, Jean-Christophe; Kiialainen, Anna; Jeworutzki, Elena; Aoyama, Natsuyo; Carlson, Coby; Burcin, Mark; Gromo, Gianni; Boehringer, Markus; Stahlberg, Henning; Hall, Benjamin J; Magnone, Maria Chiara; Kolaja, Kyle; Chien, Kenneth R; Bailly, Jacques; Iacone, Roberto

    2014-11-06

    Diabetic cardiomyopathy is a complication of type 2 diabetes, with known contributions of lifestyle and genetics. We develop environmentally and genetically driven in vitro models of the condition using human-induced-pluripotent-stem-cell-derived cardiomyocytes. First, we mimic diabetic clinical chemistry to induce a phenotypic surrogate of diabetic cardiomyopathy, observing structural and functional disarray. Next, we consider genetic effects by deriving cardiomyocytes from two diabetic patients with variable disease progression. The cardiomyopathic phenotype is recapitulated in the patient-specific cells basally, with a severity dependent on their original clinical status. These models are incorporated into successive levels of a screening platform, identifying drugs that preserve cardiomyocyte phenotype in vitro during diabetic stress. In this work, we present a patient-specific induced pluripotent stem cell (iPSC) model of a complex metabolic condition, showing the power of this technique for discovery and testing of therapeutic strategies for a disease with ever-increasing clinical significance. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  11. Translation of Genotype to Phenotype by a Hierarchy of Cell Subsystems.

    Science.gov (United States)

    Yu, Michael Ku; Kramer, Michael; Dutkowski, Janusz; Srivas, Rohith; Licon, Katherine; Kreisberg, Jason; Ng, Cherie T; Krogan, Nevan; Sharan, Roded; Ideker, Trey

    2016-02-24

    Accurately translating genotype to phenotype requires accounting for the functional impact of genetic variation at many biological scales. Here we present a strategy for genotype-phenotype reasoning based on existing knowledge of cellular subsystems. These subsystems and their hierarchical organization are defined by the Gene Ontology or a complementary ontology inferred directly from previously published datasets. Guided by the ontology's hierarchical structure, we organize genotype data into an "ontotype," that is, a hierarchy of perturbations representing the effects of genetic variation at multiple cellular scales. The ontotype is then interpreted using logical rules generated by machine learning to predict phenotype. This approach substantially outperforms previous, non-hierarchical methods for translating yeast genotype to cell growth phenotype, and it accurately predicts the growth outcomes of two new screens of 2,503 double gene knockouts impacting DNA repair or nuclear lumen. Ontotypes also generalize to larger knockout combinations, setting the stage for interpreting the complex genetics of disease.
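    The core of the "ontotype" idea — propagating gene-level perturbations up an ontology hierarchy to obtain per-subsystem perturbation counts — can be sketched as follows. The toy ontology, gene names, and function name are illustrative assumptions, not taken from the paper:

```python
def ontotype(knockouts, gene2terms, parents):
    """Map a genotype (set of knocked-out genes) to an 'ontotype':
    the number of perturbed genes per ontology term, propagated to
    all ancestor terms.

    gene2terms: gene -> set of terms directly annotating it.
    parents:    term -> set of parent terms (DAG edges toward the root).
    """
    counts = {}
    for gene in knockouts:
        # Climb from direct annotations to the root, visiting each term once.
        seen = set()
        stack = list(gene2terms.get(gene, ()))
        while stack:
            term = stack.pop()
            if term in seen:
                continue
            seen.add(term)
            stack.extend(parents.get(term, ()))
        for term in seen:
            counts[term] = counts.get(term, 0) + 1
    return counts
```

The resulting term-count vector is what a downstream learner (e.g., the paper's logical rules) consumes instead of the raw genotype.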

  12. Drug discovery for schistosomiasis: hit and lead compounds identified in a library of known drugs by medium-throughput phenotypic screening.

    Directory of Open Access Journals (Sweden)

    Maha-Hamadien Abdulla

    2009-07-01

    Praziquantel (PZQ) is the only widely available drug to treat schistosomiasis. Given the potential for drug resistance, it is prudent to search for novel therapeutics. Identification of anti-schistosomal chemicals has traditionally relied on phenotypic (whole-organism) screening with adult worms in vitro and/or animal models of disease, tools that limit automation and throughput with modern microtiter-plate-formatted compound libraries. A partially automated, three-component phenotypic screen workflow is presented that utilizes at its apex the schistosomular stage of the parasite, adapted to a 96-well plate format with a throughput of 640 compounds per month. Hits that arise are subsequently screened in vitro against adult parasites and finally for efficacy in a murine model of disease. Two GO/NO GO criteria filters in the workflow prioritize hit compounds for tests in the animal disease model in accordance with a target drug profile that demands short-course oral therapy. The screen workflow was inaugurated with 2,160 chemically diverse natural and synthetic compounds, of which 821 are drugs already approved for human use. This affords a unique starting point to 'reposition' (re-profile) drugs as anti-schistosomals, with potential savings in development timelines and costs. Multiple and dynamic phenotypes could be categorized for schistosomula and adults in vitro, and a diverse set of 'hit' drugs and chemistries were identified, including anti-schistosomals, anthelmintics, antibiotics, and neuromodulators. Of those hits prioritized for tests in the animal disease model, a number of leads were identified, one of which compares reasonably well with PZQ in significantly decreasing worm and egg burdens, and disease-associated pathology. Data arising from the three components of the screen are posted online as a community resource. To accelerate the identification of novel anti-schistosomals, we have developed a partially automated screen workflow that

  13. Ethics of large-scale change

    OpenAIRE

    Arler, Finn

    2006-01-01

    The subject of this paper is long-term, large-scale changes in human society. Some very significant examples of large-scale change are presented: human population growth, human appropriation of land and primary production, the human use of fossil fuels, and climate change. The question is posed of which kind of attitude is appropriate, from an ethical point of view, when dealing with large-scale changes like these. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, th...

  14. Discovery of candidate disease genes in ENU-induced mouse mutants by large-scale sequencing, including a splice-site mutation in nucleoredoxin.

    Directory of Open Access Journals (Sweden)

    Melissa K Boles

    2009-12-01

    An accurate and precisely annotated genome assembly is a fundamental requirement for functional genomic analysis. Here, the complete DNA sequence and gene annotation of mouse Chromosome 11 were used to test the efficacy of large-scale sequencing for mutation identification. We re-sequenced the 14,000 annotated exons and boundaries from over 900 genes in 41 recessive mutant mouse lines that were isolated in an N-ethyl-N-nitrosourea (ENU) mutation screen targeted to mouse Chromosome 11. Fifty-nine sequence variants were identified in 55 genes from 31 mutant lines. 39% of the lesions lie in coding sequences and create primarily missense mutations. The other 61% lie in noncoding regions, many of them in highly conserved sequences. A lesion in the perinatal lethal line l11Jus13 alters a consensus splice site of nucleoredoxin (Nxn), inserting 10 amino acids into the resulting protein. We conclude that point mutations can be accurately and sensitively recovered by large-scale sequencing, and that conserved noncoding regions should be included for disease mutation identification. Only seven of the candidate genes we report have been previously targeted by mutation in mice or rats, showing that despite ongoing efforts to functionally annotate genes in the mammalian genome, an enormous gap remains between phenotype and function. Our data show that the classical positional-mapping approach to disease mutation identification can be extended to large target regions using high-throughput sequencing.

  15. Large-scale assessment of olfactory preferences and learning in Drosophila melanogaster: behavioral and genetic components

    Directory of Open Access Journals (Sweden)

    Elisabetta Versace

    2015-09-01

    In the Evolve and Resequence (E&R) method, experimental evolution and genomics are combined to investigate evolutionary dynamics and the genotype-phenotype link. Like other genomic approaches, this method requires many replicates with large population sizes, which imposes severe restrictions on the analysis of behavioral phenotypes. Aiming to use E&R for investigating the evolution of behavior in Drosophila, we have developed a simple and effective method to assess spontaneous olfactory preferences and learning in large samples of fruit flies using a T-maze. We tested this procedure on (a) a large wild-caught population and (b) 11 isofemale lines of Drosophila melanogaster. Compared to previous methods, this procedure reduces environmental noise and allows for the analysis of large population samples. Consistent with previous results, we show that flies have a preference for orange vs. apple odor. With our procedure, wild-derived flies exhibit olfactory learning in the absence of previous laboratory selection. Furthermore, we find genetic differences in olfactory learning with relatively high heritability. We propose this large-scale method as an effective tool for E&R and genome-wide association studies on olfactory preferences and learning.
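    The abstract does not spell out how T-maze counts are scored. A sketch assuming the common convention of a preference index normalized to [-1, 1]; both function names and the learning-score convention are assumptions, not taken from the paper:

```python
def preference_index(n_target, n_other):
    """T-maze preference index: +1 if all flies chose the target odor,
    -1 if all chose the alternative, 0 for no preference.
    (A common convention, assumed here.)"""
    return (n_target - n_other) / (n_target + n_other)

def learning_index(pi_trained, pi_reciprocal):
    """Learning score as half the difference between the preference
    indices of two reciprocally trained groups (again an assumption)."""
    return (pi_trained - pi_reciprocal) / 2
```

Computing such indices per line, then partitioning variance between and within lines, is what allows the heritability estimate mentioned above.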

  16. Mining Genome-Scale Growth Phenotype Data through Constant-Column Biclustering

    KAUST Repository

    Alzahrani, Majed A.

    2017-01-01

    for mining in growth phenotype data. Here, we propose Gracob, a novel, efficient graph-based method that casts and solves the constant-column biclustering problem as a maximal clique finding problem in a multipartite graph. We compared Gracob with a large
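    The constant-column biclustering criterion that Gracob solves can be illustrated by brute force. Gracob itself casts the problem as maximal clique finding in a multipartite graph; this sketch only encodes the definition on a small discretized matrix and does not scale:

```python
from itertools import combinations

def constant_column_biclusters(matrix, min_rows=2, min_cols=2):
    """Enumerate (rows, cols) pairs where every selected column is
    constant across the selected rows -- the constant-column bicluster
    definition, checked exhaustively over column subsets.

    matrix: list of rows of discretized growth-phenotype values.
    """
    n_cols = len(matrix[0])
    found = []
    for size in range(min_cols, n_cols + 1):
        for cols in combinations(range(n_cols), size):
            # Rows sharing the same value tuple over `cols` form a bicluster.
            groups = {}
            for r, row in enumerate(matrix):
                key = tuple(row[c] for c in cols)
                groups.setdefault(key, []).append(r)
            for rows in groups.values():
                if len(rows) >= min_rows:
                    found.append((tuple(rows), cols))
    return found
```

The exhaustive version is exponential in the number of columns, which is why an efficient graph-based formulation such as Gracob's is needed at genome scale.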

  17. Image-based phenotyping for non-destructive screening of different salinity tolerance traits in rice

    KAUST Repository

    Hairmansis, Aris

    2014-08-14

    Background Soil salinity is an abiotic stress widespread in rice-producing areas, limiting both plant growth and yield. The development of salt-tolerant rice requires efficient and high-throughput screening techniques to identify promising lines for salt-affected areas. Advances made in image-based phenotyping techniques provide an opportunity to use non-destructive imaging to screen for salinity tolerance traits in a wide range of germplasm in a reliable, quantitative and efficient way. However, the application of image-based phenotyping in the development of salt-tolerant rice remains limited. Results A non-destructive image-based phenotyping protocol to assess salinity tolerance traits of two rice cultivars (IR64 and Fatmawati) was established in this study. The response of rice to different levels of salt stress was quantified over time based on total shoot area and senescent shoot area, calculated from visible red-green-blue (RGB) and fluorescence images. The response of rice to salt stress (50, 75 and 100 mM NaCl) could be clearly distinguished from the control, as indicated by the reduced increase of shoot area. The salt concentrations used had only a small effect on the growth of rice during the initial phase of stress, the shoot-Na+-accumulation-independent phase termed the 'osmotic stress' phase. However, after 20 d of treatment, the shoot area of salt-stressed plants was reduced compared with non-stressed plants. This was accompanied by a significant increase in the concentration of Na+ in the shoot. Variation in the senescent area of the cultivars IR64 and Fatmawati in response to a high concentration of Na+ in the shoot indicates variation in tissue tolerance mechanisms between the cultivars. Conclusions Image analysis has the potential to be used for high-throughput screening procedures in the development of salt-tolerant rice. The ability of image analysis to discriminate between the different aspects of salt stress (shoot ion
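    A minimal illustration of how projected shoot area can be pulled from an RGB image by color thresholding. The green-dominance rule and margin are assumptions for the sketch, not the study's calibrated pipeline:

```python
def shoot_area(pixels, green_margin=20):
    """Count 'plant' pixels as a proxy for projected shoot area.

    pixels: iterable of (R, G, B) tuples from a segmented plant image.
    A pixel is counted when its green channel dominates both red and
    blue by `green_margin` (an assumed, uncalibrated threshold).
    """
    return sum(1 for r, g, b in pixels
               if g > r + green_margin and g > b + green_margin)

def relative_growth(area_t0, area_t1):
    """Relative increase in shoot area between two imaging time points,
    the quantity compared between stressed and control plants."""
    return (area_t1 - area_t0) / area_t0
```

Tracking `relative_growth` per plant over the treatment period is what lets the osmotic-phase and ionic-phase responses be separated non-destructively.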

  18. HCS-Neurons: identifying phenotypic changes in multi-neuron images upon drug treatments of high-content screening.

    Science.gov (United States)

    Charoenkwan, Phasit; Hwang, Eric; Cutler, Robert W; Lee, Hua-Chin; Ko, Li-Wei; Huang, Hui-Ling; Ho, Shinn-Ying

    2013-01-01

    High-content screening (HCS) has become a powerful tool for drug discovery. However, the discovery of drugs targeting neurons is still hampered by the inability to accurately identify and quantify the phenotypic changes of multiple neurons in a single image (termed a multi-neuron image) of a high-content screen. Therefore, it is desirable to develop an automated image analysis method for analyzing multi-neuron images. We propose an automated analysis method with novel descriptors of neuromorphology features for analyzing HCS-based multi-neuron images, called HCS-neurons. To observe multiple phenotypic changes of neurons, we propose two kinds of descriptors: a neuron feature descriptor (NFD) of 13 neuromorphology features (e.g., neurite length) and generic feature descriptors (GFDs; e.g., Haralick texture). HCS-neurons can 1) automatically extract all quantitative phenotype features in both NFD and GFDs, 2) identify statistically significant phenotypic changes upon drug treatments using ANOVA and regression analysis, and 3) generate an accurate classifier to group neurons treated with different drug concentrations using a support vector machine and an intelligent feature selection method. To evaluate HCS-neurons, we treated P19 neurons with nocodazole (a microtubule-depolymerizing drug shown to impair neurite development) at six concentrations ranging from 0 to 1000 ng/mL. The experimental results show that all 13 NFD features differed significantly across levels of nocodazole drug concentration (NDC), and the phenotypic changes of neurites were consistent with the known effect of nocodazole in promoting neurite retraction. Three identified features, total neurite length, average neurite length, and average neurite area, were able to achieve an independent test accuracy of 90.28% for the six-dosage classification problem. This NFD module and neuron image datasets are provided as a freely downloadable
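    The per-feature significance test in step 2) reduces to a one-way ANOVA F statistic computed across concentration groups. A stdlib-only sketch with toy values (the function name and data are illustrative):

```python
def anova_f(groups):
    """One-way ANOVA F statistic for one neuromorphology feature
    measured across several drug-concentration groups.

    groups: list of lists of per-neuron feature values, one list per
    concentration. F = (between-group MS) / (within-group MS).
    """
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ssb = sum(len(g) * ((sum(g) / len(g)) - grand) ** 2 for g in groups)
    ssw = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ssb / (k - 1)) / (ssw / (n - k))
```

A large F (relative to the F distribution with k-1 and n-k degrees of freedom) flags a feature, such as total neurite length, as responding to the drug.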

  19. Phenotype Instance Verification and Evaluation Tool (PIVET): A Scaled Phenotype Evidence Generation Framework Using Web-Based Medical Literature

    Science.gov (United States)

    Ke, Junyuan; Ho, Joyce C; Ghosh, Joydeep; Wallace, Byron C

    2018-01-01

    which PheKnow-Cloud was originally developed, but PIVET’s analysis is an order of magnitude faster than that of PheKnow-Cloud. Not only is PIVET much faster, it can be scaled to a larger corpus and still retain speed. We evaluated multiple classification models on top of the PIVET framework and found ridge regression to perform best, realizing an average F1 score of 0.91 when predicting clinically relevant phenotypes. Conclusions Our study shows that PIVET improves on the most notable existing computational tool for phenotype validation in terms of speed and automation and is comparable in terms of accuracy. PMID:29728351

  20. Non-destructive screening method for radiation hardened performance of large scale integration

    International Nuclear Information System (INIS)

    Zhou Dong; Xi Shanbin; Guo Qi; Ren Diyuan; Li Yudong; Sun Jing; Wen Lin

    2013-01-01

    The space radiation environment can induce radiation damage in electronic devices. As the performance of commercial devices is generally superior to that of radiation-hardened devices, it is desirable to screen commercial devices for those with good radiation-hardened performance; applying such devices in space systems could improve the reliability of the systems. Combining mathematical regression analysis with different physical stress experiments, we investigated a non-destructive screening method for the radiation-hardened performance of integrated circuits. The relationship between the change of typical parameters and the radiation performance of the circuit was discussed, and the irradiation-sensitive parameters were confirmed. A multiple linear regression equation for predicting radiation performance was established. Finally, the regression equations under stress conditions were verified by actual irradiation. The results show that the reliability and accuracy of the non-destructive screening method can be improved by combining mathematical regression analysis with practical stress experiments. (authors)
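    The multiple-regression step can be sketched as ordinary least squares via the normal equations, fitting predicted radiation response from pre-irradiation parameter measurements. The solver below is generic; the predictor layout and all data are hypothetical, not the paper's:

```python
def fit_linear(X, y):
    """Ordinary least squares via the normal equations (X^T X) b = (X^T y),
    solved by Gaussian elimination with partial pivoting.

    X: rows of predictor values (include a leading 1 for the intercept).
    y: observed responses. Returns the coefficient vector b.
    """
    n, p = len(X), len(X[0])
    A = [[sum(X[k][i] * X[k][j] for k in range(n)) for j in range(p)]
         for i in range(p)]
    b = [sum(X[k][i] * y[k] for k in range(n)) for i in range(p)]
    for i in range(p):
        piv = max(range(i, p), key=lambda r: abs(A[r][i]))
        A[i], A[piv] = A[piv], A[i]
        b[i], b[piv] = b[piv], b[i]
        for r in range(i + 1, p):
            f = A[r][i] / A[i][i]
            for c in range(i, p):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    coef = [0.0] * p
    for i in range(p - 1, -1, -1):
        coef[i] = (b[i] - sum(A[i][j] * coef[j]
                              for j in range(i + 1, p))) / A[i][i]
    return coef

def predict(coef, x):
    """Predicted radiation response for one device's predictor row."""
    return sum(c * v for c, v in zip(coef, x))
```

Devices whose predicted response clears a chosen threshold would pass the non-destructive screen without being irradiated themselves.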

  1. Political consultation and large-scale research

    International Nuclear Information System (INIS)

    Bechmann, G.; Folkers, H.

    1977-01-01

    Large-scale research and policy consulting occupy an intermediary position between sociological sub-systems. While large-scale research coordinates science, policy, and production, policy consulting coordinates science, policy, and the political sphere. In this position, large-scale research and policy consulting lack the institutional guarantees and rational background that are characteristic of their sociological environment. Large-scale research can neither deal with the production of innovative goods under consideration of profitability, nor can it hope for full recognition by the basis-oriented scientific community. Policy consulting has neither the political system's assigned competence to make decisions, nor can it be judged successfully by the critical standards of the established social sciences, at least as far as the present situation is concerned. This intermediary position of large-scale research and policy consulting supports, in three respects, the thesis that this is a new form of institutionalization of science: (1) external control, (2) the form of organization, (3) the theoretical conception of large-scale research and policy consulting. (orig.) [de

  2. A Phenotypic Screen for Functional Mutants of Human Adenosine Deaminase Acting on RNA 1.

    Science.gov (United States)

    Wang, Yuru; Havel, Jocelyn; Beal, Peter A

    2015-11-20

    Adenosine deaminases acting on RNA (ADARs) are RNA-editing enzymes responsible for the conversion of adenosine to inosine at specific locations in cellular RNAs. ADAR1 and ADAR2 are two members of the family that have been shown to be catalytically active. Earlier, we reported a phenotypic screen for the study of human ADAR2 using budding yeast S. cerevisiae as the host system. While this screen has been successfully applied to the study of ADAR2, it failed with ADAR1. Here, we report a new reporter that uses a novel editing substrate and is suitable for the study of ADAR1. We screened plasmid libraries with randomized codons for two important residues in human ADAR1 (G1007 and E1008). The screening results combined with in vitro deamination assays led to the identification of mutants that are more active than the wild type protein. Furthermore, a screen of the ADAR1 E1008X library with a reporter construct bearing an A•G mismatch at the editing site suggests one role for the residue at position 1008 is to sense the identity of the base pairing partner for the editing site adenosine. This work has provided a starting point for future in vitro evolution studies of ADAR1 and led to new insight into ADAR's editing site selectivity.

  3. Large-scale multimedia modeling applications

    International Nuclear Information System (INIS)

    Droppo, J.G. Jr.; Buck, J.W.; Whelan, G.; Strenge, D.L.; Castleton, K.J.; Gelston, G.M.

    1995-08-01

    Over the past decade, the US Department of Energy (DOE) and other agencies have faced increasing scrutiny for a wide range of environmental issues related to past and current practices. A number of large-scale applications have been undertaken that required analysis of large numbers of potential environmental issues over a wide range of environmental conditions and contaminants. Several of these applications, referred to here as large-scale applications, have addressed long-term public health risks using a holistic approach for assessing impacts from potential waterborne and airborne transport pathways. Multimedia models such as the Multimedia Environmental Pollutant Assessment System (MEPAS) were designed for use in such applications. MEPAS integrates impact computations for radioactive and hazardous contaminants across major exposure routes via air, surface water, ground water, and overland-flow transport. A number of large-scale applications of MEPAS have been conducted to assess various endpoints for environmental and human health impacts. These applications are described in terms of lessons learned in developing an effective approach to large-scale applications.

  4. Phenotype Instance Verification and Evaluation Tool (PIVET): A Scaled Phenotype Evidence Generation Framework Using Web-Based Medical Literature.

    Science.gov (United States)

    Henderson, Jette; Ke, Junyuan; Ho, Joyce C; Ghosh, Joydeep; Wallace, Byron C

    2018-05-04

    PIVET's analysis is an order of magnitude faster than that of PheKnow-Cloud. Not only is PIVET much faster, it can be scaled to a larger corpus and still retain speed. We evaluated multiple classification models on top of the PIVET framework and found ridge regression to perform best, realizing an average F1 score of 0.91 when predicting clinically relevant phenotypes. Our study shows that PIVET improves on the most notable existing computational tool for phenotype validation in terms of speed and automation and is comparable in terms of accuracy. ©Jette Henderson, Junyuan Ke, Joyce C Ho, Joydeep Ghosh, Byron C Wallace. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 04.05.2018.

  5. Large-scale biophysical evaluation of protein PEGylation effects

    DEFF Research Database (Denmark)

    Vernet, Erik; Popa, Gina; Pozdnyakova, Irina

    2016-01-01

    PEGylation is the most widely used method to chemically modify protein biopharmaceuticals, but surprisingly limited public data is available on the biophysical effects of protein PEGylation. Here we report the first large-scale study, with site-specific mono-PEGylation of 15 different proteins...... of PEGylation on the thermal stability of a protein based on data generated by circular dichroism (CD), differential scanning calorimetry (DSC), or differential scanning fluorimetry (DSF). In addition, DSF was validated as a fast and inexpensive screening method for thermal unfolding studies of PEGylated...... proteins. Multivariate data analysis revealed clear trends in biophysical properties upon PEGylation for a subset of proteins, although no universal trends were found. Taken together, these findings are important in the consideration of biophysical methods and evaluation of second...
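    Extracting a melting temperature from a DSF melt curve, the screening readout validated above, is often approximated as the temperature of the steepest fluorescence increase. A simplified sketch (real pipelines typically fit a sigmoid; the data here are toy values):

```python
def melting_temperature(temps, fluorescence):
    """Estimate Tm from a DSF melt curve as the midpoint of the interval
    with the steepest fluorescence rise (max dF/dT between successive
    readings). Assumes temps are sorted and roughly evenly spaced.
    """
    best_i = max(range(len(temps) - 1),
                 key=lambda i: fluorescence[i + 1] - fluorescence[i])
    return (temps[best_i] + temps[best_i + 1]) / 2
```

Comparing Tm for a protein before and after PEGylation is the kind of thermal-stability shift the study quantifies across its 15 proteins.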

  6. Mutation Spectrum in the Large GTPase Dynamin 2, and Genotype–Phenotype Correlation in Autosomal Dominant Centronuclear Myopathy

    Science.gov (United States)

    Böhm, Johann; Biancalana, Valérie; DeChene, Elizabeth T.; Bitoun, Marc; Pierson, Christopher R.; Schaefer, Elise; Karasoy, Hatice; Dempsey, Melissa A.; Klein, Fabrice; Dondaine, Nicolas; Kretz, Christine; Haumesser, Nicolas; Poirson, Claire; Toussaint, Anne; Greenleaf, Rebecca S.; Barger, Melissa A.; Mahoney, Lane J.; Kang, Peter B.; Zanoteli, Edmar; Vissing, John; Witting, Nanna; Echaniz-Laguna, Andoni; Wallgren-Pettersson, Carina; Dowling, James; Merlini, Luciano; Oldfors, Anders; Ousager, Lilian Bomme; Melki, Judith; Krause, Amanda; Jern, Christina; Oliveira, Acary S. B.; Petit, Florence; Jacquette, Aurélia; Chaussenot, Annabelle; Mowat, David; Leheup, Bruno; Cristofano, Michele; Aldea, Juan José Poza; Michel, Fabrice; Furby, Alain; Llona, Jose E. Barcena; Van Coster, Rudy; Bertini, Enrico; Urtizberea, Jon Andoni; Drouin-Garraud, Valérie; Béroud, Christophe; Prudhon, Bernard; Bedford, Melanie; Mathews, Katherine; Erby, Lori A. H.; Smith, Stephen A.; Roggenbuck, Jennifer; Crowe, Carol A.; Spitale, Allison Brennan; Johal, Sheila C.; Amato, Anthony A.; Demmer, Laurie A.; Jonas, Jessica; Darras, Basil T.; Bird, Thomas D.; Laurino, Mercy; Welt, Selman I.; Trotter, Cynthia; Guicheney, Pascale; Das, Soma; Mandel, Jean-Louis; Beggs, Alan H.; Laporte, Jocelyn

    2012-01-01

    Centronuclear myopathy (CNM) is a genetically heterogeneous disorder associated with general skeletal muscle weakness, type I fiber predominance and atrophy, and abnormally centralized nuclei. Autosomal dominant CNM is due to mutations in the large GTPase dynamin 2 (DNM2), a mechanochemical enzyme regulating cytoskeleton and membrane trafficking in cells. To date, 40 families with CNM-related DNM2 mutations have been described, and here we report 60 additional families encompassing a broad genotypic and phenotypic spectrum. In total, 18 different mutations are reported in 100 families and our cohort harbors nine known and four new mutations, including the first splice-site mutation. Genotype–phenotype correlation hypotheses are drawn from the published and new data, and allow an efficient screening strategy for molecular diagnosis. In addition to CNM, dissimilar DNM2 mutations are associated with Charcot–Marie–Tooth (CMT) peripheral neuropathy (CMTD1B and CMT2M), suggesting a tissue-specific impact of the mutations. In this study, we discuss the possible clinical overlap of CNM and CMT, and the biological significance of the respective mutations based on the known functions of dynamin 2 and its protein structure. Defects in membrane trafficking due to DNM2 mutations potentially represent a common pathological mechanism in CNM and CMT. PMID:22396310

  7. Sensitivity of 2-[18F]fluoro-2-deoxyglucose positron emission tomography for advanced colorectal neoplasms: a large-scale analysis of 7505 asymptomatic screening individuals.

    Science.gov (United States)

    Sekiguchi, Masau; Kakugawa, Yasuo; Terauchi, Takashi; Matsumoto, Minori; Saito, Hiroshi; Muramatsu, Yukio; Saito, Yutaka; Matsuda, Takahisa

    2016-12-01

    The sensitivity of 2-[18F]fluoro-2-deoxyglucose positron emission tomography (FDG-PET) for advanced colorectal neoplasms among healthy subjects is not yet fully understood. The present study aimed to clarify the sensitivity by analyzing large-scale data from an asymptomatic screening population. A total of 7505 asymptomatic screenees who underwent both FDG-PET and colonoscopy at our Cancer Screening Division between February 2004 and March 2013 were analyzed. FDG-PET and colonoscopy were performed on consecutive days, and each examination was interpreted in a blinded fashion. The results of the two examinations were compared for each of the six colonic segments, with those from colonoscopy being set as the reference. The relationships between the sensitivity of FDG-PET and clinicopathological features of advanced neoplasms were also evaluated. Two hundred ninety-one advanced neoplasms, including 24 invasive cancers, were detected in 262 individuals. Thirteen advanced neoplasms (advanced adenomas) were excluded from the analysis because of the coexistence of lesions in the same colonic segment. The sensitivity, specificity, and positive and negative predictive values of FDG-PET for advanced neoplasms were 16.9% [95% confidence interval (CI) 12.7-21.8%], 99.3% (95% CI 99.2-99.4%), 13.5% (95% CI 10.1-17.6%), and 99.4% (95% CI 99.3-99.5%), respectively. The sensitivity was lower for lesions with less advanced histological grade, smaller size, and flat-type morphology, and for those located in the proximal colon. FDG-PET is considered difficult to use as a primary screening tool in population-based colorectal cancer screening because of its low sensitivity for advanced neoplasms. Even when it is used in opportunistic cancer screening, the limit of its sensitivity should be considered.
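
    The four accuracy measures reported above derive from a 2×2 table of FDG-PET calls against the colonoscopy reference. A minimal sketch; the counts below are illustrative stand-ins chosen only to roughly match the reported sensitivity, and the Wilson score interval is an assumption since the abstract does not state the CI method:

```python
import math

def wilson_ci(k, n, z=1.96):
    """95% Wilson score interval for a proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, NPV, each with a Wilson 95% CI."""
    return {
        "sensitivity": (tp / (tp + fn), wilson_ci(tp, tp + fn)),
        "specificity": (tn / (tn + fp), wilson_ci(tn, tn + fp)),
        "ppv": (tp / (tp + fp), wilson_ci(tp, tp + fp)),
        "npv": (tn / (tn + fn), wilson_ci(tn, tn + fn)),
    }

# Hypothetical counts for illustration only (not the study's raw data)
m = diagnostic_metrics(tp=47, fp=300, fn=231, tn=44000)
print({k: round(v[0], 3) for k, v in m.items()})
```

    With 47 true positives out of 278 lesions, the sensitivity comes out near the reported 16.9%; real per-segment counts would be needed to reproduce the published intervals.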

  8. Decentralized Large-Scale Power Balancing

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus; Jørgensen, John Bagterp; Poulsen, Niels Kjølstad

    2013-01-01

    problem is formulated as a centralized large-scale optimization problem but is then decomposed into smaller subproblems that are solved locally by each unit connected to an aggregator. For large-scale systems the method is faster than solving the full problem and can be distributed to include an arbitrary...
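
    The decomposition idea (a central balance problem split into unit-local subproblems coordinated by an aggregator) can be illustrated with a toy price-update loop; the quadratic costs, bounds, and all numbers below are invented, and this is not the paper's formulation:

```python
def local_response(price, cost, pmax):
    """Each unit independently maximizes price*p - cost*p**2 over [0, pmax]."""
    return min(max(price / (2 * cost), 0.0), pmax)

def balance(costs, pmaxs, demand, steps=5000, lr=0.01):
    """Dual decomposition sketch: a single price signal coordinates the
    local subproblems until total production meets demand."""
    price = 0.0
    for _ in range(steps):
        total = sum(local_response(price, c, m) for c, m in zip(costs, pmaxs))
        price += lr * (demand - total)   # raise price if underproducing
    return price, [local_response(price, c, m) for c, m in zip(costs, pmaxs)]

price, dispatch = balance(costs=[1.0, 2.0, 4.0], pmaxs=[5.0, 5.0, 5.0], demand=7.0)
print(round(sum(dispatch), 3))
```

    Cheaper units end up dispatched harder, and no unit ever needs to see another unit's cost function, which is the point of the decomposition.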

  9. A large-scale study of epilepsy in Ecuador: methodological aspects.

    Science.gov (United States)

    Placencia, M; Suarez, J; Crespo, F; Sander, J W; Shorvon, S D; Ellison, R H; Cascante, S M

    1992-01-01

    The methodology is presented of a large-scale study of epilepsy carried out in a highland area of northern Ecuador, South America, covering a population of 72,121 people. The study was carried out in two phases. The first, cross-sectional phase consisted of a house-to-house survey of all persons in this population, screening for epileptic seizures using a specially designed questionnaire. Possible cases identified in screening were assessed in a cascade diagnostic procedure applied by general doctors and neurologists. Its objectives were: to establish a comprehensive epidemiological profile of epileptic seizures; to describe the clinical phenomenology of this condition in the community; to validate methods for diagnosis and classification of epileptic seizures by a non-specialised team; and to ascertain the community's knowledge, attitudes and practices regarding epilepsy. A sample was selected in this phase in order to study the social aspects of epilepsy in this community. The second, longitudinal phase assessed the ability of non-specialist care to treat epilepsy. It consisted of a prospective clinical trial of antiepileptic therapy in untreated patients using two standard anti-epileptic drugs. Patients were followed for 12 months by a multidisciplinary team consisting of a primary health worker, rural doctor, neurologist, anthropologist, and psychologist. Standardised, reproducible instruments and methods were used. This study was carried out through co-operation between the medical profession, political agencies and the pharmaceutical industry, at an international level. We consider this a model for further large-scale studies of this type.

  10. Hydrocarbon phenotyping of algal species using pyrolysis-gas chromatography mass spectrometry

    Directory of Open Access Journals (Sweden)

    Kothari Shankar L

    2010-05-01

    Full Text Available Abstract Background Biofuels derived from algae biomass and algae lipids might reduce dependence on fossil fuels. Existing analytical techniques need to facilitate rapid characterization of algal species by phenotyping hydrocarbon-related constituents. Results In this study, we compared the hydrocarbon-rich algae Botryococcus braunii against the photoautotrophic model algae Chlamydomonas reinhardtii using pyrolysis-gas chromatography quadrupole mass spectrometry (pyGC-MS). Sequences of up to 48 dried samples can be analyzed using pyGC-MS in an automated manner without any sample preparation. Chromatograms of 30-min run times are sufficient to profile pyrolysis products from C8 to C40 carbon chain length. The freely available software tools AMDIS and SpectConnect enable straightforward data processing. In Botryococcus samples, we identified fatty acids, vitamins, sterols, fatty acid esters and several long-chain hydrocarbons. The algae species C. reinhardtii, B. braunii race A and B. braunii race B were readily discriminated using their hydrocarbon phenotypes. Substructure annotation and spectral clustering yielded network graphs of similar components for visual overviews of abundant and minor constituents. Conclusion Pyrolysis-GC-MS facilitates large-scale screening of hydrocarbon phenotypes for comparisons of strain differences in algae or the impact of altered growth and nutrient conditions.
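
    Discriminating species by hydrocarbon phenotype amounts to comparing pyrolysis-product abundance profiles. A minimal sketch using cosine similarity; the product categories and abundance values below are invented for illustration and are not data from the study:

```python
import math

def cosine(a, b):
    """Cosine similarity between two abundance vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Hypothetical abundance vectors over a few pyrolysis-product categories
# (illustrative numbers only, not measured values).
profiles = {
    "B. braunii race A": [9.0, 0.5, 1.0, 2.0],
    "B. braunii race B": [1.0, 8.5, 1.0, 2.0],
    "C. reinhardtii":    [0.2, 0.1, 3.0, 6.0],
}

def classify(sample, refs):
    """Assign a sample to the reference profile with the highest similarity."""
    return max(refs, key=lambda name: cosine(sample, refs[name]))

print(classify([8.1, 0.7, 1.2, 1.8], profiles))  # B. braunii race A
```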

  11. Automating large-scale reactor systems

    International Nuclear Information System (INIS)

    Kisner, R.A.

    1985-01-01

    This paper conveys a philosophy for developing automated large-scale control systems that behave in an integrated, intelligent, flexible manner. Methods for operating large-scale systems under varying degrees of equipment degradation are discussed, and a design approach that separates the effort into phases is suggested. 5 refs., 1 fig

  12. The use of saliva as a practical and feasible alternative to urine in large-scale screening for congenital cytomegalovirus infection increases inclusion and detection rates

    Directory of Open Access Journals (Sweden)

    Emanuelle Santos de Carvalho Cardoso

    2015-04-01

    Full Text Available INTRODUCTION: Although urine is considered the gold-standard material for the detection of congenital cytomegalovirus (CMV) infection, it can be difficult to obtain in newborns. The aim of this study was to compare the efficiency of detection of congenital CMV infection in saliva and urine samples. METHODS: One thousand newborns were included in the study. Congenital cytomegalovirus deoxyribonucleic acid (DNA) was detected by polymerase chain reaction (PCR). RESULTS: Saliva samples were obtained from all the newborns, whereas urine collection was successful in only 333 cases. There was no statistically significant difference between the use of saliva alone or saliva and urine collected simultaneously for the detection of CMV infection. CONCLUSIONS: Saliva samples can be used in large-scale neonatal screening for CMV infection.
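
    The abstract does not name the statistical test used; for two detection methods applied to the same newborns, McNemar's exact test on the discordant pairs is the conventional choice. A sketch with hypothetical discordant counts:

```python
from math import comb

def mcnemar_exact(b, c):
    """Exact two-sided McNemar p-value from discordant pairs:
    b = positive by method 1 only, c = positive by method 2 only."""
    n = b + c
    if n == 0:
        return 1.0
    k = min(b, c)
    # Two-sided exact binomial test with p = 0.5 under the null
    p = sum(comb(n, i) for i in range(k + 1)) / 2**n
    return min(1.0, 2 * p)

# Hypothetical discordant counts for illustration only
print(mcnemar_exact(b=2, c=4))  # 0.6875, i.e. no significant difference
```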

  13. Large scale meta-analysis of fragment-based screening campaigns: privileged fragments and complementary technologies.

    Science.gov (United States)

    Kutchukian, Peter S; Wassermann, Anne Mai; Lindvall, Mika K; Wright, S Kirk; Ottl, Johannes; Jacob, Jaison; Scheufler, Clemens; Marzinzik, Andreas; Brooijmans, Natasja; Glick, Meir

    2015-06-01

    A first step in fragment-based drug discovery (FBDD) often entails a fragment-based screen (FBS) to identify fragment "hits." However, the integration of conflicting results from orthogonal screens remains a challenge. Here we present a meta-analysis of 35 fragment-based campaigns at Novartis, which employed a generic 1400-fragment library against diverse target families using various biophysical and biochemical techniques. By statistically interrogating the multidimensional FBS data, we sought to investigate three questions: (1) What makes a fragment amenable for FBS? (2) How do hits from different fragment screening technologies and target classes compare with each other? (3) What is the best way to pair FBS assay technologies? In doing so, we identified substructures that were privileged for specific target classes, as well as fragments that were privileged for authentic activity against many targets. We also revealed some of the discrepancies between technologies. Finally, we uncovered a simple rule of thumb in screening strategy: when choosing two technologies for a campaign, pairing a biochemical and biophysical screen tends to yield the greatest coverage of authentic hits. © 2014 Society for Laboratory Automation and Screening.
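
    The "pair a biochemical with a biophysical screen" rule of thumb can be phrased as a small coverage computation: choose the two technologies whose combined hit sets cover the most compounds. A sketch with invented hit sets (the technology names and compound IDs are hypothetical, not Novartis data):

```python
from itertools import combinations

# Hypothetical hit sets per assay technology (compound IDs), illustrative only
hits = {
    "NMR":     {1, 2, 3, 5, 8},
    "SPR":     {2, 3, 5, 13},
    "TSA":     {1, 2, 21, 34},
    "biochem": {5, 8, 13, 34, 55},
}

def best_pair(hit_sets):
    """Pick the two technologies whose union of hits covers most compounds."""
    return max(combinations(hit_sets, 2),
               key=lambda pair: len(hit_sets[pair[0]] | hit_sets[pair[1]]))

print(best_pair(hits))
```

    On these toy sets the winning pair combines a biophysical and the biochemical assay, mirroring the paper's observation that orthogonal technologies cover complementary hits.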

  14. High-Throughput Genetic Screens Identify a Large and Diverse Collection of New Sporulation Genes in Bacillus subtilis.

    Science.gov (United States)

    Meeske, Alexander J; Rodrigues, Christopher D A; Brady, Jacqueline; Lim, Hoong Chuin; Bernhardt, Thomas G; Rudner, David Z

    2016-01-01

    The differentiation of the bacterium Bacillus subtilis into a dormant spore is among the most well-characterized developmental pathways in biology. Classical genetic screens performed over the past half century identified scores of factors involved in every step of this morphological process. More recently, transcriptional profiling uncovered additional sporulation-induced genes required for successful spore development. Here, we used transposon-sequencing (Tn-seq) to assess whether there were any sporulation genes left to be discovered. Our screen identified 133 out of the 148 genes with known sporulation defects. Surprisingly, we discovered 24 additional genes that had not been previously implicated in spore formation. To investigate their functions, we used fluorescence microscopy to survey early, middle, and late stages of differentiation of null mutants from the B. subtilis ordered knockout collection. This analysis identified mutants that are delayed in the initiation of sporulation, defective in membrane remodeling, and impaired in spore maturation. Several mutants had novel sporulation phenotypes. We performed in-depth characterization of two new factors that participate in cell-cell signaling pathways during sporulation. One (SpoIIT) functions in the activation of σE in the mother cell; the other (SpoIIIL) is required for σG activity in the forespore. Our analysis also revealed that as many as 36 sporulation-induced genes with no previously reported mutant phenotypes are required for timely spore maturation. Finally, we discovered a large set of transposon insertions that trigger premature initiation of sporulation. Our results highlight the power of Tn-seq for the discovery of new genes and novel pathways in sporulation and, combined with the recently completed null mutant collection, open the door for similar screens in other, less well-characterized processes.

  16. Spanish-language screening scales: A critical review.

    Science.gov (United States)

    Torres-Castro, S; Mena-Montes, B; González-Ambrosio, G; Zubieta-Zavala, A; Torres-Carrillo, N M; Acosta-Castillo, G I; Espinel-Bermúdez, M C

    2018-05-09

    Dementia is a chronic, degenerative disease with a strong impact on families and health systems. The instruments currently in use for measuring cognitive impairment have different psychometric characteristics in terms of application time, cut-off point, reliability, and validity. The objective of this review is to describe the characteristics of the validated, Spanish-language versions of the Mini-Cog, Clock-Drawing Test, and Mini-Mental State Examination scales for cognitive impairment screening. We performed a three-stage literature search of articles published on Medline since 1953. We selected articles on validated, Spanish-language versions of the scales that included data on reliability, validity, sensitivity, and specificity. The 3 screening tools assessed in this article provide support for primary care professionals. Timely identification of mild cognitive impairment and dementia is crucial for the prognosis of these patients. Copyright © 2018 Sociedad Española de Neurología. Publicado por Elsevier España, S.L.U. All rights reserved.

  17. World Endometriosis Research Foundation Endometriosis Phenome and Biobanking Harmonisation Project: I. Surgical phenotype data collection in endometriosis research

    DEFF Research Database (Denmark)

    Becker, Christian M.; Laufer, Marc R.; Stratton, Pamela

    2014-01-01

    Objective: To standardize the recording of surgical phenotypic information on endometriosis and related sample collections obtained at laparoscopy, allowing large-scale collaborative research into the condition.

  18. The Software Reliability of Large Scale Integration Circuit and Very Large Scale Integration Circuit

    OpenAIRE

    Artem Ganiyev; Jan Vitasek

    2010-01-01

    This article describes an evaluation method for the faultless function of large-scale integration circuits (LSI) and very large-scale integration circuits (VLSI). It presents a comparative analysis of the factors that determine the faultlessness of integrated circuits, an analysis of existing methods, and a model for evaluating the faultless function of LSI and VLSI. The main part describes a proposed algorithm and program for the analysis of fault rates in LSI and VLSI circuits.

  19. Mining Genome-Scale Growth Phenotype Data through Constant-Column Biclustering

    KAUST Repository

    Alzahrani, Majed A.

    2017-07-10

    Growth phenotype profiling of genome-wide gene-deletion strains over stress conditions can offer a clear picture that the essentiality of genes depends on environmental conditions. Systematically identifying groups of genes from such recently emerging high-throughput data that share similar patterns of conditional essentiality and dispensability under various environmental conditions can elucidate how genetic interactions of the growth phenotype are regulated in response to the environment. In this dissertation, we first demonstrate that detecting such “co-fit” gene groups can be cast as a less well-studied problem in biclustering, i.e., constant-column biclustering. Despite significant advances in biclustering techniques, very few were designed for mining in growth phenotype data. Here, we propose Gracob, a novel, efficient graph-based method that casts and solves the constant-column biclustering problem as a maximal clique finding problem in a multipartite graph. We compared Gracob with a large collection of widely used biclustering methods that cover different types of algorithms designed to detect different types of biclusters. Gracob showed superior performance on finding co-fit genes over all the existing methods on both a variety of synthetic data sets with a wide range of settings, and three real growth phenotype data sets for E. coli, proteobacteria, and yeast.
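
    Gracob itself casts the problem as maximal-clique finding in a multipartite graph; the following is only a brute-force Python sketch of the constant-column bicluster definition (rows that share identical values on a subset of columns), on a toy discretized growth-phenotype matrix with invented values, and is not the Gracob algorithm:

```python
from itertools import combinations
from collections import defaultdict

def constant_column_biclusters(matrix, min_rows=2, min_cols=2):
    """Enumerate (rows, cols) where every column of the submatrix is constant.
    Brute force over column subsets: fine for a toy matrix, not genome scale."""
    n_cols = len(matrix[0])
    found = []
    for k in range(min_cols, n_cols + 1):
        for cols in combinations(range(n_cols), k):
            groups = defaultdict(list)
            for r, row in enumerate(matrix):
                groups[tuple(row[c] for c in cols)].append(r)
            for rows in groups.values():
                if len(rows) >= min_rows:
                    found.append((tuple(rows), cols))
    return found

# Toy discretized matrix: rows = gene-deletion strains, cols = conditions
M = [
    [0, 1, 1, 2],
    [0, 1, 1, 0],
    [2, 1, 1, 2],
    [0, 0, 2, 2],
]
for rows, cols in constant_column_biclusters(M):
    print(rows, cols)
```

    Here strains 0, 1, and 2 form a "co-fit" group on conditions 1 and 2: they behave identically under those conditions regardless of how they differ elsewhere.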

  20. Large-scale generation of human iPSC-derived neural stem cells/early neural progenitor cells and their neuronal differentiation.

    Science.gov (United States)

    D'Aiuto, Leonardo; Zhi, Yun; Kumar Das, Dhanjit; Wilcox, Madeleine R; Johnson, Jon W; McClain, Lora; MacDonald, Matthew L; Di Maio, Roberto; Schurdak, Mark E; Piazza, Paolo; Viggiano, Luigi; Sweet, Robert; Kinchington, Paul R; Bhattacharjee, Ayantika G; Yolken, Robert; Nimgaonkar, Vishwajit L

    2014-01-01

    Induced pluripotent stem cell (iPSC)-based technologies offer an unprecedented opportunity to perform high-throughput screening of novel drugs for neurological and neurodegenerative diseases. Such screenings require a robust and scalable method for generating large numbers of mature, differentiated neuronal cells. Currently available methods based on differentiation of embryoid bodies (EBs) or directed differentiation of adherent culture systems are either expensive or are not scalable. We developed a protocol for large-scale generation of neuronal stem cells (NSCs)/early neural progenitor cells (eNPCs) and their differentiation into neurons. Our scalable protocol allows robust and cost-effective generation of NSCs/eNPCs from iPSCs. Following culture in neurobasal medium supplemented with B27 and BDNF, NSCs/eNPCs differentiate predominantly into vesicular glutamate transporter 1 (VGLUT1) positive neurons. Targeted mass spectrometry analysis demonstrates that iPSC-derived neurons express ligand-gated channels and other synaptic proteins and whole-cell patch-clamp experiments indicate that these channels are functional. The robust and cost-effective differentiation protocol described here for large-scale generation of NSCs/eNPCs and their differentiation into neurons paves the way for automated high-throughput screening of drugs for neurological and neurodegenerative diseases.

  1. iScreen: Image-Based High-Content RNAi Screening Analysis Tools.

    Science.gov (United States)

    Zhong, Rui; Dong, Xiaonan; Levine, Beth; Xie, Yang; Xiao, Guanghua

    2015-09-01

    High-throughput RNA interference (RNAi) screening has opened up a path to investigating functional genomics in a genome-wide pattern. However, such studies are often restricted to assays that have a single readout format. Recently, advanced image technologies have been coupled with high-throughput RNAi screening to develop high-content screening, in which one or more cell image(s), instead of a single readout, were generated from each well. This image-based high-content screening technology has led to genome-wide functional annotation in a wider spectrum of biological research studies, as well as in drug and target discovery, so that complex cellular phenotypes can be measured in a multiparametric format. Despite these advances, data analysis and visualization tools are still largely lacking for these types of experiments. Therefore, we developed iScreen (image-Based High-content RNAi Screening Analysis Tool), an R package for the statistical modeling and visualization of image-based high-content RNAi screening. Two case studies were used to demonstrate the capability and efficiency of the iScreen package. iScreen is available for download on CRAN (http://cran.cnr.berkeley.edu/web/packages/iScreen/index.html). The user manual is also available as a supplementary document. © 2014 Society for Laboratory Automation and Screening.
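
    iScreen itself is an R package; as a language-neutral illustration of the kind of per-plate statistical scoring such tools perform, here is a robust z-score sketch in Python (the readout values and the |z| > 3 hit threshold are invented for illustration):

```python
import statistics

def robust_z(values):
    """Per-plate robust z-scores: (x - median) / (1.4826 * MAD)."""
    med = statistics.median(values)
    mad = statistics.median(abs(x - med) for x in values)
    scale = 1.4826 * mad
    return [(x - med) / scale for x in values]

# Hypothetical per-well readouts for one plate; well 4 is a strong outlier
plate = [1.0, 1.1, 0.9, 1.05, 3.2, 0.95, 1.02, 1.08]
z = robust_z(plate)
hits = [i for i, s in enumerate(z) if abs(s) > 3]
print(hits)  # [4]
```

    Median and MAD are preferred over mean and standard deviation here because a few genuine hits would otherwise inflate the scale and mask themselves.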

  2. Worm Phenotype Ontology: Integrating phenotype data within and beyond the C. elegans community

    Directory of Open Access Journals (Sweden)

    Yook Karen

    2011-01-01

    Full Text Available Abstract Background Caenorhabditis elegans gene-based phenotype information dates back to the 1970s, beginning with Sydney Brenner and the characterization of behavioral and morphological mutant alleles via classical genetics in order to understand nervous system function. Since then C. elegans has become an important genetic model system for the study of basic biological and biomedical principles, largely through the use of phenotype analysis. Because of the growth of C. elegans as a genetically tractable model organism and the development of large-scale analyses, there has been a significant increase of phenotype data that needs to be managed and made accessible to the research community. To do so, a standardized vocabulary is necessary to integrate phenotype data from diverse sources, permit integration with other data types and render the data in a computable form. Results We describe a hierarchically structured, controlled vocabulary of terms that can be used to standardize phenotype descriptions in C. elegans, namely the Worm Phenotype Ontology (WPO). The WPO currently comprises 1,880 phenotype terms, 74% of which have been used in the annotation of phenotypes associated with greater than 18,000 C. elegans genes. The scope of the WPO is not exclusively limited to C. elegans biology, rather it is devised to also incorporate phenotypes observed in related nematode species. We have enriched the value of the WPO by integrating it with other ontologies, thereby increasing the accessibility of worm phenotypes to non-nematode biologists. We are actively developing the WPO to continue to fulfill the evolving needs of the scientific community and hope to engage researchers in this crucial endeavor. Conclusions We provide a phenotype ontology (WPO) that will help to facilitate data retrieval, and cross-species comparisons within the nematode community. In the larger scientific community, the WPO will permit data integration, and

  3. Variable phenotypic expression and onset in MYH14 distal hereditary motor neuropathy phenotype in a large, multigenerational North American family.

    Science.gov (United States)

    Iyadurai, Stanley; Arnold, W David; Kissel, John T; Ruhno, Corey; Mcgovern, Vicki L; Snyder, Pamela J; Prior, Thomas W; Roggenbuck, Jennifer; Burghes, Arthur H; Kolb, Stephen J

    2017-08-01

    Distal hereditary motor neuropathy (dHMN) causes distal-predominant weakness without prominent sensory loss. Myosin heavy chain disorders most commonly result in distal myopathy and cardiomyopathy with or without hearing loss, but a complex phenotype with dHMN, myopathy, hoarseness, and hearing loss was reported in a Korean family with a c.2822G>T mutation in MYH14. In this study we report phenotypic features in a North American family with the c.2822G>T in MYH14. Clinical and molecular characterization was performed in a large, 6-generation, Caucasian family with MYH14 dHMN. A total of 11 affected and 7 unaffected individuals were evaluated and showed varying age of onset and severity of weakness. Genotypic concordance was confirmed with molecular analysis. Electrophysiological studies demonstrated distal motor axonal degeneration without myopathy in all affected subjects tested. Mutation of MYH14 can result in a range of neuromuscular phenotypes that includes a dHMN and hearing loss phenotype with variable age of onset. Muscle Nerve 56: 341-345, 2017. © 2016 Wiley Periodicals, Inc.

  4. Phylogenetic distribution of large-scale genome patchiness

    Directory of Open Access Journals (Sweden)

    Hackenberg Michael

    2008-04-01

    Full Text Available Abstract Background The phylogenetic distribution of large-scale genome structure (i.e., mosaic compositional patchiness) has been explored mainly by analytical ultracentrifugation of bulk DNA. However, with the availability of large, good-quality chromosome sequences, and the recently developed computational methods to directly analyze patchiness on the genome sequence, an evolutionary comparative analysis can be carried out at the sequence level. Results The local variations in the scaling exponent of the Detrended Fluctuation Analysis are used here to analyze large-scale genome structure and directly uncover the characteristic scales present in genome sequences. Furthermore, through shuffling experiments of selected genome regions, computationally identified, isochore-like regions were identified as the biological source for the uncovered large-scale genome structure. The phylogenetic distribution of short- and large-scale patchiness was determined in the best-sequenced genome assemblies from eleven eukaryotic genomes: mammals (Homo sapiens, Pan troglodytes, Mus musculus, Rattus norvegicus, and Canis familiaris), birds (Gallus gallus), fishes (Danio rerio), invertebrates (Drosophila melanogaster and Caenorhabditis elegans), plants (Arabidopsis thaliana) and yeasts (Saccharomyces cerevisiae). We found large-scale patchiness of genome structure, associated with in silico determined, isochore-like regions, throughout this wide phylogenetic range. Conclusion Large-scale genome structure is detected by directly analyzing DNA sequences in a wide range of eukaryotic chromosome sequences, from human to yeast. In all these genomes, large-scale patchiness can be associated with the isochore-like regions, as directly detected in silico at the sequence level.
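
    The paper's own DFA implementation is not reproduced here; the following is a generic sketch of the method (integrate the mean-subtracted series, detrend each window with a least-squares line, and read the scaling exponent off a log-log fit), run on uncorrelated noise, where the exponent should sit near 0.5:

```python
import math
import random

def dfa_fluctuation(x, s):
    """RMS fluctuation F(s): integrate the mean-subtracted series, then
    detrend each non-overlapping window of size s with a least-squares line."""
    mean = sum(x) / len(x)
    y, acc = [], 0.0
    for v in x:
        acc += v - mean
        y.append(acc)
    n_win = len(y) // s
    t = list(range(s))
    tm = sum(t) / s
    tvar = sum((ti - tm) ** 2 for ti in t)
    sq = 0.0
    for w in range(n_win):
        seg = y[w * s:(w + 1) * s]
        sm = sum(seg) / s
        slope = sum((ti - tm) * (si - sm) for ti, si in zip(t, seg)) / tvar
        intercept = sm - slope * tm
        sq += sum((si - (intercept + slope * ti)) ** 2 for ti, si in zip(t, seg))
    return math.sqrt(sq / (n_win * s))

def dfa_exponent(x, scales=(4, 8, 16, 32)):
    """Scaling exponent alpha: slope of log F(s) versus log s."""
    pts = [(math.log(s), math.log(dfa_fluctuation(x, s))) for s in scales]
    mx = sum(p for p, _ in pts) / len(pts)
    my = sum(q for _, q in pts) / len(pts)
    return sum((p - mx) * (q - my) for p, q in pts) / sum((p - mx) ** 2 for p, _ in pts)

random.seed(0)
noise = [random.random() for _ in range(4096)]
print(round(dfa_exponent(noise), 2))  # expected near 0.5 for uncorrelated noise
```

    Applied to a numeric encoding of a DNA sequence (e.g. GC = 1, AT = 0), departures of the local exponent from 0.5 are what flag long-range, isochore-like patchiness.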

  5. Managing large-scale models: DBS

    International Nuclear Information System (INIS)

    1981-05-01

    A set of fundamental management tools for developing and operating a large-scale model and data base system is presented. Experience in operating and developing a large-scale computerized system shows that the only reasonable way to gain strong management control of such a system is to implement appropriate controls and procedures. Chapter I discusses the purpose of the book. Chapter II classifies a broad range of generic management problems into three groups: documentation, operations, and maintenance. First, system problems are identified, then solutions for gaining management control are discussed. Chapters III, IV, and V present practical methods for dealing with these problems. These methods were developed for managing SEAS but have general application for large-scale models and data bases.

  6. Large Scale Self-Organizing Information Distribution System

    National Research Council Canada - National Science Library

    Low, Steven

    2005-01-01

    This project investigates issues in "large-scale" networks. Here "large-scale" refers to networks with a large number of high-capacity nodes and transmission links, shared by a large number of users...

  7. Large-Scale Cognitive GWAS Meta-Analysis Reveals Tissue-Specific Neural Expression and Potential Nootropic Drug Targets

    Directory of Open Access Journals (Sweden)

    Max Lam

    2017-11-01

    Full Text Available Here, we present a large (n = 107,207) genome-wide association study (GWAS) of general cognitive ability (“g”), further enhanced by combining results with a large-scale GWAS of educational attainment. We identified 70 independent genomic loci associated with general cognitive ability. Results showed significant enrichment for genes causing Mendelian disorders with an intellectual disability phenotype. Competitive pathway analysis implicated the biological processes of neurogenesis and synaptic regulation, as well as the gene targets of two pharmacologic agents: cinnarizine, a T-type calcium channel blocker, and LY97241, a potassium channel inhibitor. Transcriptome-wide and epigenome-wide analysis revealed that the implicated loci were enriched for genes expressed across all brain regions (most strongly in the cerebellum). Enrichment was exclusive to genes expressed in neurons but not oligodendrocytes or astrocytes. Finally, we report genetic correlations between cognitive ability and disparate phenotypes including psychiatric disorders, several autoimmune disorders, longevity, and maternal age at first birth.

  8. Large scale structure and baryogenesis

    International Nuclear Information System (INIS)

    Kirilova, D.P.; Chizhov, M.V.

    2001-08-01

    We discuss a possible connection between large-scale structure formation and baryogenesis in the universe. An updated review of the observational indications for the presence of a very large scale of 120 h⁻¹ Mpc in the distribution of the visible matter of the universe is provided. The possibility to generate a periodic distribution with the characteristic scale of 120 h⁻¹ Mpc through a mechanism producing quasi-periodic baryon density perturbations during the inflationary stage is discussed. The evolution of the baryon charge density distribution is explored in the framework of a low-temperature boson condensate baryogenesis scenario. Both the observed very large scale of the visible matter distribution in the universe and the observed baryon asymmetry value could naturally appear as a result of the evolution of a complex scalar field condensate formed at the inflationary stage. Moreover, for some model parameters a natural separation of matter superclusters from antimatter ones can be achieved. (author)

  9. State-of-the-art of large scale biogas plants

    International Nuclear Information System (INIS)

    Prisum, J.M.; Noergaard, P.

    1992-01-01

    A survey of the technological state of large-scale biogas plants in Europe treating manure is given. 83 plants are in operation at present. Of these, 16 are centralised digestion plants. Transport costs at centralised digestion plants amount to between 25 and 40 percent of the total operational costs. Various transport equipment is used. Most large-scale digesters are CSTRs, but serial, contact, 2-step, and plug-flow digesters are also found. Construction materials are mostly steel and concrete. Mesophilic digestion is most common (56%), thermophilic digestion is used in 17% of the plants, and combined mesophilic and thermophilic digestion is used in 28% of the centralised plants. Mixing of digester content is performed with gas injection, propellers, and gas-liquid displacement. Heating is carried out using external or internal heat exchangers. Heat recovery is only used in Denmark. Gas purification equipment is commonplace, but not often needed. Several plants use separation of the digested manure, often as part of a post-treatment/purification process or for the production of 'compost'. Screens, sieve belt separators, centrifuges and filter presses are employed. The use of biogas varies considerably. In some cases, combined heat and power stations supply the grid and district heating systems. Other plants use only either the electricity or the heat. (au)

  10. Automatic management software for large-scale cluster system

    International Nuclear Information System (INIS)

    Weng Yunjian; Chinese Academy of Sciences, Beijing; Sun Gongxing

    2007-01-01

    At present, large-scale cluster systems are difficult to manage: administrators carry a heavy workload, and much time must be spent on the management and maintenance of such systems. The nodes in a large-scale cluster system easily fall into disorder; with thousands of nodes housed in large machine rooms, administrators can easily confuse machines. How can accurate management of a large-scale cluster system be carried out effectively? This article introduces ELFms for the large-scale cluster system and proposes how to realize its automatic management. (authors)

  11. GPU-based large-scale visualization

    KAUST Repository

    Hadwiger, Markus

    2013-11-19

    Recent advances in image and volume acquisition as well as computational advances in simulation have led to an explosion of the amount of data that must be visualized and analyzed. Modern techniques combine the parallel processing power of GPUs with out-of-core methods and data streaming to enable the interactive visualization of giga- and terabytes of image and volume data. A major enabler for interactivity is making both the computational and the visualization effort proportional to the amount of data that is actually visible on screen, decoupling it from the full data size. This leads to powerful display-aware multi-resolution techniques that enable the visualization of data of almost arbitrary size. The course consists of two major parts: An introductory part that progresses from fundamentals to modern techniques, and a more advanced part that discusses details of ray-guided volume rendering, novel data structures for display-aware visualization and processing, and the remote visualization of large online data collections. You will learn how to develop efficient GPU data structures and large-scale visualizations, implement out-of-core strategies and concepts such as virtual texturing that have only been employed recently, as well as how to use modern multi-resolution representations. These approaches reduce the GPU memory requirements of extremely large data to a working set size that fits into current GPUs. You will learn how to perform ray-casting of volume data of almost arbitrary size and how to render and process gigapixel images using scalable, display-aware techniques. We will describe custom virtual texturing architectures as well as recent hardware developments in this area. We will also describe client/server systems for distributed visualization, on-demand data processing and streaming, and remote visualization. We will describe implementations using OpenGL as well as CUDA, exploiting parallelism on GPUs combined with additional asynchronous
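A minimal sketch of the display-aware principle described above (work proportional to what is visible on screen, not to the full data size): pick the coarsest multi-resolution level whose detail still matches the on-screen footprint. The function and its parameters are invented for illustration and are not taken from the course itself.

```python
import math

def required_level(texels_per_screen_pixel: float, num_levels: int) -> int:
    """Coarsest mip/octree level whose detail still matches one screen pixel.

    texels_per_screen_pixel: how many level-0 texels project onto one
    screen pixel (> 1 means full resolution would be oversampled).
    """
    if texels_per_screen_pixel <= 1.0:
        return 0                       # need the finest level
    level = int(math.floor(math.log2(texels_per_screen_pixel)))
    return min(level, num_levels - 1)  # clamp to available levels

# A distant tile where 16 level-0 texels land on one pixel only needs level 4.
print(required_level(16.0, num_levels=10))  # → 4
```

Selecting levels this way is what bounds the working set by screen resolution rather than by dataset size.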

  12. Automated Protocol for Large-Scale Modeling of Gene Expression Data.

    Science.gov (United States)

    Hall, Michelle Lynn; Calkins, David; Sherman, Woody

    2016-11-28

    With the continued rise of phenotypic- and genotypic-based screening projects, computational methods to analyze, process, and ultimately make predictions in this field take on growing importance. Here we show how automated machine-learning workflows can produce models that are predictive of differential gene expression as a function of compound structure, using data from A673 cells as a proof of principle. In particular, we present predictive models with an average accuracy of greater than 70% across a highly diverse ∼1000-gene expression profile. In contrast to the usual in silico design paradigm, where one interrogates a particular target-based response, this work opens the opportunity for virtual screening and lead optimization for desired multitarget gene expression profiles.
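As a loose illustration of the kind of predictive modeling described (not the authors' actual workflow), the sketch below trains a nearest-centroid classifier to predict up- versus down-regulation of a single gene from invented compound descriptors:

```python
import numpy as np

def fit_centroids(X, y):
    """Return class centroids for labels 0 (down-regulating) and 1 (up-regulating)."""
    return np.array([X[y == c].mean(axis=0) for c in (0, 1)])

def predict(centroids, X):
    """Assign each sample to the nearest centroid (Euclidean distance)."""
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

rng = np.random.default_rng(0)
# Toy "compound descriptors": class 1 shifted by +1 in every feature.
X0 = rng.normal(0.0, 0.5, size=(50, 8))
X1 = rng.normal(1.0, 0.5, size=(50, 8))
X = np.vstack([X0, X1])
y = np.array([0] * 50 + [1] * 50)

centroids = fit_centroids(X, y)
acc = (predict(centroids, X) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

A real workflow would repeat this per gene across the ~1000-gene profile and validate on held-out compounds.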

  13. Rod-coating: towards large-area fabrication of uniform reduced graphene oxide films for flexible touch screens.

    Science.gov (United States)

    Wang, Jie; Liang, Minghui; Fang, Yan; Qiu, Tengfei; Zhang, Jin; Zhi, Linjie

    2012-06-05

    A novel strategy is developed for the large-scale fabrication of reduced graphene oxide films directly on flexible substrates in a controlled manner by the combination of a rod-coating technique and room-temperature reduction of graphene oxide. The as-prepared films display excellent uniformity, good transparency and conductivity, and great flexibility in a touch screen. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. A Phenotype Classification of Internet Use Disorder in a Large-Scale High-School Study

    Directory of Open Access Journals (Sweden)

    Katajun Lindenberg

    2018-04-01

    Full Text Available Internet Use Disorder (IUD) affects numerous adolescents worldwide, and (Internet) Gaming Disorder, a specific subtype of IUD, has recently been included in DSM-5 and ICD-11. Epidemiological studies have identified prevalence rates of up to 5.7% among adolescents in Germany. However, little is known about risk development during adolescence and its association with education. The aims of this study were to: (a) identify a clinically relevant latent profile in a large-scale high-school sample; (b) estimate prevalence rates of IUD for distinct age groups; and (c) investigate associations with gender and education. N = 5387 adolescents from 41 schools in Germany aged 11–21 were assessed using the Compulsive Internet Use Scale (CIUS). Latent profile analyses showed five profile groups with differences in CIUS response pattern, age and school type. IUD was found in 6.1% and high-risk Internet use in 13.9% of the total sample. Two peaks were found in prevalence rates, indicating the highest risk of IUD in age groups 15–16 and 19–21. Prevalence did not differ significantly between boys and girls. High-level education schools showed the lowest (4.9%) and vocational secondary schools the highest prevalence rate (7.8%). The differences between school types could not be explained by academic level.

  15. Large scale network-centric distributed systems

    CERN Document Server

    Sarbazi-Azad, Hamid

    2014-01-01

    A highly accessible reference offering a broad range of topics and insights on large scale network-centric distributed systems Evolving from the fields of high-performance computing and networking, large scale network-centric distributed systems continues to grow as one of the most important topics in computing and communication and many interdisciplinary areas. Dealing with both wired and wireless networks, this book focuses on the design and performance issues of such systems. Large Scale Network-Centric Distributed Systems provides in-depth coverage ranging from ground-level hardware issu

  16. Large-Scale Outflows in Seyfert Galaxies

    Science.gov (United States)

    Colbert, E. J. M.; Baum, S. A.

    1995-12-01

    Highly collimated outflows extend out to Mpc scales in many radio-loud active galaxies. In Seyfert galaxies, which are radio-quiet, the outflows extend out to kpc scales and do not appear to be as highly collimated. In order to study the nature of large-scale (≳1 kpc) outflows in Seyferts, we have conducted optical, radio and X-ray surveys of a distance-limited sample of 22 edge-on Seyfert galaxies. Results of the optical emission-line imaging and spectroscopic survey imply that large-scale outflows are present in ≳1/4 of all Seyferts. The radio (VLA) and X-ray (ROSAT) surveys show that large-scale radio and X-ray emission is present at about the same frequency. Kinetic luminosities of the outflows in Seyferts are comparable to those in starburst-driven superwinds. Large-scale radio sources in Seyferts appear diffuse, but do not resemble radio halos found in some edge-on starburst galaxies (e.g. M82). We discuss the feasibility of the outflows being powered by the active nucleus (e.g. a jet) or a circumnuclear starburst.

  17. Model-independent phenotyping of C. elegans locomotion using scale-invariant feature transform.

    Directory of Open Access Journals (Sweden)

    Yelena Koren

    Full Text Available To uncover the genetic basis of behavioral traits in the model organism C. elegans, a common strategy is to study locomotion defects in mutants. Despite efforts to introduce (semi-)automated phenotyping strategies, current methods overwhelmingly depend on worm-specific features that must be hand-crafted and as such are not generalizable for phenotyping motility in other animal models. Hence, there is an ongoing need for robust algorithms that can automatically analyze and classify motility phenotypes quantitatively. To this end, we have developed a fully automated approach to characterize C. elegans phenotypes that does not require the definition of nematode-specific features. Rather, we make use of the popular computer-vision Scale-Invariant Feature Transform (SIFT), from which we construct histograms of commonly observed SIFT features to represent nematode motility. We first evaluated our method on a synthetic dataset simulating a range of nematode crawling gaits. Next, we evaluated our algorithm on two distinct datasets of crawling C. elegans with mutants affecting neuromuscular structure and function. Not only is our algorithm able to detect differences between strains, but the results also capture similarities in locomotory phenotypes that lead to clustering consistent with expectations based on genetic relationships. Our proposed approach generalizes directly and should be applicable to other animal models. Such applicability holds promise for computational ethology as more groups collect high-resolution image data of animal behavior.
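The histogram-of-features representation can be sketched as follows. The codebook and descriptor vectors below are random stand-ins; real SIFT extraction (e.g. with a computer-vision library) is assumed to happen upstream.

```python
import numpy as np

def quantize(descriptors, codebook):
    """Index of the nearest codeword for each descriptor."""
    d = np.linalg.norm(descriptors[:, None, :] - codebook[None, :, :], axis=2)
    return d.argmin(axis=1)

def bof_histogram(descriptors, codebook):
    """L1-normalized bag-of-features histogram over the codebook."""
    idx = quantize(descriptors, codebook)
    hist = np.bincount(idx, minlength=len(codebook)).astype(float)
    return hist / hist.sum()

rng = np.random.default_rng(1)
codebook = rng.normal(size=(32, 128))      # 32 codewords, 128-D like SIFT
descriptors = rng.normal(size=(500, 128))  # one recording's descriptors
h = bof_histogram(descriptors, codebook)
print(h.shape, round(h.sum(), 6))  # → (32,) 1.0
```

Each recording reduces to one fixed-length histogram, so strains can then be compared with any standard distance or clustering method.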

  18. SCALE INTERACTION IN A MIXING LAYER. THE ROLE OF THE LARGE-SCALE GRADIENTS

    KAUST Repository

    Fiscaletti, Daniele

    2015-08-23

    The interaction between scales is investigated in a turbulent mixing layer. The large-scale amplitude modulation of the small scales, already observed in other works, depends on the crosswise location. Large-scale positive fluctuations correlate with a stronger activity of the small scales on the low-speed side of the mixing layer, and a reduced activity on the high-speed side. However, from physical considerations we would expect the scales to interact in a qualitatively similar way within the flow and across different turbulent flows. Therefore, the modulation of the small scales by the large-scale gradients, rather than by the large-scale fluctuations, has additionally been investigated.

  19. Lack of association between PKLR rs3020781 and NOS1AP rs7538490 and type 2 diabetes, overweight, obesity and related metabolic phenotypes in a Danish large-scale study: case-control studies and analyses of quantitative traits

    Directory of Open Access Journals (Sweden)

    Almind Katrine

    2008-12-01

    Full Text Available Abstract Background Several studies in multiple ethnicities have reported linkage to type 2 diabetes on chromosome 1q21-25. Both PKLR, encoding the liver pyruvate kinase, and NOS1AP, encoding the nitric oxide synthase 1 (neuronal) adaptor protein (CAPON), are positioned within this chromosomal region and are thus positional candidates for the observed linkage peak. The C-allele of PKLR rs3020781 and the T-allele of NOS1AP rs7538490 are reported to strongly associate with type 2 diabetes in various European-descent populations comprising a total of 2,198 individuals with a combined odds ratio (OR) of 1.33 [1.16–1.54] and 1.53 [1.28–1.81], respectively. Our aim was to validate these findings by investigating the impact of the two variants on type 2 diabetes and related quantitative metabolic phenotypes in a large study sample of Danes. Further, we intended to expand the analyses by examining the effect of the variants in relation to overweight and obesity. Methods PKLR rs3020781 and NOS1AP rs7538490 were genotyped, using TaqMan allelic discrimination, in a combined study sample comprising a total of 16,801 and 16,913 individuals, respectively. The participants were ascertained from four different study groups: the population-based Inter99 cohort (nPKLR = 5,962, nNOS1AP = 6,008), a type 2 diabetic patient group (nPKLR = 1,873, nNOS1AP = 1,874) from Steno Diabetes Center, a population-based study sample (nPKLR = 599, nNOS1AP = 596) from Steno Diabetes Center and the ADDITION Denmark screening study cohort (nPKLR = 8,367, nNOS1AP = 8,435). Results In case-control studies we evaluated the potential association between rs3020781 and rs7538490 and type 2 diabetes and obesity. No significant associations were observed for type 2 diabetes (rs3020781: pAF = 0.49, OR = 1.02 [0.96–1.10]; rs7538490: pAF = 0.84, OR = 0.99 [0.93–1.06]). Neither did we show association with overweight or obesity. Additionally, the PKLR and the NOS1AP genotypes were demonstrated not

  20. Dissecting the large-scale galactic conformity

    Science.gov (United States)

    Seo, Seongu

    2018-01-01

    Galactic conformity is the observed phenomenon that galaxies located in the same region have similar properties such as star formation rate, color, gas fraction, and so on. The conformity was first observed among galaxies within the same halos (“one-halo conformity”). The one-halo conformity can be readily explained by mutual interactions among galaxies within a halo. Recent observations, however, further witnessed a puzzling connection among galaxies with no direct interaction. In particular, galaxies located within a sphere of ~5 Mpc radius tend to show similarities, even though the galaxies do not share common halos with each other (“two-halo conformity” or “large-scale conformity”). Using a cosmological hydrodynamic simulation, Illustris, we investigate the physical origin of the two-halo conformity and put forward two scenarios. First, back-splash galaxies are likely responsible for the large-scale conformity. They have evolved into red galaxies due to ram-pressure stripping in a given galaxy cluster and happen to reside now within a ~5 Mpc sphere. Second, galaxies in the strong tidal field induced by large-scale structure also seem to give rise to the large-scale conformity. The strong tides suppress star formation in the galaxies. We discuss the importance of the large-scale conformity in the context of galaxy evolution.

  1. Enhancing Reproducibility in Cancer Drug Screening: How Do We Move Forward?

    DEFF Research Database (Denmark)

    Shi, Leming; Haibe-Kains, Benjamin; Birkbak, Nicolai Juul

    2014-01-01

    Large-scale pharmacogenomic high-throughput screening (HTS) studies hold great potential for generating robust genomic predictors of drug response. Two recent large-scale HTS studies have reported results of such screens, revealing several known and novel drug sensitivities and biomarkers...

  2. Large-scale perspective as a challenge

    NARCIS (Netherlands)

    Plomp, M.G.A.

    2012-01-01

    1. Scale forms a challenge for chain researchers: when exactly is something ‘large-scale’? What are the underlying factors (e.g. number of parties, data, objects in the chain, complexity) that determine this? It appears to be a continuum between small- and large-scale, where positioning on that

  3. Algorithm 896: LSA: Algorithms for Large-Scale Optimization

    Czech Academy of Sciences Publication Activity Database

    Lukšan, Ladislav; Matonoha, Ctirad; Vlček, Jan

    2009-01-01

    Roč. 36, č. 3 (2009), 16-1-16-29 ISSN 0098-3500 R&D Projects: GA AV ČR IAA1030405; GA ČR GP201/06/P397 Institutional research plan: CEZ:AV0Z10300504 Keywords: algorithms * design * large-scale optimization * large-scale nonsmooth optimization * large-scale nonlinear least squares * large-scale nonlinear minimax * large-scale systems of nonlinear equations * sparse problems * partially separable problems * limited-memory methods * discrete Newton methods * quasi-Newton methods * primal interior-point methods Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.904, year: 2009
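One of the keyword topics, limited-memory quasi-Newton methods, centers on the two-loop recursion, which applies the inverse-Hessian approximation to a gradient without ever forming a matrix. The following is a minimal illustrative sketch, not the published Fortran code:

```python
import numpy as np

def lbfgs_direction(grad, s_hist, y_hist):
    """Two-loop recursion: returns -H_k @ grad, where H_k is the implicit
    limited-memory inverse-Hessian approximation built from the stored
    curvature pairs (s_i, y_i)."""
    q = grad.astype(float).copy()
    alphas, rhos = [], []
    for s, y in zip(reversed(s_hist), reversed(y_hist)):  # newest first
        rho = 1.0 / y.dot(s)
        alpha = rho * s.dot(q)
        q -= alpha * y
        rhos.append(rho)
        alphas.append(alpha)
    if s_hist:  # initial scaling H_0 = gamma * I from the most recent pair
        q *= s_hist[-1].dot(y_hist[-1]) / y_hist[-1].dot(y_hist[-1])
    for (s, y), rho, alpha in zip(zip(s_hist, y_hist),
                                  reversed(rhos), reversed(alphas)):
        beta = rho * y.dot(q)
        q += s * (alpha - beta)
    return -q

# For f(x) = 0.5 ||x||^2 the pairs satisfy y = s, so H_k recovers the
# identity and the direction is exactly -grad.
s_hist = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
y_hist = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
print(lbfgs_direction(np.array([2.0, 3.0]), s_hist, y_hist))  # → [-2. -3.]
```

The cost is O(m·n) per iteration for m stored pairs, which is what makes the approach viable for the large-scale problems the package targets.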

  4. Scale interactions in a mixing layer – the role of the large-scale gradients

    KAUST Repository

    Fiscaletti, D.

    2016-02-15

    © 2016 Cambridge University Press. The interaction between the large and the small scales of turbulence is investigated in a mixing layer, at a Reynolds number based on the Taylor microscale of , via direct numerical simulations. The analysis is performed in physical space, and the local vorticity root-mean-square (r.m.s.) is taken as a measure of the small-scale activity. It is found that positive large-scale velocity fluctuations correspond to large vorticity r.m.s. on the low-speed side of the mixing layer, whereas they correspond to low vorticity r.m.s. on the high-speed side. The relationship between large and small scales thus depends on position if the vorticity r.m.s. is correlated with the large-scale velocity fluctuations. On the contrary, the correlation coefficient is nearly constant throughout the mixing layer and close to unity if the vorticity r.m.s. is correlated with the large-scale velocity gradients. Therefore, the small-scale activity appears closely related to large-scale gradients, while the correlation between the small-scale activity and the large-scale velocity fluctuations is shown to reflect a property of the large scales. Furthermore, the vorticity from unfiltered (small scales) and from low-pass filtered (large scales) velocity fields tend to be aligned when examined within vortical tubes. These results provide evidence for the so-called 'scale invariance' (Meneveau & Katz, Annu. Rev. Fluid Mech., vol. 32, 2000, pp. 1-32), and suggest that some of the large-scale characteristics are not lost at the small scales, at least at the Reynolds number achieved in the present simulation.
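The kind of correlation analysis described can be mimicked on a synthetic one-dimensional signal: low-pass filter to obtain the large scales, take the residual as the small scales, and correlate the local small-scale r.m.s. with the large-scale fluctuations and gradients. The signal and filter width below are invented; this only illustrates the machinery, not the DNS result.

```python
import numpy as np

def moving_average(u, w):
    """Simple box low-pass filter of width w."""
    return np.convolve(u, np.ones(w) / w, mode="same")

def scale_correlation(u, w=32):
    """Correlate small-scale activity with large-scale fluctuations/gradients."""
    large = moving_average(u, w)                   # large scales (low-pass)
    small = u - large                              # small-scale residual
    small_rms = np.sqrt(moving_average(small**2, w))
    grad = np.gradient(large)                      # large-scale gradient
    c_fluct = np.corrcoef(large, small_rms)[0, 1]
    c_grad = np.corrcoef(np.abs(grad), small_rms)[0, 1]
    return c_fluct, c_grad

rng = np.random.default_rng(2)
# Toy signal: a slow random walk (large scales) plus white noise (small scales).
u = np.cumsum(rng.normal(size=4096)) / 50 + rng.normal(size=4096)
print(scale_correlation(u))
```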

  5. Large-scale matrix-handling subroutines 'ATLAS'

    International Nuclear Information System (INIS)

    Tsunematsu, Toshihide; Takeda, Tatsuoki; Fujita, Keiichi; Matsuura, Toshihiko; Tahara, Nobuo

    1978-03-01

    Subroutine package 'ATLAS' has been developed for handling large-scale matrices. The package is composed of four kinds of subroutines, i.e., basic arithmetic routines, routines for solving linear simultaneous equations, routines for solving general eigenvalue problems, and utility routines. The subroutines are useful in large-scale plasma-fluid simulations. (auth.)

  6. Large-scale solar heat

    Energy Technology Data Exchange (ETDEWEB)

    Tolonen, J.; Konttinen, P.; Lund, P. [Helsinki Univ. of Technology, Otaniemi (Finland). Dept. of Engineering Physics and Mathematics

    1998-12-31

    In this project a large domestic solar heating system was built and a solar district heating system was modelled and simulated. The objectives were to improve the performance and reduce the costs of a large-scale solar heating system. As a result of the project, the benefit/cost ratio can be increased by 40% through dimensioning and optimising the system at the design stage. (orig.)

  7. Large-Area Cross-Aligned Silver Nanowire Electrodes for Flexible, Transparent, and Force-Sensitive Mechanochromic Touch Screens.

    Science.gov (United States)

    Cho, Seungse; Kang, Saewon; Pandya, Ashish; Shanker, Ravi; Khan, Ziyauddin; Lee, Youngsu; Park, Jonghwa; Craig, Stephen L; Ko, Hyunhyub

    2017-04-25

    Silver nanowire (AgNW) networks are considered to be promising structures for use as flexible transparent electrodes for various optoelectronic devices. One important application of AgNW transparent electrodes is flexible touch screens. However, the performance of flexible touch screens is still limited by the large surface roughness and low electrical-to-optical conductivity ratio of random-network AgNW electrodes. In addition, although the perception of writing force on the touch screen enables a variety of different functions, current technology still relies on complicated capacitive force touch sensors. This paper demonstrates a simple and high-throughput bar-coating assembly technique for the fabrication of large-area (>20 × 20 cm²), highly cross-aligned AgNW networks for transparent electrodes with a sheet resistance of 21.0 Ω sq⁻¹ at 95.0% optical transmittance, which compares favorably with that of random AgNW networks (sheet resistance of 21.0 Ω sq⁻¹ at 90.4% optical transmittance). As a proof-of-concept demonstration, we fabricate flexible, transparent, and force-sensitive touch screens using cross-aligned AgNW electrodes integrated with a mechanochromic spiropyran-polydimethylsiloxane composite film. Our force-sensitive touch screens enable the precise monitoring of dynamic writing, tracing and drawing of underlying pictures, and perception of handwriting patterns with locally different writing forces. The suggested technique provides a robust and powerful platform for the controllable assembly of nanowires beyond the scale of conventional fabrication techniques, which can find diverse applications in multifunctional flexible electronic and optoelectronic devices.

  8. Probes of large-scale structure in the Universe

    International Nuclear Information System (INIS)

    Suto, Yasushi; Gorski, K.; Juszkiewicz, R.; Silk, J.

    1988-01-01

    Recent progress in observational techniques has made it possible to confront quantitatively various models for the large-scale structure of the Universe with detailed observational data. We develop a general formalism to show that the gravitational instability theory for the origin of large-scale structure is now capable of critically confronting observational results on cosmic microwave background radiation angular anisotropies, large-scale bulk motions and large-scale clumpiness in the galaxy counts. (author)

  9. Screening for Adolescent Problematic Internet Use: Validation of the Problematic and Risky Internet Use Screening Scale (PRIUSS).

    Science.gov (United States)

    Jelenchick, Lauren A; Eickhoff, Jens; Zhang, Chong; Kraninger, Kristina; Christakis, Dimitri A; Moreno, Megan A

    2015-01-01

    Problematic Internet use (PIU) is an emerging health concern that lacks screening measures validated for use with adolescents and young adults. This study aimed to validate the Problematic and Risky Internet Use Screening Scale (PRIUSS) for use with older adolescents and to increase its clinical utility by determining scoring guidelines and assessing the relationship between PIU and other mental health conditions. This cross-sectional survey study took place at a large, public Midwestern university among 330 older adolescents aged 18 to 25 years. Confirmatory factor analysis and Spearman's correlations were used to assess the PRIUSS' structural and construct validity, respectively. A risk-based scoring cutoff was estimated using a Bayesian latent class modeling approach to computing a receiver operating characteristic curve. The confirmatory factor analysis indices for the 3-factor model indicated an acceptable fit (goodness-of-fit index 0.89, root mean square error of approximation 0.07). A cutoff of 25 (sensitivity 0.80, 95% confidence interval [CI] 0.47-0.99; specificity 0.79, 95% CI 0.73-0.84) is proposed for identifying those at risk for PIU. Participants at risk for PIU were at significantly greater odds of also reporting symptoms of attention-deficit/hyperactivity disorder (odds ratio [OR] 2.36, 95% CI 1.21-4.62, P = .009), depression (OR 3.25, 95% CI 1.65-6.42, P = .008), and social anxiety (OR 3.77, 95% CI 2.06-6.89, P < .001). The PRIUSS demonstrated validity as a PIU screening instrument for adolescents and young adults. Screening for PIU may also help to identify those at high reciprocal risk for other mental health conditions. Copyright © 2015. Published by Elsevier Inc.
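Odds ratios with 95% confidence intervals of the kind reported here can be computed from a 2×2 exposure/outcome table with Woolf's logit method. The counts below are invented for illustration and are not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and 95% CI (Woolf's method) for the table
    a = exposed cases,   b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Invented counts: 30/70 at-risk participants with/without a symptom,
# 20/130 not-at-risk participants with/without that symptom.
print(odds_ratio_ci(30, 70, 20, 130))
```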

  10. Large-scale grid management; Storskala Nettforvaltning

    Energy Technology Data Exchange (ETDEWEB)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-07-01

    The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) a broader focus (Broad Band, Multi Utility, ...) and (2) bigger units with large networks and more customers. Research by SINTEF Energy Research shows so far that approaches to large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series.

  11. Japanese large-scale interferometers

    CERN Document Server

    Kuroda, K; Miyoki, S; Ishizuka, H; Taylor, C T; Yamamoto, K; Miyakawa, O; Fujimoto, M K; Kawamura, S; Takahashi, R; Yamazaki, T; Arai, K; Tatsumi, D; Ueda, A; Fukushima, M; Sato, S; Shintomi, T; Yamamoto, A; Suzuki, T; Saitô, Y; Haruyama, T; Sato, N; Higashi, Y; Uchiyama, T; Tomaru, T; Tsubono, K; Ando, M; Takamori, A; Numata, K; Ueda, K I; Yoneda, H; Nakagawa, K; Musha, M; Mio, N; Moriwaki, S; Somiya, K; Araya, A; Kanda, N; Telada, S; Sasaki, M; Tagoshi, H; Nakamura, T; Tanaka, T; Ohara, K

    2002-01-01

    The objective of the TAMA 300 interferometer was to develop advanced technologies for kilometre scale interferometers and to observe gravitational wave events in nearby galaxies. It was designed as a power-recycled Fabry-Perot-Michelson interferometer and was intended as a step towards a final interferometer in Japan. The present successful status of TAMA is presented. TAMA forms a basis for LCGT (large-scale cryogenic gravitational wave telescope), a 3 km scale cryogenic interferometer to be built in the Kamioka mine in Japan, implementing cryogenic mirror techniques. The plan of LCGT is schematically described along with its associated R and D.

  12. Virtual screening methods as tools for drug lead discovery from large chemical libraries.

    Science.gov (United States)

    Ma, X H; Zhu, F; Liu, X; Shi, Z; Zhang, J X; Yang, S Y; Wei, Y Q; Chen, Y Z

    2012-01-01

    Virtual screening methods have been developed and explored as useful tools for searching for drug lead compounds in chemical libraries, including large libraries that have become publicly available. In this review, we discuss new developments in exploring virtual screening methods for enhanced performance in searching large chemical libraries, their applications in screening libraries of ~1 million or more compounds in the last five years, the difficulties in their applications, and the strategies for further improving these methods.
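One common ligand-based virtual screening approach can be sketched in a few lines: rank a library against a query compound by the Tanimoto coefficient on bit fingerprints. The compounds and fingerprints below are toy values; real workflows generate fingerprints with a cheminformatics toolkit.

```python
def tanimoto(fp1: int, fp2: int) -> float:
    """Tanimoto similarity of two bit-vector fingerprints stored as ints."""
    common = bin(fp1 & fp2).count("1")
    union = bin(fp1 | fp2).count("1")
    return common / union if union else 0.0

def screen(query: int, library: dict, threshold: float = 0.5):
    """Return (name, similarity) hits above threshold, best first."""
    hits = [(name, tanimoto(query, fp)) for name, fp in library.items()]
    return sorted([h for h in hits if h[1] >= threshold],
                  key=lambda h: h[1], reverse=True)

# Hypothetical 8-bit fingerprints for three library compounds.
library = {"cmpd_A": 0b11110000, "cmpd_B": 0b10110001, "cmpd_C": 0b00001111}
print(screen(0b11110001, library))  # → [('cmpd_A', 0.8), ('cmpd_B', 0.8)]
```

For libraries of millions of compounds this inner loop is typically vectorized or index-accelerated, but the similarity measure itself is unchanged.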

  13. GRAPHICS-IMAGE MIXED METHOD FOR LARGE-SCALE BUILDINGS RENDERING

    Directory of Open Access Journals (Sweden)

    Y. Zhou

    2018-05-01

    Full Text Available Urban 3D model data is huge and unstructured, LOD and Out-of-core algorithm are usually used to reduce the amount of data that drawn in each frame to improve the rendering efficiency. When the scene is large enough, even the complex optimization algorithm is difficult to achieve better results. Based on the traditional study, a novel idea was developed. We propose a graphics and image mixed method for large-scale buildings rendering. Firstly, the view field is divided into several regions, the graphics-image mixed method used to render the scene on both screen and FBO, then blending the FBO with scree. The algorithm is tested on the huge CityGML model data in the urban areas of New York which contained 188195 public building models, and compared with the Cesium platform. The experiment result shows the system was running smoothly. The experimental results confirm that the algorithm can achieve more massive building scene roaming under the same hardware conditions, and can rendering the scene without vision loss.

  14. Screening and large-scale expression of membrane proteins in mammalian cells for structural studies

    OpenAIRE

    Goehring, April; Lee, Chia-Hsueh; Wang, Kevin H.; Michel, Jennifer Carlisle; Claxton, Derek P.; Baconguis, Isabelle; Althoff, Thorsten; Fischer, Suzanne; Garcia, K. Christopher; Gouaux, Eric

    2014-01-01

    Structural, biochemical and biophysical studies of eukaryotic membrane proteins are often hampered by difficulties in over-expression of the candidate molecule. Baculovirus transduction of mammalian cells (BacMam), although a powerful method to heterologously express membrane proteins, can be cumbersome for screening and expression of multiple constructs. We therefore developed plasmid Eric Gouaux (pEG) BacMam, a vector optimized for use in screening assays, as well as for efficient productio...

  15. Visual analysis of inter-process communication for large-scale parallel computing.

    Science.gov (United States)

    Muelder, Chris; Gygi, Francois; Ma, Kwan-Liu

    2009-01-01

    In serial computation, program profiling is often helpful for optimization of key sections of code. When moving to parallel computation, not only does the code execution need to be considered but also communication between the different processes which can induce delays that are detrimental to performance. As the number of processes increases, so does the impact of the communication delays on performance. For large-scale parallel applications, it is critical to understand how the communication impacts performance in order to make the code more efficient. There are several tools available for visualizing program execution and communications on parallel systems. These tools generally provide either views which statistically summarize the entire program execution or process-centric views. However, process-centric visualizations do not scale well as the number of processes gets very large. In particular, the most common representation of parallel processes is a Gantt chart with a row for each process. As the number of processes increases, these charts can become difficult to work with and can even exceed screen resolution. We propose a new visualization approach that affords more scalability and then demonstrate it on systems running with up to 16,384 processes.
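One way to realize the scalability goal described, as an alternative to per-process Gantt rows, is to aggregate communication events into a fixed-size process-by-time matrix that can be drawn as a heatmap regardless of process count. The event tuples below are invented for illustration and are not from the paper's trace data.

```python
import numpy as np

def comm_heatmap(events, n_procs, t_max, proc_bins=4, time_bins=4):
    """Count (process_id, timestamp) events per (process bin, time bin) cell."""
    grid = np.zeros((proc_bins, time_bins), dtype=int)
    for pid, t in events:
        i = min(pid * proc_bins // n_procs, proc_bins - 1)
        j = min(int(t / t_max * time_bins), time_bins - 1)
        grid[i, j] += 1
    return grid

# Toy trace: (process id, timestamp) pairs from an 8-process run.
events = [(0, 0.1), (1, 0.2), (7, 0.9), (5, 0.5), (6, 0.95)]
print(comm_heatmap(events, n_procs=8, t_max=1.0))
```

The matrix size is fixed by the chosen bins, so the rendering cost no longer grows with the number of processes.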

  16. CImbinator: a web-based tool for drug synergy analysis in small- and large-scale datasets.

    Science.gov (United States)

    Flobak, Åsmund; Vazquez, Miguel; Lægreid, Astrid; Valencia, Alfonso

    2017-08-01

    Drug synergies are sought in order to identify combinations of drugs that are particularly beneficial. User-friendly software solutions that can assist analysis of large-scale datasets are required. CImbinator is a web service that can aid in batch-wise and in-depth analyses of data from small-scale and large-scale drug combination screens. CImbinator can quantify drug combination effects using both the commonly employed median-effect equation and advanced experimental mathematical models describing dose-response relationships. CImbinator is written in Ruby and R. It uses the R package drc for advanced drug response modeling. CImbinator is available at http://cimbinator.bioinfo.cnio.es , the source code is open and available at https://github.com/Rbbt-Workflows/combination_index . A Docker image is also available at https://hub.docker.com/r/mikisvaz/rbbt-ci_mbinator/ . Contact: asmund.flobak@ntnu.no or miguel.vazquez@cnio.es. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
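The median-effect machinery underlying this kind of analysis (Chou-Talalay) can be sketched in a few lines: solve the median-effect equation for the dose giving a target effect, then sum the dose fractions of the combined drugs to get a combination index. The doses and parameters below are invented illustrative values, not CImbinator's code.

```python
def dose_for_effect(fa, Dm, m):
    """Median-effect equation fa/fu = (d/Dm)^m, solved for the dose d
    giving affected fraction fa (Dm: median-effect dose, m: slope)."""
    return Dm * (fa / (1.0 - fa)) ** (1.0 / m)

def combination_index(d1, d2, fa, Dm1, m1, Dm2, m2):
    """CI = d1/Dx1 + d2/Dx2. CI < 1: synergy, = 1: additivity, > 1: antagonism."""
    return (d1 / dose_for_effect(fa, Dm1, m1)
            + d2 / dose_for_effect(fa, Dm2, m2))

# Each drug alone needs 1.0 dose units for a 50% effect (Dm = 1, m = 1);
# reaching 50% with only 0.3 + 0.3 of the two drugs gives CI = 0.6 (synergy).
print(combination_index(0.3, 0.3, fa=0.5, Dm1=1.0, m1=1.0, Dm2=1.0, m2=1.0))  # → 0.6
```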

  17. Large scale model testing

    International Nuclear Information System (INIS)

    Brumovsky, M.; Filip, R.; Polachova, H.; Stepanek, S.

    1989-01-01

    Fracture mechanics and fatigue calculations for WWER reactor pressure vessels were verified by large-scale model testing performed on the large ZZ 8000 testing machine (maximum load 80 MN) at the SKODA WORKS. Results are reported from testing the resistance of base materials and welded joints to non-ductile fracture. The rated specimen thickness was 150 mm, with defects of a depth between 15 and 100 mm. Results are also presented for nozzles of 850 mm inner diameter tested at a scale of 1:3; static, cyclic, and dynamic tests were performed with and without surface defects (15, 30 and 45 mm deep). During cyclic tests the crack growth rate in the elastic-plastic region was also determined. (author). 6 figs., 2 tabs., 5 refs

  18. Optimization of Phenotyping Assays for the Model Monocot Setaria viridis.

    Science.gov (United States)

    Acharya, Biswa R; Roy Choudhury, Swarup; Estelle, Aiden B; Vijayakumar, Anitha; Zhu, Chuanmei; Hovis, Laryssa; Pandey, Sona

    2017-01-01

    Setaria viridis (green foxtail) is an important model plant for the study of C4 photosynthesis in panicoid grasses, and is fast emerging as a system of choice for the study of plant development, domestication, abiotic stress responses and evolution. Basic research findings in Setaria are expected to advance research not only in this species and its close relative S. italica (foxtail millet), but also in other panicoid grasses, many of which are important food or bioenergy crops. Here we report on the standardization of multiple growth and development assays for S. viridis under controlled conditions, and in response to several phytohormones and abiotic stresses. We optimized these assays at three different stages of the plant's life: seed germination and post-germination growth using agar plate-based assays, early seedling growth and development using germination pouch-based assays, and adult plant growth and development under environmentally controlled growth chambers and greenhouses. These assays will be useful for the community to perform large scale phenotyping analyses, mutant screens, comparative physiological analysis, and functional characterization of novel genes of Setaria or other related agricultural crops. Precise description of various growth conditions, effective treatment conditions and description of the resultant phenotypes will help expand the use of S. viridis as an effective model system.

  19. Optimization of Phenotyping Assays for the Model Monocot Setaria viridis

    Directory of Open Access Journals (Sweden)

    Biswa R. Acharya

    2017-12-01

    Full Text Available Setaria viridis (green foxtail) is an important model plant for the study of C4 photosynthesis in panicoid grasses, and is fast emerging as a system of choice for the study of plant development, domestication, abiotic stress responses and evolution. Basic research findings in Setaria are expected to advance research not only in this species and its close relative S. italica (foxtail millet), but also in other panicoid grasses, many of which are important food or bioenergy crops. Here we report on the standardization of multiple growth and development assays for S. viridis under controlled conditions, and in response to several phytohormones and abiotic stresses. We optimized these assays at three different stages of the plant’s life: seed germination and post-germination growth using agar plate-based assays, early seedling growth and development using germination pouch-based assays, and adult plant growth and development under environmentally controlled growth chambers and greenhouses. These assays will be useful for the community to perform large scale phenotyping analyses, mutant screens, comparative physiological analysis, and functional characterization of novel genes of Setaria or other related agricultural crops. Precise description of various growth conditions, effective treatment conditions and description of the resultant phenotypes will help expand the use of S. viridis as an effective model system.

  20. Why small-scale cannabis growers stay small: five mechanisms that prevent small-scale growers from going large scale.

    Science.gov (United States)

    Hammersvik, Eirik; Sandberg, Sveinung; Pedersen, Willy

    2012-11-01

    Over the past 15-20 years, domestic cultivation of cannabis has been established in a number of European countries. New techniques have made such cultivation easier; however, the bulk of growers remain small-scale. In this study, we explore the factors that prevent small-scale growers from increasing their production. The study is based on 1 year of ethnographic fieldwork and qualitative interviews conducted with 45 Norwegian cannabis growers, 10 of whom were growing on a large scale and 35 on a small scale. The study identifies five mechanisms that prevent small-scale indoor growers from going large-scale. First, large-scale operations involve a number of people, large sums of money, a high workload and a high risk of detection, and thus demand a higher level of organizational skills than small growing operations. Second, financial assets are needed to start a large 'grow-site'. Housing rent, electricity, equipment and nutrients are expensive. Third, to be able to sell large quantities of cannabis, growers need access to an illegal distribution network and knowledge of how to act according to black market norms and structures. Fourth, large-scale operations require advanced horticultural skills to maximize yield and quality, which demands greater skills and knowledge than does small-scale cultivation. Fifth, small-scale growers are often embedded in the 'cannabis culture', which emphasizes anti-commercialism, anti-violence and ecological and community values. Hence, starting up large-scale production implies having to renegotiate or abandon these values. Going from small- to large-scale cannabis production is a demanding task, ideologically, technically, economically and personally. The many obstacles that small-scale growers face and their lack of interest and motivation for going large-scale suggest that the risk of a 'slippery slope' from small-scale to large-scale growing is limited. Possible political implications of the findings are discussed.

  1. Distributed large-scale dimensional metrology new insights

    CERN Document Server

    Franceschini, Fiorenzo; Maisano, Domenico

    2011-01-01

    Focuses on the latest insights into, and challenges of, distributed large-scale dimensional metrology. Enables practitioners to study distributed large-scale dimensional metrology independently. Includes specific examples of the development of new system prototypes.

  2. A multi-scale convolutional neural network for phenotyping high-content cellular images.

    Science.gov (United States)

    Godinez, William J; Hossain, Imtiaz; Lazic, Stanley E; Davies, John W; Zhang, Xian

    2017-07-01

    Identifying phenotypes based on high-content cellular images is challenging. Conventional image analysis pipelines for phenotype identification comprise multiple independent steps, with each step requiring method customization and adjustment of multiple parameters. Here, we present an approach based on a multi-scale convolutional neural network (M-CNN) that classifies, in a single cohesive step, cellular images into phenotypes by using directly and solely the images' pixel intensity values. The only parameters in the approach are the weights of the neural network, which are automatically optimized based on training images. The approach requires no a priori knowledge or manual customization, and is applicable to single- or multi-channel images displaying single or multiple cells. We evaluated the classification performance of the approach on eight diverse benchmark datasets. The approach yielded overall a higher classification accuracy compared with state-of-the-art results, including those of other deep CNN architectures. In addition to using the network to obtain a simple yes-or-no prediction for a given phenotype, we use the probability outputs calculated by the network to quantitatively describe the phenotypes. This study shows that these probability values correlate with chemical treatment concentrations. This finding further validates our approach and enables chemical treatment potency estimation via CNNs. The network specifications and solver definitions are provided in Supplementary Software 1. Contact: william_jose.godinez_navarro@novartis.com or xian-1.zhang@novartis.com. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
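
The core idea of a multi-scale CNN, feeding the same image to parallel convolutional paths at several resolutions, can be illustrated by the input pyramid alone. A sketch with assumed helper names (the actual M-CNN learns fused features end-to-end; this only shows the multi-scale input representation):

```python
import numpy as np

def avg_pool(img, k):
    """Downsample a 2D image by a factor k using non-overlapping average pooling."""
    h, w = img.shape
    h2, w2 = h // k, w // k
    return img[:h2 * k, :w2 * k].reshape(h2, k, w2, k).mean(axis=(1, 3))

def image_pyramid(img, factors=(1, 2, 4, 8)):
    """One downsampled copy of a cellular image per parallel convolutional path."""
    return [avg_pool(img, f) if f > 1 else img for f in factors]
```

Average pooling preserves mean pixel intensity, so coarser scales summarize whole-cell context while the finest scale retains subcellular texture.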

  3. Hydroxychloroquine screening practice patterns within a large multispecialty ophthalmic practice.

    Science.gov (United States)

    Au, Adrian; Parikh, Vishal; Modi, Yasha S; Ehlers, Justis P; Schachat, Andrew P; Singh, Rishi P

    2015-09-01

    To determine provider compliance with hydroxychloroquine screening following the revised recommendations published in 2011 by the American Academy of Ophthalmology. Evaluation of adherence to a screening protocol. Subjects were identified with hydroxychloroquine as a medication by electronic query at a large multispecialty ophthalmic practice. Patients were excluded if they: (1) were screened by an outside physician; (2) lacked recorded height, weight, start date, or dosing; or (3) took hydroxychloroquine for malaria prophylaxis. Screening tests were stratified by ophthalmic subspecialty. Guidelines define proper screening as one subjective test (Humphrey visual field, HVF) plus one objective test (spectral-domain optical coherence tomography, SD OCT; fundus autofluorescence, FAF; or multifocal electroretinography, mfERG). Adherence to guidelines was determined by categorizing practices as: (1) "appropriate", consistent with guidelines; (2) "underscreened", insufficient testing; or (3) "inappropriate", no testing. The study comprised 756 patients with a mean age of 56 years undergoing 1294 screening visits. Twenty-one patients received initial screenings outside the institution. The most common screening tests employed included SD OCT (56.6%), 10-2 HVF (55.0%), and Amsler grid (40.0%). Of the 735 initial screenings, 341 (46.4%) were appropriately screened, 204 (27.8%) underscreened, and 190 (25.9%) inappropriately screened. Of those who presented solely for screening (560), 307 (54.8%) were appropriately screened, 144 (25.7%) underscreened, and 109 (19.5%) inappropriately screened. Of patients presenting for hydroxychloroquine screening, 54.8% received appropriate evaluation, indicating lack of adherence to guidelines. Overall, SD OCT and 10-2 HVF were the preferred screening modalities, with FAF and mfERG less frequently ordered. Copyright © 2015 Elsevier Inc. All rights reserved.
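
The adequacy rule applied in this study, one subjective test plus one objective test, is simple to express in code. A sketch with hypothetical test labels (the study's own categorization was done on chart-review data, not with this code):

```python
SUBJECTIVE = {"HVF"}                     # 10-2 Humphrey visual field
OBJECTIVE = {"SD-OCT", "FAF", "mfERG"}   # guideline-approved objective tests

def classify_screening(tests):
    """Classify a screening visit per the 2011 AAO-style rule:
    'appropriate' needs >=1 subjective AND >=1 objective test;
    any other non-empty set is 'underscreened'; none is 'inappropriate'."""
    tests = set(tests)
    has_subjective = bool(tests & SUBJECTIVE)
    has_objective = bool(tests & OBJECTIVE)
    if has_subjective and has_objective:
        return "appropriate"
    if tests:
        return "underscreened"
    return "inappropriate"
```

Note that a visit with only an Amsler grid counts as underscreened under this rule, since the Amsler grid is neither of the required test types.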

  4. SCALE INTERACTION IN A MIXING LAYER. THE ROLE OF THE LARGE-SCALE GRADIENTS

    KAUST Repository

    Fiscaletti, Daniele; Attili, Antonio; Bisetti, Fabrizio; Elsinga, Gerrit E.

    2015-01-01

    ...from physical considerations we would expect the scales to interact in a qualitatively similar way within the flow and across different turbulent flows. Therefore, instead of the large-scale fluctuations, the large-scale gradients modulation of the small scales has been additionally investigated.

  5. Prostate Cancer Screening Results from PLCO

    Science.gov (United States)

    Learn the results of the Prostate, Lung, Colorectal, and Ovarian (PLCO) Cancer Screening Trial, a large-scale clinical trial to determine whether certain cancer screening tests can help reduce deaths from prostate, lung, colorectal, and ovarian cancer.

  6. Not a load of rubbish: simulated field trials in large-scale containers.

    Science.gov (United States)

    Hohmann, M; Stahl, A; Rudloff, J; Wittkop, B; Snowdon, R J

    2016-09-01

    Assessment of yield performance under fluctuating environmental conditions is a major aim of crop breeders. Unfortunately, results from controlled-environment evaluations of complex agronomic traits rarely translate to field performance. A major cause is that crops grown over their complete lifecycle in a greenhouse or growth chamber are generally constricted in their root growth, which influences their response to important abiotic constraints like water or nutrient availability. To overcome this poor transferability, we established a plant growth system comprising large refuse containers (120 L 'wheelie bins') that allow detailed phenotyping of small field-crop populations under semi-controlled growth conditions. Diverse winter oilseed rape cultivars were grown at field densities throughout the crop lifecycle, in different experiments over 2 years, to compare seed yields from individual containers to plot yields from multi-environment field trials. We found that we were able to predict yields in the field with high accuracy from container-grown plants. The container system proved suitable for detailed studies of stress response physiology and performance in pre-breeding populations. Investment in automated large-container systems may help breeders improve field transferability of greenhouse experiments, enabling screening of pre-breeding materials for abiotic stress response traits with a positive influence on yield. © 2016 John Wiley & Sons Ltd.

  7. Trends in large-scale testing of reactor structures

    International Nuclear Information System (INIS)

    Blejwas, T.E.

    2003-01-01

    Large-scale tests of reactor structures have been conducted at Sandia National Laboratories since the late 1970s. This paper describes a number of different large-scale impact tests, pressurization tests of models of containment structures, and thermal-pressure tests of models of reactor pressure vessels. The advantages of large-scale testing are evident, but cost, in particular, limits its use. As computer models have grown in size (e.g., in numbers of degrees of freedom), the advent of computer graphics has made possible very realistic representations of results, results that may not accurately represent reality. A necessary condition for avoiding this pitfall is the validation of the analytical methods and underlying physical representations. Ironically, the immensely larger computer models sometimes increase the need for large-scale testing, because the modeling is applied to increasingly complex structural systems and/or more complex physical phenomena. Unfortunately, the cost of large-scale tests is a disadvantage that will likely severely limit similar testing in the future. International collaborations may provide the best mechanism for funding future programs with large-scale tests. (author)

  8. Chinese version of the Postpartum Depression Screening Scale: translation and validation.

    Science.gov (United States)

    Li, Lezhi; Liu, Fang; Zhang, Huilin; Wang, Li; Chen, Xiaofang

    2011-01-01

    Postpartum depression is an important public health problem in China. Although 10%-20% of Chinese women having recently given birth are affected by postpartum depression, only 10% receive treatment due to the lack of proper screening. The aims of this study were to translate the Postpartum Depression Screening Scale into Chinese (C-PDSS) and establish the psychometric properties of the C-PDSS. The study was undertaken in three phases: forward and backward translation of the Postpartum Depression Screening Scale into Chinese, examination of content validity, and field testing to establish the reliability, validity, and optimal cutoff score of the C-PDSS along with its sensitivity, specificity, and predictive values. A total sample of 387 mothers within 12 weeks postpartum participated in the study. Each mother was asked to complete the C-PDSS and the Chinese version of the Edinburgh Postnatal Depression Scale and then was interviewed by an experienced researcher using the Diagnostic and Statistical Manual of Mental Disorders, 4th Edition, Text Revision. The Cronbach's alpha coefficient was .96 for the total C-PDSS, and the overall intraclass correlation was .79. Factor analysis of the scale revealed that it was composed of 7 factors with eigenvalues >1, accounting for 74.25% of the total variance. There was a significantly positive correlation between the C-PDSS and the Chinese version of the Edinburgh Postnatal Depression Scale (r = .66). Future studies should include confirmatory factor analysis and generalization of the C-PDSS to a different sample in China.
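
The reported Cronbach's alpha of .96 is a standard internal-consistency statistic. For reference, it can be computed from an item-score matrix as follows (a generic textbook formula, not the authors' code):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)
```

Perfectly correlated items yield alpha = 1; values near .96, as for the C-PDSS, indicate very high internal consistency.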

  9. The Bergen Shopping Addiction Scale: reliability and validity of a brief screening test

    Science.gov (United States)

    Andreassen, Cecilie S.; Griffiths, Mark D.; Pallesen, Ståle; Bilder, Robert M.; Torsheim, Torbjørn; Aboujaoude, Elias

    2015-01-01

    Although excessive and compulsive shopping has been increasingly placed within the behavioral addiction paradigm in recent years, items in existing screens arguably do not assess the core criteria and components of addiction. To date, assessment screens for shopping disorders have primarily been rooted within the impulse-control or obsessive-compulsive disorder paradigms. Furthermore, existing screens use the terms ‘shopping,’ ‘buying,’ and ‘spending’ interchangeably, and do not necessarily reflect contemporary shopping habits. Consequently, a new screening tool for assessing shopping addiction was developed. Initially, 28 items, four for each of seven addiction criteria (salience, mood modification, conflict, tolerance, withdrawal, relapse, and problems), were constructed. These items and validated scales (i.e., Compulsive Buying Measurement Scale, Mini-International Personality Item Pool, Hospital Anxiety and Depression Scale, Rosenberg Self-Esteem Scale) were then administered to 23,537 participants (Mage = 35.8 years, SDage = 13.3). The highest loading item from each set of four pooled items reflecting the seven addiction criteria were retained in the final scale, The Bergen Shopping Addiction Scale (BSAS). The factor structure of the BSAS was good (RMSEA = 0.064, CFI = 0.983, TLI = 0.973) and coefficient alpha was 0.87. The scores on the BSAS converged with scores on the Compulsive Buying Measurement Scale (CBMS; 0.80), and were positively correlated with extroversion and neuroticism, and negatively with conscientiousness, agreeableness, and intellect/imagination. The scores of the BSAS were positively associated with anxiety, depression, and low self-esteem and inversely related to age. Females scored higher than males on the BSAS. The BSAS is the first scale to fully embed shopping addiction within an addiction paradigm. A recommended cutoff score for the new scale and future research directions are discussed. PMID:26441749

  10. The Bergen Shopping Addiction Scale: Reliability and validity of a brief screening test

    Directory of Open Access Journals (Sweden)

    Cecilie Schou Andreassen

    2015-09-01

    Full Text Available Although excessive and compulsive shopping has been increasingly placed within the behavioral addiction paradigm in recent years, items in existing screens arguably do not assess the core criteria and components of addiction. To date, assessment screens for shopping disorders have primarily been rooted within the impulse-control or obsessive-compulsive disorder paradigms. Furthermore, existing screens use the terms ‘shopping’, ‘buying’, and ‘spending’ interchangeably, and do not necessarily reflect contemporary shopping habits. Consequently, a new screening tool for assessing shopping addiction was developed. Initially, 28 items, four for each of seven addiction criteria (salience, mood modification, conflict, tolerance, withdrawal, relapse, and problems), were constructed. These items and validated scales (i.e., Compulsive Buying Measurement Scale, Mini-International Personality Item Pool, Hospital Anxiety and Depression Scale, Rosenberg Self-Esteem Scale) were then administered to 23,537 participants (Mage=35.8 years, SDage=13.3). The highest loading item from each set of four pooled items reflecting the seven addiction criteria were retained in the final scale, The Bergen Shopping Addiction Scale (BSAS). The factor structure of the BSAS was good (RMSEA=.064, CFI=.983, TLI=.973) and coefficient alpha was .87. The scores on the BSAS converged with scores on the Compulsive Buying Measurement Scale (.80), and were positively correlated with extroversion and neuroticism, and negatively with conscientiousness, agreeableness, and intellect/imagination. The scores of the BSAS were positively associated with anxiety, depression, and low self-esteem and inversely related to age. Females scored higher than males on the BSAS. The BSAS is the first scale to fully embed shopping addiction within an addiction paradigm. A recommended cutoff score for the new scale and future research directions are discussed.

  11. The Bergen Shopping Addiction Scale: reliability and validity of a brief screening test.

    Science.gov (United States)

    Andreassen, Cecilie S; Griffiths, Mark D; Pallesen, Ståle; Bilder, Robert M; Torsheim, Torbjørn; Aboujaoude, Elias

    2015-01-01

    Although excessive and compulsive shopping has been increasingly placed within the behavioral addiction paradigm in recent years, items in existing screens arguably do not assess the core criteria and components of addiction. To date, assessment screens for shopping disorders have primarily been rooted within the impulse-control or obsessive-compulsive disorder paradigms. Furthermore, existing screens use the terms 'shopping,' 'buying,' and 'spending' interchangeably, and do not necessarily reflect contemporary shopping habits. Consequently, a new screening tool for assessing shopping addiction was developed. Initially, 28 items, four for each of seven addiction criteria (salience, mood modification, conflict, tolerance, withdrawal, relapse, and problems), were constructed. These items and validated scales (i.e., Compulsive Buying Measurement Scale, Mini-International Personality Item Pool, Hospital Anxiety and Depression Scale, Rosenberg Self-Esteem Scale) were then administered to 23,537 participants (M age = 35.8 years, SD age = 13.3). The highest loading item from each set of four pooled items reflecting the seven addiction criteria were retained in the final scale, The Bergen Shopping Addiction Scale (BSAS). The factor structure of the BSAS was good (RMSEA = 0.064, CFI = 0.983, TLI = 0.973) and coefficient alpha was 0.87. The scores on the BSAS converged with scores on the Compulsive Buying Measurement Scale (CBMS; 0.80), and were positively correlated with extroversion and neuroticism, and negatively with conscientiousness, agreeableness, and intellect/imagination. The scores of the BSAS were positively associated with anxiety, depression, and low self-esteem and inversely related to age. Females scored higher than males on the BSAS. The BSAS is the first scale to fully embed shopping addiction within an addiction paradigm. A recommended cutoff score for the new scale and future research directions are discussed.

  12. Simulator comparison of thumball, thumb switch, and touch screen input concepts for interaction with a large screen cockpit display format

    Science.gov (United States)

    Jones, Denise R.; Parrish, Russell V.

    1990-01-01

    A piloted simulation study was conducted comparing three different input methods for interfacing to a large screen, multiwindow, whole flight deck display for management of transport aircraft systems. The thumball concept utilized a miniature trackball embedded in a conventional side arm controller. The multifunction control throttle and stick (MCTAS) concept employed a thumb switch located in the throttle handle. The touch screen concept provided data entry through a capacitive touch screen installed on the display surface. The objective and subjective results obtained indicate that, with present implementations, the thumball concept was the most appropriate for interfacing with aircraft systems/subsystems presented on a large screen display. Not unexpectedly, the completion time differences between the three concepts varied with the task being performed, although the thumball implementation consistently outperformed the other two concepts. However, pilot suggestions for improved implementations of the MCTAS and touch screen concepts could reduce some of these differences.

  13. The validity of military screening for mental health problems: diagnostic accuracy of the PCL, K10 and AUDIT scales in an entire military population.

    Science.gov (United States)

    Searle, Amelia K; Van Hooff, Miranda; McFarlane, Alexander C; Davies, Christopher E; Fairweather-Schmidt, A Kate; Hodson, Stephanie E; Benassi, Helen; Steele, Nicole

    2015-03-01

    Depression, alcohol use disorders and post-traumatic stress disorder (PTSD) are serious issues among military personnel due to their impact on operational capability and individual well-being. Several military forces screen for these disorders using scales including the Kessler Psychological Distress Scale (K10), Alcohol Use Disorders Identification Test (AUDIT), and Post-traumatic Stress Disorder Checklist (PCL). However, it is unknown whether established cutoffs apply to military populations. This study is the first to test the diagnostic accuracy of these three scales in a population-based military cohort. A large sample of currently-serving Australian Defence Force (ADF) Navy, Army and Air Force personnel (n = 24,481) completed the K10, AUDIT and PCL-C (civilian version). Then, a stratified sub-sample (n = 1798) completed a structured diagnostic interview detecting 30-day disorder. Data were weighted to represent the ADF population (n = 50,049). Receiver operating characteristic (ROC) analyses suggested all three scales had acceptable sensitivity and specificity, with areas under the curve from 0.75 to 0.93. AUDIT and K10 screening cutoffs closely paralleled established cutoffs, whereas the PCL-C screening cutoff resembled that recommended for US military personnel. These self-report scales represent a cost-effective and clinically-useful means of screening personnel for disorder. Military populations may need lower cutoffs than civilians to screen for PTSD. Copyright © 2014 John Wiley & Sons, Ltd.
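
The ROC analysis used in studies like this one, an area under the curve plus a screening cutoff with acceptable sensitivity and specificity, can be sketched directly from the definitions. Illustrative code only (the study used weighted survey data and established statistical software); the cutoff criterion below is Youden's J, a common but not universal choice:

```python
def roc_auc(scores_pos, scores_neg):
    """AUC via the Mann-Whitney U statistic: the probability that a
    randomly chosen case scores higher than a random non-case
    (ties count half)."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

def best_cutoff(scores_pos, scores_neg):
    """Screening cutoff maximizing Youden's J = sensitivity + specificity - 1."""
    best, best_j = None, -1.0
    for c in sorted(set(scores_pos + scores_neg)):
        sensitivity = sum(p >= c for p in scores_pos) / len(scores_pos)
        specificity = sum(n < c for n in scores_neg) / len(scores_neg)
        j = sensitivity + specificity - 1
        if j > best_j:
            best, best_j = c, j
    return best
```

An AUC in the 0.75-0.93 range, as reported for the K10, AUDIT and PCL, means a randomly chosen case outranks a randomly chosen non-case 75-93% of the time.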

  14. Large-scale analysis of acute ethanol exposure in zebrafish development: a critical time window and resilience.

    Directory of Open Access Journals (Sweden)

    Shaukat Ali

    Full Text Available BACKGROUND: In humans, ethanol exposure during pregnancy causes a spectrum of developmental defects (fetal alcohol syndrome, or FAS). Individuals vary in phenotypic expression. Zebrafish embryos develop FAS-like features after ethanol exposure. In this study, we ask whether stage-specific effects of ethanol can be identified in the zebrafish, and if so, whether they allow the pinpointing of sensitive developmental mechanisms. We have therefore conducted the first large-scale (>1500 embryos) analysis of acute, stage-specific drug effects on zebrafish development, with a large panel of readouts. METHODOLOGY/PRINCIPAL FINDINGS: Zebrafish embryos were raised in 96-well plates. Range-finding indicated that 10% ethanol for 1 h was suitable for an acute exposure regime. High-resolution magic-angle spinning proton magnetic resonance spectroscopy showed that this produced a transient pulse of 0.86% concentration of ethanol in the embryo within the chorion. Survivors at 5 days postfertilisation were analysed. Phenotypes ranged from normal (resilient) to severely malformed. Ethanol exposure at early stages caused high mortality (≥88%). At later stages of exposure, mortality declined and malformations developed. Pharyngeal arch hypoplasia and behavioral impairment were most common after prim-6 and prim-16 exposure. By contrast, microphthalmia and growth retardation were stage-independent. CONCLUSIONS: Our findings show that some ethanol effects are strongly stage-dependent. The phenotypes mimic key aspects of FAS including craniofacial abnormality, microphthalmia, growth retardation and behavioral impairment. We also identify a critical time window (prim-6 and prim-16) for ethanol sensitivity. Finally, our identification of a wide phenotypic spectrum is reminiscent of human FAS, and may provide a useful model for studying disease resilience.

  15. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.

    Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.

  16. Phenotypic Progression of Stargardt Disease in a Large Consanguineous Tunisian Family Harboring New ABCA4 Mutations

    Directory of Open Access Journals (Sweden)

    Yousra Falfoul

    2018-01-01

    Full Text Available To assess the progression of Stargardt (STGD) disease over nine years in two branches of a large consanguineous Tunisian family. Initially, different phenotypes were observed with clinical intra- and interfamilial variations. At presentation, four different retinal phenotypes were observed. In phenotype 1, bull’s eye maculopathy and slight alteration of photopic responses in full-field electroretinography were observed in the youngest child. In phenotype 2, macular atrophy and yellow-white flecks were observed in two brothers. In phenotype 3, diffuse macular, peripapillary, and peripheral RPE atrophy and hyperfluorescent dots were observed in two sisters. In phenotype 4, the Stargardt disease-fundus flavimaculatus phenotype was observed in two cousins with a later age of onset. After a progression of 9 years, all seven patients displayed the same phenotype 3 with advanced-stage STGD and diffuse atrophy. WES and MLPA identified two ABCA4 mutations, M1: c.[(?_4635)_(5714+?)dup; (?_6148)_(6479+?)del] and M2: c.[2041C>T], p.[R681∗]. In one branch, the three affected patients had M1/M1 causal mutations and in the other branch the two affected patients had M1/M2 causal mutations. After 9-year follow-up, all patients showed the same phenotypic evolution, confirming the progressive nature of the disease. Genetic variation between the two branches made no difference to the similar end-stage disease.

  17. Large-Scale 3D Printing: The Way Forward

    Science.gov (United States)

    Jassmi, Hamad Al; Najjar, Fady Al; Ismail Mourad, Abdel-Hamid

    2018-03-01

    Research on small-scale 3D printing has rapidly evolved, and numerous industrial products have been tested and successfully applied. Nonetheless, research on large-scale 3D printing, directed to large-scale applications such as construction and automotive manufacturing, still demands a great deal of effort. Large-scale 3D printing is an interdisciplinary topic and requires establishing a blended knowledge base from numerous research fields including structural engineering, materials science, mechatronics, software engineering, artificial intelligence and architectural engineering. This review article summarizes key topics of relevance to new research trends in large-scale 3D printing, particularly pertaining to (1) technological solutions for additive construction (i.e. the 3D printers themselves), (2) materials science challenges, and (3) new design opportunities.

  18. Growth Limits in Large Scale Networks

    DEFF Research Database (Denmark)

    Knudsen, Thomas Phillip

    The subject of large scale networks is approached from the perspective of the network planner. An analysis of the long term planning problems is presented with the main focus on the changing requirements for large scale networks and the potential problems in meeting these requirements. The problems...... the fundamental technological resources in network technologies are analysed for scalability. Here several technological limits to continued growth are presented. The third step involves a survey of major problems in managing large scale networks given the growth of user requirements and the technological limitations. The rising complexity of network management with the convergence of communications platforms is shown as problematic for both automatic management feasibility and for manpower resource management. In the fourth step the scope is extended to include the present society with the DDN project as its......

  19. Accelerating sustainability in large-scale facilities

    CERN Multimedia

    Marina Giampietro

    2011-01-01

    Scientific research centres and large-scale facilities are intrinsically energy intensive, but how can big science improve its energy management and eventually contribute to the environmental cause with new cleantech? CERN’s commitment to providing tangible answers to these questions was sealed in the first workshop on energy management for large scale scientific infrastructures, held in Lund, Sweden, on 13-14 October.   Participants at the energy management for large scale scientific infrastructures workshop. The workshop, co-organised with the European Spallation Source (ESS) and the European Association of National Research Facilities (ERF), tackled a recognised need for addressing energy issues in relation to science and technology policies. It brought together more than 150 representatives of Research Infrastructures (RIs) and energy experts from Europe and North America. “Without compromising our scientific projects, we can ...

  20. Visual coherence for large-scale line-plot visualizations

    KAUST Repository

    Muigg, Philipp

    2011-06-01

    Displaying a large number of lines within a limited amount of screen space is a task that is common to many different classes of visualization techniques such as time-series visualizations, parallel coordinates, link-node diagrams, and phase-space diagrams. This paper addresses the challenging problems of cluttering and overdraw inherent to such visualizations. We generate a 2x2 tensor field during line rasterization that encodes the distribution of line orientations through each image pixel. Anisotropic diffusion of a noise texture is then used to generate a dense, coherent visualization of line orientation. In order to represent features of different scales, we employ a multi-resolution representation of the tensor field. The resulting technique can easily be applied to a wide variety of line-based visualizations. We demonstrate this for parallel coordinates, a time-series visualization, and a phase-space diagram. Furthermore, we demonstrate how to integrate a focus+context approach by incorporating a second tensor field. Our approach achieves interactive rendering performance for large data sets containing millions of data items, due to its image-based nature and ease of implementation on GPUs. Simulation results from computational fluid dynamics are used to evaluate the performance and usefulness of the proposed method. © 2011 The Author(s).
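The record's core mechanism (accumulating a 2x2 orientation tensor per pixel during line rasterization, then reading off the dominant line direction from its principal eigenvector) can be sketched in a few lines of NumPy. This is an illustrative CPU reduction, not the paper's GPU implementation; the function names and point-sampling rasterizer are invented here.

```python
import numpy as np

def accumulate_orientation_tensors(segments, width, height, samples=64):
    """Accumulate a per-pixel 2x2 orientation tensor T += d d^T for every
    line segment passing through that pixel (simplified sketch of the
    rasterization-time accumulation)."""
    T = np.zeros((height, width, 2, 2))
    for (x0, y0), (x1, y1) in segments:
        d = np.array([x1 - x0, y1 - y0], dtype=float)
        n = np.linalg.norm(d)
        if n == 0:
            continue
        d /= n
        outer = np.outer(d, d)
        # sample points along the segment and splat the tensor into pixels
        for t in np.linspace(0.0, 1.0, samples):
            px = int(round(x0 + t * (x1 - x0)))
            py = int(round(y0 + t * (y1 - y0)))
            if 0 <= px < width and 0 <= py < height:
                T[py, px] += outer
    return T

def dominant_orientation(T, x, y):
    """Principal eigenvector of the accumulated tensor at pixel (x, y),
    i.e. the locally dominant line orientation."""
    w, v = np.linalg.eigh(T[y, x])
    return v[:, np.argmax(w)]
```

The eigenvector field produced this way is what the anisotropic diffusion of the noise texture would then follow.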


  2. Large scale reflood test

    International Nuclear Information System (INIS)

    Hirano, Kemmei; Murao, Yoshio

    1980-01-01

    The large-scale reflood test with a view to ensuring the safety of light water reactors was started in fiscal 1976 based on the special account act for power source development promotion measures by the entrustment from the Science and Technology Agency. Thereafter, to establish the safety of PWRs in loss-of-coolant accidents by joint international efforts, the Japan-West Germany-U.S. research cooperation program was started in April, 1980. Thereupon, the large-scale reflood test is now included in this program. It consists of two tests using a cylindrical core testing apparatus for examining the overall system effect and a plate core testing apparatus for testing individual effects. Each apparatus is composed of the mock-ups of pressure vessel, primary loop, containment vessel and ECCS. The testing method, the test results and the research cooperation program are described. (J.P.N.)

  3. Cell and small animal models for phenotypic drug discovery

    Directory of Open Access Journals (Sweden)

    Szabo M

    2017-06-01

    Mihaly Szabo,1 Sara Svensson Akusjärvi,1 Ankur Saxena,1 Jianping Liu,2 Gayathri Chandrasekar,1 Satish S Kitambi1 1Department of Microbiology, Tumor and Cell Biology, 2Department of Biochemistry and Biophysics, Karolinska Institutet, Solna, Sweden Abstract: The phenotype-based drug discovery (PDD) approach is re-emerging as an alternative platform for drug discovery. This review provides an overview of the various model systems and technical advances in imaging and image analysis that strengthen the PDD platform. In PDD screens, compounds of therapeutic value are identified based on the phenotypic perturbations produced, irrespective of target(s) or mechanism of action. In this article, examples of phenotypic changes that can be detected and quantified with relative ease in a cell-based setup are discussed. In addition, a higher order of PDD screening setup using small animal models is also explored. As PDD screens integrate physiology and multiple signaling mechanisms during the screening process, the identified hits have higher biomedical applicability. Taken together, this review highlights the advantages gained by adopting a PDD approach in drug discovery. Such a PDD platform can complement target-based systems currently in practice to accelerate drug discovery. Keywords: phenotype, screening, PDD, discovery, zebrafish, drug

  4. Large Scale Cosmological Anomalies and Inhomogeneous Dark Energy

    Directory of Open Access Journals (Sweden)

    Leandros Perivolaropoulos

    2014-01-01

    A wide range of large scale observations hint towards possible modifications of the standard cosmological model, which is based on a homogeneous and isotropic universe with a small cosmological constant and matter. These observations, also known as “cosmic anomalies”, include unexpected Cosmic Microwave Background perturbations on large angular scales, large dipolar peculiar velocity flows of galaxies (“bulk flows”), the measurement of inhomogeneous values of the fine structure constant on cosmological scales (“alpha dipole”) and other effects. The presence of the observational anomalies could either be a large statistical fluctuation in the context of ΛCDM or it could indicate a non-trivial departure from the cosmological principle on Hubble scales. Such a departure is very much constrained by cosmological observations for matter. For dark energy, however, there are no significant observational constraints for Hubble scale inhomogeneities. In this brief review I discuss some of the theoretical models that can naturally lead to inhomogeneous dark energy, their observational constraints and their potential to explain the large scale cosmic anomalies.

  5. Large-scale patterns in Rayleigh-Benard convection

    International Nuclear Information System (INIS)

    Hardenberg, J. von; Parodi, A.; Passoni, G.; Provenzale, A.; Spiegel, E.A.

    2008-01-01

    Rayleigh-Benard convection at large Rayleigh number is characterized by the presence of intense, vertically moving plumes. Both laboratory and numerical experiments reveal that the rising and descending plumes aggregate into separate clusters so as to produce large-scale updrafts and downdrafts. The horizontal scales of the aggregates reported so far have been comparable to the horizontal extent of the containers, but it has not been clear whether that represents a limitation imposed by domain size. In this work, we present numerical simulations of convection at sufficiently large aspect ratio to ascertain whether there is an intrinsic saturation scale for the clustering process when that ratio is large enough. From a series of simulations of Rayleigh-Benard convection with Rayleigh numbers between 10^5 and 10^8 and with aspect ratios up to 12π, we conclude that the clustering process has a finite horizontal saturation scale with at most a weak dependence on Rayleigh number in the range studied.

  6. web cellHTS2: A web-application for the analysis of high-throughput screening data

    Directory of Open Access Journals (Sweden)

    Boutros Michael

    2010-04-01

    Background: The analysis of high-throughput screening data sets is an expanding field in bioinformatics. High-throughput screens by RNAi generate large primary data sets which need to be analyzed and annotated to identify relevant phenotypic hits. Large-scale RNAi screens are frequently used to identify novel factors that influence a broad range of cellular processes, including signaling pathway activity, cell proliferation, and host cell infection. Here, we present a web-based application for the end-to-end analysis of large cell-based screening experiments by cellHTS2. Results: The software guides the user through the configuration steps required for the analysis of single- or multi-channel experiments. The web-application provides options for various standardization and normalization methods, annotation of data sets and a comprehensive HTML report of the screening data analysis, including a ranked hit list. Sessions can be saved and restored for later re-analysis. The web frontend for the cellHTS2 R/Bioconductor package interacts with it through an R-server implementation that enables highly parallel analysis of screening data sets. web cellHTS2 further provides a file import and configuration module for common file formats. Conclusions: The implemented web-application facilitates the analysis of high-throughput data sets and provides a user-friendly interface. web cellHTS2 is accessible online at http://web-cellHTS2.dkfz.de. A standalone version as a virtual appliance and source code for platforms supporting Java 1.5.0 can be downloaded from the web cellHTS2 page. web cellHTS2 is freely distributed under GPL.
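The plate-wise normalization step that packages like cellHTS2 offer can be illustrated with a minimal sketch: divide each well by its plate median so plates become comparable, then score wells with a robust z-score. This is an independent toy implementation in Python for illustration, not cellHTS2 code.

```python
import numpy as np

def normalize_plates(raw):
    """Per-plate median normalization followed by robust z-scoring, a
    common standardization scheme in cell-based screening pipelines.
    'raw' has shape (plates, wells)."""
    raw = np.asarray(raw, dtype=float)
    plate_med = np.median(raw, axis=1, keepdims=True)
    normed = raw / plate_med                 # remove plate-to-plate offsets
    # robust z-score: centre by median, scale by MAD (x1.4826 for normality)
    med = np.median(normed)
    mad = 1.4826 * np.median(np.abs(normed - med))
    return (normed - med) / mad
```

Wells with large absolute z-scores after this step are the candidates for a ranked hit list.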

  7. Automated microscopy for high-content RNAi screening

    Science.gov (United States)

    2010-01-01

    Fluorescence microscopy is one of the most powerful tools to investigate complex cellular processes such as cell division, cell motility, or intracellular trafficking. The availability of RNA interference (RNAi) technology and automated microscopy has opened the possibility to perform cellular imaging in functional genomics and other large-scale applications. Although imaging often dramatically increases the content of a screening assay, it poses new challenges to achieve accurate quantitative annotation and therefore needs to be carefully adjusted to the specific needs of individual screening applications. In this review, we discuss principles of assay design, large-scale RNAi, microscope automation, and computational data analysis. We highlight strategies for imaging-based RNAi screening adapted to different library and assay designs. PMID:20176920

  8. Robust Microplate-Based Methods for Culturing and in Vivo Phenotypic Screening of Chlamydomonas reinhardtii

    Directory of Open Access Journals (Sweden)

    Timothy C. Haire

    2018-03-01

    Chlamydomonas reinhardtii (Cr), a unicellular alga, is routinely utilized to study photosynthetic biochemistry, ciliary motility, and cellular reproduction. Its minimal culture requirements, unicellular morphology, and ease of transformation have made it a popular model system. Despite its relatively slow doubling time compared with many bacteria, it is an ideal eukaryotic system for microplate-based studies utilizing absorbance and/or fluorescence assays. Such microplate assays are powerful tools for researchers in the areas of toxicology, pharmacology, chemical genetics, biotechnology, and more. However, while microplate-based assays are valuable tools for screening biological systems, these methodologies can significantly alter the conditions in which the organisms are cultured and their subsequent physiology or morphology. Herein we describe a novel method for the microplate culture and in vivo phenotypic analysis of growth, viability, and photosynthetic pigments of C. reinhardtii. We evaluated the utility of our assay by screening silver nanoparticles for their effects on growth and viability. These methods are amenable to a wide assortment of studies and present a significant advancement in the methodologies available for research involving this model organism.
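A typical reduction of microplate growth readings like those described in this record is a log-linear fit of background-corrected optical density, which yields the specific growth rate. The sketch below is a generic illustration; the function name and blank-correction convention are assumptions, not taken from the paper.

```python
import numpy as np

def growth_rate(times_h, od, blank=0.0):
    """Estimate the specific growth rate (per hour) from microplate
    optical-density readings by a log-linear fit; 'blank' is the
    medium-only background to subtract."""
    od = np.asarray(od, dtype=float) - blank
    times = np.asarray(times_h, dtype=float)
    mask = od > 0                     # log is only defined for positive OD
    slope, _ = np.polyfit(times[mask], np.log(od[mask]), 1)
    return slope                      # mu in 1/h; doubling time = ln(2)/mu
```

Comparing mu across wells (e.g. nanoparticle-treated vs control) gives a simple per-well growth phenotype.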

  9. Manufacturing test of large scale hollow capsule and long length cladding in the large scale oxide dispersion strengthened (ODS) martensitic steel

    International Nuclear Information System (INIS)

    Narita, Takeshi; Ukai, Shigeharu; Kaito, Takeji; Ohtsuka, Satoshi; Fujiwara, Masayuki

    2004-04-01

    Mass production capability of oxide dispersion strengthened (ODS) martensitic steel cladding (9Cr) has been evaluated in Phase II of the Feasibility Studies on Commercialized Fast Reactor Cycle System. The cost of manufacturing the mother tube (raw material powder production, mechanical alloying (MA) by ball mill, canning, hot extrusion, and machining) is a dominant factor in the total cost of manufacturing ODS ferritic steel cladding. In this study, a large-scale 9Cr-ODS martensitic steel mother tube, made with a large-scale hollow capsule, and long length claddings were manufactured, and the applicability of these processes was evaluated. The following results were obtained. (1) Manufacturing of a large scale mother tube with dimensions of 32 mm OD, 21 mm ID, and 2 m length was successfully carried out using a large scale hollow capsule. This mother tube has a high degree of dimensional accuracy. (2) The chemical composition and microstructure of the manufactured mother tube are similar to those of the existing mother tube manufactured with a small scale can, and no remarkable difference between the bottom and top ends of the manufactured mother tube has been observed. (3) The long length cladding was successfully manufactured from the large scale mother tube made using a large scale hollow capsule. (4) For reducing the manufacturing cost of ODS steel claddings, manufacturing mother tubes using large scale hollow capsules is promising. (author)

  10. Amplification of large-scale magnetic field in nonhelical magnetohydrodynamics

    KAUST Repository

    Kumar, Rohit

    2017-08-11

    It is typically assumed that the kinetic and magnetic helicities play a crucial role in the growth of a large-scale dynamo. In this paper, we demonstrate that helicity is not essential for the amplification of a large-scale magnetic field. For this purpose, we perform a nonhelical magnetohydrodynamic (MHD) simulation and show that the large-scale magnetic field can grow in nonhelical MHD when random external forcing is employed at a scale of 1/10 the box size. The energy fluxes and shell-to-shell transfer rates computed from the numerical data show that the large-scale magnetic energy grows due to energy transfers from the velocity field at the forcing scales.
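Shell-binned spectra of the kind used in this record to track large-scale magnetic-energy growth can be computed directly from an FFT of the field. The sketch below bins Fourier energy into integer-wavenumber shells for a 2-D periodic field; it is a simplified stand-in for the 3-D MHD diagnostics in the paper, and the function name is invented here.

```python
import numpy as np

def shell_spectrum(field):
    """Isotropic shell-binned energy spectrum E(k) of a square, periodic
    2-D field: Fourier energy is summed over integer-wavenumber shells
    [k, k+1). Growth at small k signals large-scale amplification."""
    n = field.shape[0]
    fk = np.fft.fftn(field) / field.size          # normalized coefficients
    e = 0.5 * np.abs(fk) ** 2                     # spectral energy density
    kx = np.fft.fftfreq(n, d=1.0 / n)             # integer wavenumbers
    kmag = np.sqrt(kx[:, None] ** 2 + kx[None, :] ** 2)
    kbins = np.arange(0.5, n // 2 + 1)            # shell centres 0.5, 1.5, ...
    spec = np.zeros(len(kbins))
    for i, kc in enumerate(kbins):
        mask = (kmag >= kc - 0.5) & (kmag < kc + 0.5)
        spec[i] = e[mask].sum()
    return spec
```

Tracking spec[k] for small k over time is the usual way to diagnose large-scale field growth.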

  11. Reply: Comparison of slope instability screening tools following a large storm event and application to forest management and policy

    Science.gov (United States)

    Whittaker, Kara A.; McShane, Dan

    2013-02-01

    A large storm event in southwest Washington State triggered over 2500 landslides and provided an opportunity to assess two slope stability screening tools. The statistical analysis conducted demonstrated that both screening tools are effective at predicting where landslides were likely to take place (Whittaker and McShane, 2012). Here we reply to two discussions of this article related to the development of the slope stability screening tools and the accuracy and scale of the spatial data used. Neither of the discussions address our statistical analysis or results. We provide greater detail on our sampling criteria and also elaborate on the policy and management implications of our findings and how they complement those of a separate investigation of landslides resulting from the same storm. The conclusions made in Whittaker and McShane (2012) stand as originally published unless future analysis indicates otherwise.

  12. Hydrometeorological variability on a large french catchment and its relation to large-scale circulation across temporal scales

    Science.gov (United States)

    Massei, Nicolas; Dieppois, Bastien; Fritier, Nicolas; Laignel, Benoit; Debret, Maxime; Lavers, David; Hannah, David

    2015-04-01

    In the present context of global changes, considerable efforts have been deployed by the hydrological scientific community to improve our understanding of the impacts of climate fluctuations on water resources. Both observational and modeling studies have been extensively employed to characterize hydrological changes and trends, assess the impact of climate variability or provide future scenarios of water resources. With the aim of better understanding hydrological changes, it is of crucial importance to determine how and to what extent trends and long-term oscillations detectable in hydrological variables are linked to global climate oscillations. In this work, we develop an approach associating large-scale/local-scale correlation, empirical statistical downscaling and wavelet multiresolution decomposition of monthly precipitation and streamflow over the Seine river watershed, and the North Atlantic sea level pressure (SLP), in order to gain additional insights into the atmospheric patterns associated with the regional hydrology. We hypothesized that: i) atmospheric patterns may change according to the different temporal wavelengths defining the variability of the signals; and ii) definition of those hydrological/circulation relationships for each temporal wavelength may improve the determination of large-scale predictors of local variations. The results showed that the large-scale/local-scale links were not necessarily constant according to time-scale (i.e. for the different frequencies characterizing the signals), resulting in changing spatial patterns across scales. This was then taken into account by developing an empirical statistical downscaling (ESD) modeling approach which integrated discrete wavelet multiresolution analysis for reconstructing local hydrometeorological processes (predictand: precipitation and streamflow on the Seine river catchment) based on a large-scale predictor (SLP over the Euro-Atlantic sector) on a monthly time-step. This approach ...
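The scale-wise modeling idea in this record (decompose predictor and predictand into temporal components, fit one statistical link per scale, then sum the scale-wise predictions) can be sketched as follows. For simplicity this uses an additive moving-average multiresolution rather than the paper's discrete wavelet transform, and all names are invented for illustration.

```python
import numpy as np

def mra(x, levels=3):
    """Additive multiresolution decomposition: successive moving-average
    smooths of dyadic width; details are differences between levels, so
    the components sum back exactly to the signal."""
    comps, approx = [], np.asarray(x, dtype=float)
    for j in range(levels):
        w = 2 ** (j + 1)
        kernel = np.ones(w) / w
        smooth = np.convolve(approx, kernel, mode="same")
        comps.append(approx - smooth)   # detail at scale ~w samples
        approx = smooth
    comps.append(approx)                # coarsest approximation
    return comps

def scalewise_downscale(predictor, predictand, levels=3):
    """Fit one least-squares linear model per scale component and sum
    the scale-wise predictions (sketch of the wavelet-based ESD idea)."""
    xs, ys = mra(predictor, levels), mra(predictand, levels)
    yhat = np.zeros(len(predictand))
    for xj, yj in zip(xs, ys):
        a, b = np.polyfit(xj, yj, 1)
        yhat += a * xj + b
    return yhat
```

In the paper's setting the predictor would be an SLP index and the predictand monthly precipitation or streamflow, with a (possibly multivariate) model per scale instead of the single slope used here.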

  13. Superconducting materials for large scale applications

    International Nuclear Information System (INIS)

    Dew-Hughes, D.

    1975-01-01

    Applications of superconductors capable of carrying large current densities in large-scale electrical devices are examined. Discussions are included on critical current density, superconducting materials available, and future prospects for improved superconducting materials. (JRD)

  14. Large-scale influences in near-wall turbulence.

    Science.gov (United States)

    Hutchins, Nicholas; Marusic, Ivan

    2007-03-15

    Hot-wire data acquired in a high Reynolds number facility are used to illustrate the need for adequate scale separation when considering the coherent structure in wall-bounded turbulence. It is found that a large-scale motion in the log region becomes increasingly comparable in energy to the near-wall cycle as the Reynolds number increases. Through decomposition of fluctuating velocity signals, it is shown that this large-scale motion has a distinct modulating influence on the small-scale energy (akin to amplitude modulation). Reassessment of DNS data, in light of these results, shows similar trends, with the rate and intensity of production due to the near-wall cycle subject to a modulating influence from the largest-scale motions.
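The amplitude-modulation diagnostic mentioned in this record is commonly computed as the correlation between the large-scale signal and the envelope of the small-scale signal. Below is a minimal NumPy sketch, assuming the two components have already been obtained by filtering; the Hilbert transform is implemented via the FFT, and the function names are invented here.

```python
import numpy as np

def envelope(x):
    """Amplitude envelope via the analytic signal (FFT-based Hilbert
    transform implemented with NumPy)."""
    x = np.asarray(x, dtype=float)
    N = x.size
    X = np.fft.fft(x)
    h = np.zeros(N)                 # multiplier that zeroes negative freqs
    h[0] = 1.0
    if N % 2 == 0:
        h[N // 2] = 1.0
        h[1:N // 2] = 2.0
    else:
        h[1:(N + 1) // 2] = 2.0
    return np.abs(np.fft.ifft(X * h))

def modulation_coefficient(large_scale, small_scale):
    """Correlation between the large-scale signal and the envelope of the
    small-scale signal: the amplitude-modulation diagnostic."""
    env = envelope(small_scale)
    env = env - env.mean()
    ls = large_scale - large_scale.mean()
    return float(np.corrcoef(ls, env)[0, 1])
```

A coefficient near one indicates the small-scale energy is strongly modulated by the large-scale motion, as reported for the log-region superstructures.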

  15. PHENOstruct: Prediction of human phenotype ontology terms using heterogeneous data sources.

    Science.gov (United States)

    Kahanda, Indika; Funk, Christopher; Verspoor, Karin; Ben-Hur, Asa

    2015-01-01

    The human phenotype ontology (HPO) was recently developed as a standardized vocabulary for describing the phenotypic abnormalities associated with human diseases. At present, only a small fraction of human protein coding genes have HPO annotations, but researchers believe that a large portion of currently unannotated genes are related to disease phenotypes. Therefore, it is important to predict gene-HPO term associations using accurate computational methods. In this work we demonstrate the performance advantage of the structured SVM approach, which has been shown to be highly effective for Gene Ontology term prediction, in comparison to several baseline methods. Furthermore, we highlight a collection of informative data sources suitable for the problem of predicting gene-HPO associations, including large scale literature mining data.

  16. Optimization of a Fluorescence-Based Assay for Large-Scale Drug Screening against Babesia and Theileria Parasites

    OpenAIRE

    Rizk, Mohamed Abdo; El-Sayed, Shimaa Abd El-Salam; Terkawi, Mohamed Alaa; Youssef, Mohamed Ahmed; El Said, El Said El Shirbini; Elsayed, Gehad; El-Khodery, Sabry; El-Ashker, Maged; Elsify, Ahmed; Omar, Mosaab; Salama, Akram; Yokoyama, Naoaki; Igarashi, Ikuo

    2015-01-01

    A rapid and accurate assay for evaluating antibabesial drugs on a large scale is required for the discovery of novel chemotherapeutic agents against Babesia parasites. In the current study, we evaluated the usefulness of a fluorescence-based assay for determining the efficacies of antibabesial compounds against bovine and equine hemoparasites in in vitro cultures. Three different hematocrits (HCTs; 2.5%, 5%, and 10%) were used without daily replacement of the medium. The results of a high-thr...
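Fluorescence growth assays of this kind are usually reduced to per cent inhibition relative to controls, from which an IC50 is interpolated. The sketch below shows one common reduction (log-linear interpolation between the doses bracketing 50% inhibition); the exact readout handling in the paper may differ, and both function names are invented here.

```python
import math

def percent_inhibition(signal, neg_ctrl, pos_ctrl):
    """Per cent growth inhibition from raw fluorescence, scaled between
    the untreated (negative) and fully-inhibited (positive) controls."""
    return 100.0 * (neg_ctrl - signal) / (neg_ctrl - pos_ctrl)

def ic50(doses, inhib):
    """IC50 by log-linear interpolation between the two doses bracketing
    50% inhibition (assumes 'inhib' increases with dose)."""
    for i in range(1, len(doses)):
        if inhib[i - 1] < 50.0 <= inhib[i]:
            f = (50.0 - inhib[i - 1]) / (inhib[i] - inhib[i - 1])
            lo, hi = math.log10(doses[i - 1]), math.log10(doses[i])
            return 10 ** (lo + f * (hi - lo))
    raise ValueError("50% inhibition not bracketed by the dose range")
```

In practice a four-parameter logistic fit is often preferred over interpolation, but the bracketing version is adequate for ranking compounds in a large screen.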

  17. PKI security in large-scale healthcare networks.

    Science.gov (United States)

    Mantas, Georgios; Lymberopoulos, Dimitrios; Komninos, Nikos

    2012-06-01

    During the past few years, many PKIs (public key infrastructures) have been proposed for healthcare networks in order to ensure secure communication services and exchange of data among healthcare professionals. However, these healthcare PKIs face a plethora of challenges, especially when deployed over large-scale healthcare networks. In this paper, we propose a PKI to ensure security in a large-scale Internet-based healthcare network connecting a wide spectrum of healthcare units geographically distributed within a wide region. Furthermore, the proposed PKI addresses the trust issues that arise in a large-scale healthcare network including multi-domain PKIs.

  18. Emerging large-scale solar heating applications

    International Nuclear Information System (INIS)

    Wong, W.P.; McClung, J.L.

    2009-01-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)


  20. Large-scale regions of antimatter

    International Nuclear Information System (INIS)

    Grobov, A. V.; Rubin, S. G.

    2015-01-01

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.


  2. Large-Scale Analysis of Art Proportions

    DEFF Research Database (Denmark)

    Jensen, Karl Kristoffer

    2014-01-01

    While literature often tries to impute mathematical constants into art, this large-scale study (11 databases of paintings and photos, around 200,000 items) shows a different truth. The analysis, consisting of the width/height proportions, shows a value of rarely if ever one (square) and with majo...

  3. The Expanded Large Scale Gap Test

    Science.gov (United States)

    1987-03-01

    NSWC TR 86-32: The Expanded Large Scale Gap Test, by T. P. Liddiard and D. Price, Research and Technology Department, March 1987. Approved for public release. The remainder of the scanned record is garbled OCR of the report documentation page; legible fragments concern reducing the spread in the LSGT 50% gap value, with the worst charges being those with the highest or lowest densities or the largest re-pressed ...

  4. Large scale and big data processing and management

    CERN Document Server

    Sakr, Sherif

    2014-01-01

    Large Scale and Big Data: Processing and Management provides readers with a central source of reference on the data management techniques currently available for large-scale data processing. Presenting chapters written by leading researchers, academics, and practitioners, it addresses the fundamental challenges associated with Big Data processing tools and techniques across a range of computing environments.The book begins by discussing the basic concepts and tools of large-scale Big Data processing and cloud computing. It also provides an overview of different programming models and cloud-bas

  5. Reconfiguring Cooperative Work by Visualizing EPR on Large Projected Screens

    DEFF Research Database (Denmark)

    Simonsen, Jesper

    Simonsen, J. (2006): Reconfiguring Cooperative Work by Visualizing EPR on Large Projected Screens, Paper presented at the PDC 2006 workshop on: Reconfiguring Healthcare: Issues in Computer Supported Cooperative Work in Healthcare Environments. Participatory Design Conference, Trento, Italy, August...

  6. Large scale cluster computing workshop

    International Nuclear Information System (INIS)

    Dane Skow; Alan Silverman

    2002-01-01

    Recent revolutions in computer hardware and software technologies have paved the way for the large-scale deployment of clusters of commodity computers to address problems heretofore the domain of tightly coupled SMP processors. Near term projects within High Energy Physics and other computing communities will deploy clusters with 1000s of processors, used by 100s to 1000s of independent users. This will expand the reach in both dimensions by an order of magnitude from the current successful production facilities. The goals of this workshop were: (1) to determine what tools exist which can scale up to the cluster sizes foreseen for the next generation of HENP experiments (several thousand nodes) and by implication to identify areas where some investment of money or effort is likely to be needed. (2) To compare and record experiences gained with such tools. (3) To produce a practical guide to all stages of planning, installing, building and operating a large computing cluster in HENP. (4) To identify and connect groups with similar interest within HENP and the larger clustering community

  7. Large-scale testing of women in Copenhagen has not reduced the prevalence of Chlamydia trachomatis infections

    DEFF Research Database (Denmark)

    Westh, Henrik Torkil; Kolmos, H J

    2003-01-01

    OBJECTIVE: To examine the impact of a stable, large-scale enzyme immunoassay (EIA) Chlamydia trachomatis testing situation in Copenhagen, and to estimate the impact of introducing a genomic-based assay with higher sensitivity and specificity. METHODS: Over a five-year study period, 25 305-28 505...... and negative predictive values of the Chlamydia test result, new screening strategies for both men and women in younger age groups will be necessary if chlamydial infections are to be curtailed....
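The dependence of positive and negative predictive values on sensitivity, specificity and prevalence, which underlies the screening-strategy conclusion of this record, follows directly from Bayes' rule; a minimal sketch:

```python
def ppv_npv(sensitivity, specificity, prevalence):
    """Positive and negative predictive values of a diagnostic test from
    its operating characteristics and the population prevalence."""
    p = prevalence
    ppv = sensitivity * p / (sensitivity * p + (1 - specificity) * (1 - p))
    npv = (specificity * (1 - p)
           / (specificity * (1 - p) + (1 - sensitivity) * p))
    return ppv, npv
```

For example, at 5% prevalence even a test with 95% sensitivity and 95% specificity yields a PPV of only 0.5, which is why falling prevalence forces a rethink of screening strategies.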

  8. Large-Scale Agriculture and Outgrower Schemes in Ethiopia

    DEFF Research Database (Denmark)

    Wendimu, Mengistu Assefa

    The impact of large-scale agriculture and outgrower schemes on productivity, household welfare and wages in developing countries is highly contentious. Chapter 1 of this thesis provides an introduction to the study, while also reviewing the key debate in the contemporary land ‘grabbing’ and historical large...... sugarcane outgrower scheme on household income and asset stocks. Chapter 5 examines the wages and working conditions in ‘formal’ large-scale and ‘informal’ small-scale irrigated agriculture. The results in Chapter 2 show that moisture stress, the use of untested planting materials, and conflict over land...... commands a higher wage than ‘formal’ large-scale agriculture, while rather different wage determination mechanisms exist in the two sectors. Human capital characteristics (education and experience) partly explain the differences in wages within the formal sector, but play no significant role...

  9. Economically viable large-scale hydrogen liquefaction

    Science.gov (United States)

    Cardella, U.; Decker, L.; Klein, H.

    2017-02-01

    The liquid hydrogen demand, particularly driven by clean energy applications, will rise in the near future. As industrial large scale liquefiers will play a major role within the hydrogen supply chain, production capacity will have to increase by a multiple of today’s typical sizes. The main goal is to reduce the total cost of ownership for these plants by increasing energy efficiency with innovative and simple process designs, optimized in capital expenditure. New concepts must ensure a manageable plant complexity and flexible operability. In the phase of process development and selection, a dimensioning of key equipment for large scale liquefiers, such as turbines and compressors as well as heat exchangers, must be performed iteratively to ensure technological feasibility and maturity. Further critical aspects related to hydrogen liquefaction, e.g. fluid properties, ortho-para hydrogen conversion, and coldbox configuration, must be analysed in detail. This paper provides an overview on the approach, challenges and preliminary results in the development of efficient as well as economically viable concepts for large-scale hydrogen liquefaction.

  10. The World Health Organization Adult Attention-Deficit/Hyperactivity Disorder Self-Report Screening Scale for DSM-5.

    Science.gov (United States)

    Ustun, Berk; Adler, Lenard A; Rudin, Cynthia; Faraone, Stephen V; Spencer, Thomas J; Berglund, Patricia; Gruber, Michael J; Kessler, Ronald C

    2017-05-01

    Recognition that adult attention-deficit/hyperactivity disorder (ADHD) is common, seriously impairing, and usually undiagnosed has led to the development of adult ADHD screening scales for use in community, workplace, and primary care settings. However, these scales are all calibrated to DSM-IV criteria, which are narrower than the recently developed DSM-5 criteria. To update for DSM-5 criteria and improve the operating characteristics of the widely used World Health Organization Adult ADHD Self-Report Scale (ASRS) for screening. Probability subsamples of participants in 2 general population surveys (2001-2003 household survey [n = 119] and 2004-2005 managed care subscriber survey [n = 218]) who completed the full 29-question self-report ASRS, with both subsamples over-sampling ASRS-screened positives, were blindly administered a semistructured research diagnostic interview for DSM-5 adult ADHD. In 2016, the Risk-Calibrated Supersparse Linear Integer Model, a novel machine-learning algorithm designed to create screening scales with optimal integer weights and limited numbers of screening questions, was applied to the pooled data to create a DSM-5 version of the ASRS screening scale. The accuracy of the new scale was then confirmed in an independent 2011-2012 clinical sample of patients seeking evaluation at the New York University Langone Medical Center Adult ADHD Program (NYU Langone) and 2015-2016 primary care controls (n = 300). Data analysis was conducted from April 4, 2016, to September 22, 2016. The sensitivity, specificity, area under the curve (AUC), and positive predictive value (PPV) of the revised ASRS. Of the total 637 participants, 44 (37.0%) household survey respondents, 51 (23.4%) managed care respondents, and 173 (57.7%) NYU Langone respondents met DSM-5 criteria for adult ADHD in the semistructured diagnostic interview. Of the respondents who met DSM-5 criteria for adult ADHD, 123 were male (45.9%); mean (SD) age was 33.1 (11.4) years
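
    The screening approach described above relies on a scale with a small number of questions and optimal integer item weights, scored against a cut-point and evaluated by sensitivity, specificity, and PPV. A minimal sketch of that scoring scheme follows; the weights, threshold, and responses below are invented for illustration and are not the actual ASRS-5 items or the Risk-Calibrated Supersparse Linear Integer Model.

```python
import numpy as np

def screen_score(responses, weights, threshold):
    """Score a screening scale with integer item weights.

    responses: (n_subjects, n_items) array of item answers coded 0/1.
    weights:   one integer weight per item (illustrative values only).
    Returns a boolean screen-positive flag per subject."""
    scores = np.asarray(responses) @ np.asarray(weights)
    return scores >= threshold

def operating_characteristics(flags, truth):
    """Sensitivity, specificity and PPV of a screen against a reference diagnosis."""
    flags, truth = np.asarray(flags, bool), np.asarray(truth, bool)
    tp = np.sum(flags & truth)    # screen-positive, truly positive
    tn = np.sum(~flags & ~truth)  # screen-negative, truly negative
    fp = np.sum(flags & ~truth)
    fn = np.sum(~flags & truth)
    return tp / (tp + fn), tn / (tn + fp), tp / (tp + fp)
```

    In practice the flags would be compared against a semistructured diagnostic interview, as in the study.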

  11. The World Health Organization Adult Attention-Deficit/Hyperactivity Disorder Self-Report Screening Scale for DSM-5

    Science.gov (United States)

    Ustun, Berk; Adler, Lenard A.; Rudin, Cynthia; Faraone, Stephen V.; Spencer, Thomas J.; Berglund, Patricia; Gruber, Michael J.

    2017-01-01

    Importance Recognition that adult attention-deficit/hyperactivity disorder (ADHD) is common, seriously impairing, and usually undiagnosed has led to the development of adult ADHD screening scales for use in community, workplace, and primary care settings. However, these scales are all calibrated to DSM-IV criteria, which are narrower than the recently developed DSM-5 criteria. Objectives To update for DSM-5 criteria and improve the operating characteristics of the widely used World Health Organization Adult ADHD Self-Report Scale (ASRS) for screening. Design, Setting, and Participants Probability subsamples of participants in 2 general population surveys (2001-2003 household survey [n = 119] and 2004-2005 managed care subscriber survey [n = 218]) who completed the full 29-question self-report ASRS, with both subsamples over-sampling ASRS-screened positives, were blindly administered a semistructured research diagnostic interview for DSM-5 adult ADHD. In 2016, the Risk-Calibrated Supersparse Linear Integer Model, a novel machine-learning algorithm designed to create screening scales with optimal integer weights and limited numbers of screening questions, was applied to the pooled data to create a DSM-5 version of the ASRS screening scale. The accuracy of the new scale was then confirmed in an independent 2011-2012 clinical sample of patients seeking evaluation at the New York University Langone Medical Center Adult ADHD Program (NYU Langone) and 2015-2016 primary care controls (n = 300). Data analysis was conducted from April 4, 2016, to September 22, 2016. Main Outcomes and Measures The sensitivity, specificity, area under the curve (AUC), and positive predictive value (PPV) of the revised ASRS. Results Of the total 637 participants, 44 (37.0%) household survey respondents, 51 (23.4%) managed care respondents, and 173 (57.7%) NYU Langone respondents met DSM-5 criteria for adult ADHD in the semistructured diagnostic interview. Of the respondents who met

  12. Large scale chromatographic separations using continuous displacement chromatography (CDC)

    International Nuclear Information System (INIS)

    Taniguchi, V.T.; Doty, A.W.; Byers, C.H.

    1988-01-01

    A process for large scale chromatographic separations using a continuous chromatography technique is described. The process combines the advantages of large scale batch fixed column displacement chromatography with conventional analytical or elution continuous annular chromatography (CAC) to enable large scale displacement chromatography to be performed on a continuous basis (CDC). Such large scale, continuous displacement chromatography separations have not been reported in the literature. The process is demonstrated with the ion exchange separation of a binary lanthanide (Nd/Pr) mixture. The process is, however, applicable to any displacement chromatography separation that can be performed using conventional batch, fixed column chromatography

  13. Screening wells by multi-scale grids for multi-stage Markov Chain Monte Carlo simulation

    DEFF Research Database (Denmark)

    Akbari, Hani; Engsig-Karup, Allan Peter

    2018-01-01

    /production wells, aiming at accurate breakthrough capturing as well as above mentioned efficiency goals. However this short time simulation needs fine-scale structure of the geological model around wells and running a fine-scale model is not as cheap as necessary for screening steps. On the other hand applying...... it on a coarse-scale model declines important data around wells and causes inaccurate results, particularly accurate breakthrough capturing which is important for prediction applications. Therefore we propose a multi-scale grid which preserves the fine-scale model around wells (as well as high permeable regions...... and fractures) and coarsens rest of the field and keeps efficiency and accuracy for the screening well stage and coarse-scale simulation, as well. A discrete wavelet transform is used as a powerful tool to generate the desired unstructured multi-scale grid efficiently. Finally an accepted proposal on coarse...
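
    The wavelet idea behind this kind of grid coarsening can be sketched in one dimension: large Haar detail coefficients flag regions (wells, fractures, high-permeability streaks) where the fine grid must be kept. This is a minimal single-level illustration, not the authors' actual multi-scale grid generator.

```python
import numpy as np

def haar_detail(field):
    """One level of a 1-D Haar transform: returns (approximation, detail).
    Large |detail| marks cells where the field varies sharply."""
    field = np.asarray(field, float)
    a = (field[0::2] + field[1::2]) / 2.0
    d = (field[0::2] - field[1::2]) / 2.0
    return a, d

def refine_mask(field, tol):
    """True where the fine-scale grid should be preserved."""
    _, d = haar_detail(field)
    keep = np.abs(d) > tol
    # each detail coefficient covers a pair of fine cells
    return np.repeat(keep, 2)
```

    A cell pair with a sharp permeability contrast produces a large detail coefficient and is retained at fine resolution; smooth regions can be merged into coarse cells.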

  14. Large Scale Processes and Extreme Floods in Brazil

    Science.gov (United States)

    Ribeiro Lima, C. H.; AghaKouchak, A.; Lall, U.

    2016-12-01

    Persistent large-scale anomalies in the atmospheric circulation and ocean state have been associated with heavy rainfall and extreme floods in water basins of different sizes across the world. Such studies have emerged in recent years as a new tool to improve the traditional, stationarity-based approach to flood frequency analysis and flood prediction. Here we seek to advance previous studies by evaluating the dominance of large-scale processes (e.g. atmospheric rivers/moisture transport) over local processes (e.g. local convection) in producing floods. We consider flood-prone regions in Brazil as case studies, and the role of large-scale climate processes in generating extreme floods in these regions is explored by means of observed streamflow, reanalysis data and machine learning methods. The dynamics of the large-scale atmospheric circulation in the days prior to the flood events are evaluated based on the vertically integrated moisture flux and its divergence field, which are interpreted in a low-dimensional space obtained by machine learning techniques, particularly supervised kernel principal component analysis. In this reduced-dimensional space, clusters are obtained in order to better understand the role of regional moisture recycling or teleconnected moisture in producing floods of a given magnitude. The convective available potential energy (CAPE) is also used as a measure of local convective activity. We investigate for individual sites the exceedance probability at which large-scale atmospheric fluxes dominate the flood process. Finally, we analyze regional patterns of floods and how the scaling law of floods with drainage area responds to changes in the climate forcing mechanisms (e.g. local vs large-scale).
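
    The dimension-reduction step described above can be illustrated with a plain (unsupervised) RBF kernel PCA in NumPy; this is a simplified stand-in for the supervised variant used in the study, with invented toy data in place of moisture-flux fields.

```python
import numpy as np

def kernel_pca(X, n_components=2, gamma=1.0):
    """RBF kernel PCA: project samples (e.g. flattened moisture-flux
    fields) onto the leading kernel principal components."""
    X = np.asarray(X, float)
    sq = np.sum(X**2, axis=1)
    # RBF Gram matrix from pairwise squared distances
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    n = K.shape[0]
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one   # centre in feature space
    vals, vecs = np.linalg.eigh(Kc)              # ascending eigenvalues
    idx = np.argsort(vals)[::-1][:n_components]
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0))
```

    Clustering (e.g. k-means) would then be applied to the projected coordinates to group flood events by their dominant moisture-transport pattern.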

  15. Computing in Large-Scale Dynamic Systems

    NARCIS (Netherlands)

    Pruteanu, A.S.

    2013-01-01

    Software applications developed for large-scale systems have always been difficult to develop due to problems caused by the large number of computing devices involved. Above a certain network size (roughly one hundred), necessary services such as code updating, topology discovery and data

  16. Fires in large scale ventilation systems

    International Nuclear Information System (INIS)

    Gregory, W.S.; Martin, R.A.; White, B.W.; Nichols, B.D.; Smith, P.R.; Leslie, I.H.; Fenton, D.L.; Gunaji, M.V.; Blythe, J.P.

    1991-01-01

    This paper summarizes the experience gained simulating fires in large scale ventilation systems patterned after ventilation systems found in nuclear fuel cycle facilities. The series of experiments discussed included: (1) combustion aerosol loading of 0.61x0.61 m HEPA filters with the combustion products of two organic fuels, polystyrene and polymethylmethacrylate; (2) gas dynamic and heat transport through a large scale ventilation system consisting of a 0.61x0.61 m duct 90 m in length, with dampers, HEPA filters, blowers, etc.; (3) gas dynamic and simultaneous transport of heat and solid particulate (consisting of glass beads with a mean aerodynamic diameter of 10 μm) through the large scale ventilation system; and (4) the transport of heat and soot, generated by kerosene pool fires, through the large scale ventilation system. The FIRAC computer code, designed to predict fire-induced transients in nuclear fuel cycle facility ventilation systems, was used to predict the results of experiments (2) through (4). In general, the results of the predictions were satisfactory. The code predictions for the gas dynamics, heat transport, and particulate transport and deposition were within 10% of the experimentally measured values. However, the code was less successful in predicting the amount of soot generation from kerosene pool fires, probably due to the fire module of the code being a one-dimensional zone model. The experiments revealed a complicated three-dimensional combustion pattern within the fire room of the ventilation system. Further refinement of the fire module within FIRAC is needed. (orig.)

  17. Large-Scale Spacecraft Fire Safety Tests

    Science.gov (United States)

    Urban, David; Ruff, Gary A.; Ferkul, Paul V.; Olson, Sandra; Fernandez-Pello, A. Carlos; T'ien, James S.; Torero, Jose L.; Cowlard, Adam J.; Rouvreau, Sebastien; Minster, Olivier; et al.

    2014-01-01

    An international collaborative program is underway to address open issues in spacecraft fire safety. Because of limited access to long-term low-gravity conditions and the small volume generally allotted for these experiments, there have been relatively few experiments that directly study spacecraft fire safety under low-gravity conditions. Furthermore, none of these experiments have studied sample sizes and environment conditions typical of those expected in a spacecraft fire. The major constraint has been the size of the sample, with prior experiments limited to samples of the order of 10 cm in length and width or smaller. This lack of experimental data forces spacecraft designers to base their designs and safety precautions on 1-g understanding of flame spread, fire detection, and suppression. However, low-gravity combustion research has demonstrated substantial differences in flame behavior in low-gravity. This, combined with the differences caused by the confined spacecraft environment, necessitates practical-scale spacecraft fire safety research to mitigate risks for future space missions. To address this issue, a large-scale spacecraft fire experiment is under development by NASA and an international team of investigators. This poster presents the objectives, status, and concept of this collaborative international project (Saffire). The project plan is to conduct fire safety experiments on three sequential flights of an unmanned ISS re-supply spacecraft (the Orbital Cygnus vehicle) after they have completed their delivery of cargo to the ISS and have begun their return journeys to Earth. On two flights (Saffire-1 and Saffire-3), the experiment will consist of a flame spread test involving a meter-scale sample ignited in the pressurized volume of the spacecraft and allowed to burn to completion while measurements are made. On one of the flights (Saffire-2), 9 smaller (5 x 30 cm) samples will be tested to evaluate NASA's material flammability screening tests

  18. Machine learning and computer vision approaches for phenotypic profiling.

    Science.gov (United States)

    Grys, Ben T; Lo, Dara S; Sahin, Nil; Kraus, Oren Z; Morris, Quaid; Boone, Charles; Andrews, Brenda J

    2017-01-02

    With recent advances in high-throughput, automated microscopy, there has been an increased demand for effective computational strategies to analyze large-scale, image-based data. To this end, computer vision approaches have been applied to cell segmentation and feature extraction, whereas machine-learning approaches have been developed to aid in phenotypic classification and clustering of data acquired from biological images. Here, we provide an overview of the commonly used computer vision and machine-learning methods for generating and categorizing phenotypic profiles, highlighting the general biological utility of each approach. © 2017 Grys et al.
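
    The classification step surveyed in this review can be illustrated with a deliberately simple nearest-centroid classifier on precomputed single-cell feature vectors; the features and labels below are invented toy data, not any method from the review itself.

```python
import numpy as np

def nearest_centroid_fit(features, labels):
    """Fit one centroid per phenotype class from labelled
    single-cell feature vectors."""
    classes = sorted(set(labels))
    centroids = np.array([
        np.mean([f for f, l in zip(features, labels) if l == c], axis=0)
        for c in classes
    ])
    return classes, centroids

def nearest_centroid_predict(features, classes, centroids):
    """Assign each cell to the phenotype with the closest centroid."""
    d = np.linalg.norm(np.asarray(features, float)[:, None, :]
                       - centroids[None, :, :], axis=2)
    return [classes[i] for i in np.argmin(d, axis=1)]
```

    Real pipelines replace the centroid rule with stronger supervised models (SVMs, random forests, neural networks) operating on hundreds of morphological features per cell.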

  19. Large-scale Complex IT Systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2011-01-01

    This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that identifies the major challen...

  20. Large-scale complex IT systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2012-01-01

    This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that ident...

  1. First Mile Challenges for Large-Scale IoT

    KAUST Repository

    Bader, Ahmed; Elsawy, Hesham; Gharbieh, Mohammad; Alouini, Mohamed-Slim; Adinoyi, Abdulkareem; Alshaalan, Furaih

    2017-01-01

    The Internet of Things is large-scale by nature. This is not only manifested by the large number of connected devices, but also by the sheer scale of spatial traffic intensity that must be accommodated, primarily in the uplink direction. To that end

  2. An analysis of normalization methods for Drosophila RNAi genomic screens and development of a robust validation scheme

    Science.gov (United States)

    Wiles, Amy M.; Ravi, Dashnamoorthy; Bhavani, Selvaraj; Bishop, Alexander J.R.

    2010-01-01

    Genome-wide RNAi screening is a powerful, yet relatively immature technology that allows investigation into the role of individual genes in a process of choice. Most RNAi screens identify a large number of genes with a continuous gradient in the assessed phenotype. Screeners must then decide whether to examine just those genes with the most robust phenotype or to examine the full gradient of genes that cause an effect and how to identify the candidate genes to be validated. We have used RNAi in Drosophila cells to examine viability in a 384-well plate format and compare two screens, untreated control and treatment. We compare multiple normalization methods, which take advantage of different features within the data, including quantile normalization, background subtraction, scaling, cellHTS2, and interquartile range measurement. Considering the false-positive potential that arises from RNAi technology, a robust validation method was designed for the purpose of gene selection for future investigations. In a retrospective analysis, we describe the use of validation data to evaluate each normalization method. While no normalization method worked ideally, we found that a combination of two methods, background subtraction followed by quantile normalization and cellHTS2, at different thresholds, captures the most dependable and diverse candidate genes. Thresholds are suggested depending on whether a few candidate genes are desired or a more extensive systems level analysis is sought. In summary, our normalization approaches and experimental design to perform validation experiments are likely to apply to those high-throughput screening systems attempting to identify genes for systems level analysis. PMID:18753689
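
    Two of the normalization methods compared in this record, quantile normalization and background subtraction, can be sketched generically as follows; this is an illustrative NumPy implementation, not the authors' cellHTS2-based pipeline.

```python
import numpy as np

def quantile_normalize(plates):
    """Quantile-normalize screening plates (one row per plate, one
    column per well): each value is replaced by the mean of the values
    holding the same rank across all plates."""
    plates = np.asarray(plates, float)
    order = np.argsort(plates, axis=1)
    ranks = np.argsort(order, axis=1)          # rank of each well per plate
    mean_sorted = np.mean(np.sort(plates, axis=1), axis=0)
    return mean_sorted[ranks]

def background_subtract(plate, control_wells):
    """Subtract the median of negative-control wells from every well."""
    plate = np.asarray(plate, float)
    return plate - np.median(plate[control_wells])
```

    In the paper's scheme, background subtraction would be applied first, then quantile normalization, with hit thresholds chosen according to how many candidate genes are wanted.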

  3. Gene networks underlying convergent and pleiotropic phenotypes in a large and systematically-phenotyped cohort with heterogeneous developmental disorders.

    Directory of Open Access Journals (Sweden)

    Tallulah Andrews

    2015-03-01

    Readily-accessible and standardised capture of genotypic variation has revolutionised our understanding of the genetic contribution to disease. Unfortunately, the corresponding systematic capture of patient phenotypic variation needed to fully interpret the impact of genetic variation has lagged far behind. Exploiting deep and systematic phenotyping of a cohort of 197 patients presenting with heterogeneous developmental disorders and whose genomes harbour de novo CNVs, we systematically applied a range of commonly-used functional genomics approaches to identify the underlying molecular perturbations and their phenotypic impact. Grouping patients into 408 non-exclusive patient-phenotype groups, we identified a functional association amongst the genes disrupted in 209 (51%) groups. We find evidence for a significant number of molecular interactions amongst the association-contributing genes, including a single highly-interconnected network disrupted in 20% of patients with intellectual disability, and show using microcephaly how these molecular networks can be used as baits to identify additional members whose genes are variant in other patients with the same phenotype. Exploiting the systematic phenotyping of this cohort, we observe phenotypic concordance amongst patients whose variant genes contribute to the same functional association but note that (i) this relationship shows significant variation across the different approaches used to infer a commonly perturbed molecular pathway, and (ii) that the phenotypic similarities detected amongst patients who share the same inferred pathway perturbation result from these patients sharing many distinct phenotypes, rather than sharing a more specific phenotype, inferring that these pathways are best characterized by their pleiotropic effects.

  4. Gene networks underlying convergent and pleiotropic phenotypes in a large and systematically-phenotyped cohort with heterogeneous developmental disorders.

    Science.gov (United States)

    Andrews, Tallulah; Meader, Stephen; Vulto-van Silfhout, Anneke; Taylor, Avigail; Steinberg, Julia; Hehir-Kwa, Jayne; Pfundt, Rolph; de Leeuw, Nicole; de Vries, Bert B A; Webber, Caleb

    2015-03-01

    Readily-accessible and standardised capture of genotypic variation has revolutionised our understanding of the genetic contribution to disease. Unfortunately, the corresponding systematic capture of patient phenotypic variation needed to fully interpret the impact of genetic variation has lagged far behind. Exploiting deep and systematic phenotyping of a cohort of 197 patients presenting with heterogeneous developmental disorders and whose genomes harbour de novo CNVs, we systematically applied a range of commonly-used functional genomics approaches to identify the underlying molecular perturbations and their phenotypic impact. Grouping patients into 408 non-exclusive patient-phenotype groups, we identified a functional association amongst the genes disrupted in 209 (51%) groups. We find evidence for a significant number of molecular interactions amongst the association-contributing genes, including a single highly-interconnected network disrupted in 20% of patients with intellectual disability, and show using microcephaly how these molecular networks can be used as baits to identify additional members whose genes are variant in other patients with the same phenotype. Exploiting the systematic phenotyping of this cohort, we observe phenotypic concordance amongst patients whose variant genes contribute to the same functional association but note that (i) this relationship shows significant variation across the different approaches used to infer a commonly perturbed molecular pathway, and (ii) that the phenotypic similarities detected amongst patients who share the same inferred pathway perturbation result from these patients sharing many distinct phenotypes, rather than sharing a more specific phenotype, inferring that these pathways are best characterized by their pleiotropic effects.

  5. Emerging semantics to link phenotype and environment

    Directory of Open Access Journals (Sweden)

    Anne E. Thessen

    2015-12-01

    Understanding the interplay between environmental conditions and phenotypes is a fundamental goal of biology. Unfortunately, data that include observations on phenotype and environment are highly heterogeneous and thus difficult to find and integrate. One approach that is likely to improve the status quo involves the use of ontologies to standardize and link data about phenotypes and environments. Specifying and linking data through ontologies will allow researchers to increase the scope and flexibility of large-scale analyses aided by modern computing methods. Investments in this area would advance diverse fields such as ecology, phylogenetics, and conservation biology. While several biological ontologies are well-developed, using them to link phenotypes and environments is rare because of gaps in ontological coverage and limits to interoperability among ontologies and disciplines. In this manuscript, we present (1) use cases from diverse disciplines to illustrate questions that could be answered more efficiently using a robust linkage between phenotypes and environments, (2) two proof-of-concept analyses that show the value of linking phenotypes to environments in fishes and amphibians, and (3) two proposed example data models for linking phenotypes and environments using the extensible observation ontology (OBOE) and the Biological Collections Ontology (BCO); these provide a starting point for the development of a data model linking phenotypes and environments.

  6. Poor phenotype-genotype association in a large series of patients with Type III Bartter syndrome.

    Science.gov (United States)

    García Castaño, Alejandro; Pérez de Nanclares, Gustavo; Madariaga, Leire; Aguirre, Mireia; Madrid, Álvaro; Chocrón, Sara; Nadal, Inmaculada; Navarro, Mercedes; Lucas, Elena; Fijo, Julia; Espino, Mar; Espitaletta, Zilac; García Nieto, Víctor; Barajas de Frutos, David; Loza, Reyner; Pintos, Guillem; Castaño, Luis; Ariceta, Gema

    2017-01-01

    Type III Bartter syndrome (BS) is an autosomal recessive renal tubule disorder caused by loss-of-function mutations in the CLCNKB gene, which encodes the chloride channel protein ClC-Kb. In this study, we carried out a complete clinical and genetic characterization in a cohort of 30 patients, one of the largest series described. By comparing with other published populations, and considering that 80% of our patients presented the p.Ala204Thr Spanish founder mutation presumably associated with a common phenotype, we aimed to test the hypothesis that allelic differences could explain the wide phenotypic variability observed in patients with type III BS. Clinical data were retrieved from the referral centers. The exon regions and flanking intronic sequences of the CLCNKB gene were screened for mutations by polymerase chain reaction (PCR) followed by direct Sanger sequencing. Presence of gross deletions or duplications in the region was checked for by MLPA and QMPSF analyses. Polyuria, polydipsia and dehydration were the main common symptoms. Metabolic alkalosis and hypokalemia of renal origin were detected in all patients at diagnosis. Calciuria levels were variable: hypercalciuria was detected in 31% of patients, while 23% had hypocalciuria. Nephrocalcinosis was diagnosed in 20% of the cohort. Two novel CLCNKB mutations were identified: a small homozygous deletion (c.753delG) in one patient and a small deletion (c.1026delC) in another. The latter was present in compound heterozygosis with the already previously described p.Glu442Gly mutation. No phenotypic association was obtained regarding the genotype. A poor correlation was found between a specific type of mutation in the CLCNKB gene and type III BS phenotype. Importantly, two CLCNKB mutations not previously described were found in our cohort.

  7. Poor phenotype-genotype association in a large series of patients with Type III Bartter syndrome.

    Directory of Open Access Journals (Sweden)

    Alejandro García Castaño

    Type III Bartter syndrome (BS) is an autosomal recessive renal tubule disorder caused by loss-of-function mutations in the CLCNKB gene, which encodes the chloride channel protein ClC-Kb. In this study, we carried out a complete clinical and genetic characterization in a cohort of 30 patients, one of the largest series described. By comparing with other published populations, and considering that 80% of our patients presented the p.Ala204Thr Spanish founder mutation presumably associated with a common phenotype, we aimed to test the hypothesis that allelic differences could explain the wide phenotypic variability observed in patients with type III BS. Clinical data were retrieved from the referral centers. The exon regions and flanking intronic sequences of the CLCNKB gene were screened for mutations by polymerase chain reaction (PCR) followed by direct Sanger sequencing. Presence of gross deletions or duplications in the region was checked for by MLPA and QMPSF analyses. Polyuria, polydipsia and dehydration were the main common symptoms. Metabolic alkalosis and hypokalemia of renal origin were detected in all patients at diagnosis. Calciuria levels were variable: hypercalciuria was detected in 31% of patients, while 23% had hypocalciuria. Nephrocalcinosis was diagnosed in 20% of the cohort. Two novel CLCNKB mutations were identified: a small homozygous deletion (c.753delG) in one patient and a small deletion (c.1026delC) in another. The latter was present in compound heterozygosis with the already previously described p.Glu442Gly mutation. No phenotypic association was obtained regarding the genotype. A poor correlation was found between a specific type of mutation in the CLCNKB gene and type III BS phenotype. Importantly, two CLCNKB mutations not previously described were found in our cohort.

  8. Prospects for large scale electricity storage in Denmark

    DEFF Research Database (Denmark)

    Krog Ekman, Claus; Jensen, Søren Højgaard

    2010-01-01

    In a future power systems with additional wind power capacity there will be an increased need for large scale power management as well as reliable balancing and reserve capabilities. Different technologies for large scale electricity storage provide solutions to the different challenges arising w...

  9. Retrospective analysis of cohort database: Phenotypic variability in a large dataset of patients confirmed to have homozygous familial hypercholesterolemia

    Directory of Open Access Journals (Sweden)

    Frederick J. Raal

    2016-06-01

    These data describe the phenotypic variability in a large cohort of patients confirmed to have homozygous familial hypercholesterolemia. Herein, we describe the observed relationship of treated low-density lipoprotein cholesterol with age. We also overlay the low-density lipoprotein receptor gene (LDLR) functional status with these phenotypic data. A full description of these data is available in our recent study published in Atherosclerosis, "Phenotype Diversity Among Patients With Homozygous Familial Hypercholesterolemia: A Cohort Study" (Raal et al., 2016) [1].

  10. Predictable Phenotypes of Antibiotic Resistance Mutations.

    Science.gov (United States)

    Knopp, M; Andersson, D I

    2018-05-15

    Antibiotic-resistant bacteria represent a major threat to our ability to treat bacterial infections. Two factors that determine the evolutionary success of antibiotic resistance mutations are their impact on resistance level and the fitness cost. Recent studies suggest that resistance mutations commonly show epistatic interactions, which would complicate predictions of their stability in bacterial populations. We analyzed 13 different chromosomal resistance mutations and 10 host strains of Salmonella enterica and Escherichia coli to address two main questions. (i) Are there epistatic interactions between different chromosomal resistance mutations? (ii) How does the strain background and genetic distance influence the effect of chromosomal resistance mutations on resistance and fitness? Our results show that the effects of combined resistance mutations on resistance and fitness are largely predictable and that epistasis remains rare even when up to four mutations were combined. Furthermore, a majority of the mutations, especially target alteration mutations, demonstrate strain-independent phenotypes across different species. This study extends our understanding of epistasis among resistance mutations and shows that interactions between different resistance mutations are often predictable from the characteristics of the individual mutations. IMPORTANCE The spread of antibiotic-resistant bacteria imposes an urgent threat to public health. The ability to forecast the evolutionary success of resistant mutants would help to combat dissemination of antibiotic resistance. Previous studies have shown that the phenotypic effects (fitness and resistance level) of resistance mutations can vary substantially depending on the genetic context in which they occur. We conducted a broad screen using many different resistance mutations and host strains to identify potential epistatic interactions between various types of resistance mutations and to determine the effect of strain

  11. Evolution of scaling emergence in large-scale spatial epidemic spreading.

    Science.gov (United States)

    Wang, Lin; Li, Xiang; Zhang, Yi-Qing; Zhang, Yan; Zhang, Kan

    2011-01-01

    Zipf's law and Heaps' law are two representative scaling concepts, which play a significant role in the study of complexity science. Their coexistence has motivated differing views on the dependence between the two scalings, a question that has hardly been clarified. In this article, we observe an evolution process of the scalings: Zipf's law and Heaps' law naturally coexist at the initial time, while a crossover emerges as they become inconsistent at later times, before a stable state is reached in which Heaps' law persists while strict Zipf's law disappears. These findings are illustrated with a scenario of large-scale spatial epidemic spreading, and empirical results from pandemic disease support a universal analysis of the relation between the two laws regardless of the biological details of the disease. Employing United States domestic air transportation and demographic data to construct a metapopulation model that simulates pandemic spread at the U.S. country level, we uncover that the broad heterogeneity of the infrastructure plays a key role in the evolution of scaling emergence. These analyses of large-scale spatial epidemic spreading help us understand the temporal evolution of scalings: the coexistence of Zipf's law and Heaps' law depends on the collective dynamics of epidemic processes, and the heterogeneity of epidemic spread underlines the significance of performing targeted containment strategies early in a pandemic.
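    As a rough illustration of the two laws discussed above, the following sketch (with invented parameters, not the paper's metapopulation model) draws cases from a power-law distribution over sites and tracks both the rank-frequency profile (Zipf) and the growth of distinct sites with total cases (Heaps):

```python
import collections
import random

random.seed(0)

# Draw "infection locations" from a Zipf-like (power-law) distribution and
# track Heaps' law: the number of distinct locations seen vs. total cases.
# Purely illustrative; parameters are not taken from the paper.
N_SITES, N_CASES, ALPHA = 1000, 5000, 1.2
weights = [1 / (r ** ALPHA) for r in range(1, N_SITES + 1)]
cases = random.choices(range(N_SITES), weights=weights, k=N_CASES)

seen, heaps = set(), []
for c in cases:
    seen.add(c)
    heaps.append(len(seen))          # distinct sites after each new case

freq = sorted(collections.Counter(cases).values(), reverse=True)
print("top-3 rank frequencies:", freq[:3])   # Zipf: roughly power-law decay
print("distinct sites after all cases:", heaps[-1])  # Heaps: sublinear growth
```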

  12. Large-Scale Structure and Hyperuniformity of Amorphous Ices

    Science.gov (United States)

    Martelli, Fausto; Torquato, Salvatore; Giovambattista, Nicolas; Car, Roberto

    2017-09-01

    We investigate the large-scale structure of amorphous ices and transitions between their different forms by quantifying their large-scale density fluctuations. Specifically, we simulate the isothermal compression of low-density amorphous ice (LDA) and hexagonal ice to produce high-density amorphous ice (HDA). Both HDA and LDA are nearly hyperuniform; i.e., they are characterized by an anomalous suppression of large-scale density fluctuations. By contrast, in correspondence with the nonequilibrium phase transitions to HDA, the presence of structural heterogeneities strongly suppresses the hyperuniformity and the system becomes hyposurficial (devoid of "surface-area fluctuations"). Our investigation challenges the largely accepted "frozen-liquid" picture, which views glasses as structurally arrested liquids. Beyond implications for water, our findings enrich our understanding of pressure-induced structural transformations in glasses.
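    For reference, hyperuniformity as used above is conventionally defined through the vanishing of the structure factor at small wavenumbers (a standard definition, not quoted from this record):

```latex
S(\mathbf{k}) = \frac{1}{N}\left\langle \Big|\sum_{j=1}^{N} e^{-i\,\mathbf{k}\cdot\mathbf{r}_j}\Big|^{2} \right\rangle ,
\qquad \lim_{|\mathbf{k}|\to 0} S(\mathbf{k}) = 0 \quad \text{(hyperuniform)} .
```

    Equivalently, the number variance within an observation window of radius R grows more slowly than the window volume, e.g. like R^(d-1) rather than R^d in d dimensions.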

  13. Large-screen display industry: market and technology trends for direct view and projection displays

    Science.gov (United States)

    Castellano, Joseph A.; Mentley, David E.

    1996-03-01

    Large screen information displays are defined as dynamic electronic displays that can be viewed by more than one person and are at least 2 feet wide. These large area displays for public viewing provide convenience, entertainment, security, and efficiency to the viewers. There are numerous uses for large screen information displays including those in advertising, transportation, traffic control, conference room presentations, computer aided design, banking, and military command/control. A noticeable characteristic of the large screen display market is the interchangeability of display types. For any given application, the user can usually choose from at least three alternative technologies, and sometimes from many more. Some display types have features that make them suitable for specific applications due to temperature, brightness, power consumption, or other such characteristics. The overall worldwide unit consumption of large screen information displays of all types and for all applications (excluding consumer TV) will increase from 401,109 units in 1995 to 655,797 units in 2002. On a unit consumption basis, applications in business and education represent the largest share of unit consumption over this time period; in 1995, this application represented 69.7% of the total. The market (value of shipments) will grow from $3.1 billion in 1995 to $3.9 billion in 2002. The market will be dominated by front LCD projectors and LCD overhead projector plates.

  14. Advanced phenotyping and phenotype data analysis for the plant growth and development study

    Directory of Open Access Journals (Sweden)

    Md. Matiur Rahaman

    2015-08-01

    Due to the increasing consumption of food, feed, and fuel, and to ensure global food security for a rapidly growing human population, there is a need to breed high-yielding crops that can adapt to future climates. To address these global issues, novel approaches are required to provide quantitative phenotypes that elucidate the genetic basis of agriculturally important traits and to screen germplasm for superior performance under resource-limited environments. At present, plant phenomics offers an integrated suite of technologies for understanding the complete set of phenotypes of plants, advancing the full characterization of plants with fully sequenced genomes. To this end, high-throughput phenotyping platforms have been developed that capture extensive and intensive phenotype data from non-destructive imaging over time. These developments advance our view of plant growth and performance in response to a changing climate and environment. In this paper, we present a brief review of currently developed high-throughput plant phenotyping infrastructures based on imaging techniques and the corresponding principles for phenotype data analysis.

  15. Scaling behaviour of Fisher and Shannon entropies for the exponential-cosine screened coulomb potential

    Science.gov (United States)

    Abdelmonem, M. S.; Abdel-Hady, Afaf; Nasser, I.

    2017-07-01

    Scaling laws are given for the entropies of information theory, including the Shannon entropy, its power, the Fisher information and the Fisher-Shannon product, using the exponential-cosine screened Coulomb potential. The scaling laws are specified, in r-space, as a function of |μ − μ_c,nℓ|, where μ is the screening parameter and μ_c,nℓ its critical value for the specific quantum numbers n and ℓ. Scaling laws for other physical quantities, such as energy eigenvalues, moments, static polarisability and transition probabilities, are also given. Some of these are reported for the first time. The outcomes are compared with results available in the literature.
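    For orientation, the position-space quantities named in this abstract are conventionally defined as follows for a normalized density ρ(r) (one common convention; these definitions are standard and not specific to this paper):

```latex
S_r = -\int \rho(\mathbf{r}) \ln \rho(\mathbf{r})\, d\mathbf{r}, \qquad
I_r = \int \frac{\left|\nabla \rho(\mathbf{r})\right|^2}{\rho(\mathbf{r})}\, d\mathbf{r}, \qquad
P_r = \frac{1}{3}\, I_r \cdot \frac{1}{2\pi e}\, e^{2 S_r/3} ,
```

    where S_r is the Shannon entropy, I_r the Fisher information, and P_r the Fisher-Shannon product; the factor (1/(2πe)) e^(2S_r/3) is the Shannon entropy power in three dimensions.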

  16. Double inflation: A possible resolution of the large-scale structure problem

    International Nuclear Information System (INIS)

    Turner, M.S.; Villumsen, J.V.; Vittorio, N.; Silk, J.; Juszkiewicz, R.

    1986-11-01

    A model is presented for the large-scale structure of the universe in which two successive inflationary phases resulted in density fluctuations that are large on small scales and small on large scales. This bimodal density fluctuation spectrum in an Ω = 1 universe dominated by hot dark matter leads to large-scale structure of the galaxy distribution that is consistent with recent observational results. In particular, large, nearly empty voids and significant large-scale peculiar velocity fields are produced over scales of ∼100 Mpc, while the small-scale structure over ≤ 10 Mpc resembles that in a low-density universe, as observed. Detailed analytical calculations and numerical simulations of the spatial and velocity correlations are given. 38 refs., 6 figs

  17. Large-scale fracture mechanics testing -- requirements and possibilities

    International Nuclear Information System (INIS)

    Brumovsky, M.

    1993-01-01

    Applying fracture mechanics to very important and/or complicated structures, such as reactor pressure vessels, raises questions about the reliability and precision of such calculations. These problems become more pronounced under elastic-plastic loading conditions and/or in parts with non-homogeneous materials (base metal and austenitic cladding, property gradients through the material thickness) or with non-homogeneous stress fields (nozzles, bolt threads, residual stresses, etc.). For such special cases, verification by large-scale testing is necessary and valuable. This paper discusses problems connected with planning such experiments with respect to their limitations and the requirements for good transfer of the results to an actual vessel. An analysis of the possibilities of small-scale model experiments is also presented, mostly in connection with the transfer of results between standard, small-scale and large-scale experiments. Experience from 30 years of large-scale testing at SKODA is used as an example to support this analysis. 1 fig

  18. Ethics of large-scale change

    DEFF Research Database (Denmark)

    Arler, Finn

    2006-01-01

    , which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, the neoclassical economists' approach, and finally the so-called Concentric Circle Theories approach...

  19. A “Forward Genomics” Approach Links Genotype to Phenotype using Independent Phenotypic Losses among Related Species

    Directory of Open Access Journals (Sweden)

    Michael Hiller

    2012-10-01

    Genotype-phenotype mapping is hampered by countless genomic changes between species. We introduce a computational “forward genomics” strategy that—given only an independently lost phenotype and whole genomes—matches genomic and phenotypic loss patterns to associate specific genomic regions with this phenotype. We conducted genome-wide screens for two metabolic phenotypes. First, our approach correctly matches the inactivated Gulo gene exactly with the species that lost the ability to synthesize vitamin C. Second, we attribute naturally low biliary phospholipid levels in guinea pigs and horses to the inactivated phospholipid transporter Abcb4. Human ABCB4 mutations also result in low phospholipid levels but lead to severe liver disease, suggesting compensatory mechanisms in guinea pigs and horses. Our simulation studies, counts of independent changes in existing phenotype surveys, and the forthcoming availability of many new genomes all suggest that forward genomics can be applied to many phenotypes, including those relevant for human evolution and disease.
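    The matching step of such a forward-genomics screen reduces, in its simplest form, to comparing binary loss patterns across species. A minimal sketch follows; the species set and the Abcb4 pattern are invented for illustration, while Gulo loss in humans and guinea pigs is the example from the abstract.

```python
# Minimal sketch of "forward genomics" matching: find genomic regions whose
# inactivation pattern across species matches an independently lost phenotype.
phenotype_lost = {"human": True, "guinea_pig": True, "mouse": False, "rat": False}

gene_inactivated = {
    "Gulo":  {"human": True,  "guinea_pig": True, "mouse": False, "rat": False},
    "Abcb4": {"human": False, "guinea_pig": True, "mouse": False, "rat": False},
}

def matches(phenotype, gene_pattern):
    """A candidate gene must be lost in every phenotype-loss species
    and intact in every species that kept the phenotype."""
    return all(gene_pattern[sp] == lost for sp, lost in phenotype.items())

hits = [g for g, pat in gene_inactivated.items() if matches(phenotype_lost, pat)]
print(hits)  # ['Gulo']
```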

  20. Comparison Between Overtopping Discharge in Small and Large Scale Models

    DEFF Research Database (Denmark)

    Helgason, Einar; Burcharth, Hans F.

    2006-01-01

    The present paper presents overtopping measurements from small-scale model tests performed at the Hydraulic & Coastal Engineering Laboratory, Aalborg University, Denmark and large-scale model tests performed at the Large Wave Channel, Hannover, Germany. Comparison between results obtained from...... small and large-scale model tests shows no clear evidence of scale effects for overtopping above a threshold value. In the large-scale model no overtopping was measured for wave heights below Hs = 0.5 m as the water sank into the voids between the stones on the crest. For low overtopping scale effects

  1. A microliter-scale high-throughput screening system with quantum-dot nanoprobes for amyloid-β aggregation inhibitors.

    Directory of Open Access Journals (Sweden)

    Yukako Ishigaki

    The aggregation of amyloid β protein (Aβ) is a key step in the pathogenesis of Alzheimer's disease (AD), and inhibitors of Aβ aggregation may therefore have preventive and/or therapeutic potential for AD. Here we report a novel microliter-scale high-throughput screening system for Aβ aggregation inhibitors based on fluorescence microscopy-imaging technology with quantum-dot nanoprobes. This screening system can analyze a 5-µl sample volume when a 1536-well plate is used, and the inhibitory activity can be estimated as a half-maximal effective concentration (EC50). We attempted to comprehensively screen Aβ aggregation inhibitors from 52 spices using this system to assess whether it is actually useful for screening inhibitors. The results indicate that approximately 90% of the ethanolic extracts from the spices showed inhibitory activity for Aβ aggregation. Interestingly, spices belonging to the Lamiaceae, the mint family, showed significantly higher activity than the average of the tested spices. Furthermore, we isolated the main inhibitory compound from Satureja hortensis (summer savory), a member of the Lamiaceae, using this system, and revealed that the main active compound was rosmarinic acid. These results demonstrate that this novel microliter-scale high-throughput screening system can be applied to the actual screening of Aβ aggregation inhibitors. Since the system analyzes samples at a microscopic scale, further miniaturization, as in protein microarray technology, is likely to be readily achievable.
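    An EC50 of the kind reported by this system can be estimated from a measured dose-response series. The sketch below (invented values, not data from the study) interpolates in log-dose space between the two concentrations that bracket 50% inhibition:

```python
import math

# Hypothetical dose-response data for an extract (arbitrary units).
doses      = [0.1, 0.3, 1.0, 3.0, 10.0]    # extract concentration
inhibition = [5.0, 20.0, 45.0, 80.0, 95.0]  # % inhibition of Aβ aggregation

def ec50(doses, inhibition, level=50.0):
    """Log-linear interpolation between the two points bracketing `level`."""
    for (d0, y0), (d1, y1) in zip(zip(doses, inhibition),
                                  zip(doses[1:], inhibition[1:])):
        if y0 <= level <= y1:
            t = (level - y0) / (y1 - y0)
            # interpolate in log-dose space, as is conventional
            return math.exp(math.log(d0) + t * (math.log(d1) - math.log(d0)))
    raise ValueError("response never crosses the target level")

print(round(ec50(doses, inhibition), 3))  # ≈ 1.17
```

    A production analysis would instead fit a four-parameter logistic (Hill) curve, but the bracketing interpolation conveys the idea in a few lines.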

  2. Automated NMR fragment based screening identified a novel interface blocker to the LARG/RhoA complex.

    Directory of Open Access Journals (Sweden)

    Jia Gao

    The small GTPase cycles between the inactive GDP-bound form and the activated GTP-bound form, catalyzed by upstream guanine exchange factors. Modulation of this process by small molecules has proven to be a fruitful route for therapeutic intervention to prevent over-activation of the small GTPase. The fragment-based approach that has emerged over the past decade has demonstrated great potential for the discovery of inhibitors targeting such novel and challenging protein-protein interactions. The details of the procedure of NMR fragment screening from scratch have rarely been disclosed comprehensively, which restricts its wider application. To achieve a consistent screening protocol applicable to a number of targets, we developed a highly automated workflow covering every aspect of NMR fragment screening, including construction of a small but diverse library, determination of aqueous solubility by NMR, grouping of compounds with mutually dispersed signals into cocktails, and automated processing and visualization of the ligand-based screening spectra. We exemplify our streamlined screening on RhoA alone and on the complex of the small GTPase RhoA with its upstream guanine exchange factor LARG. Two hits were confirmed from the primary screening in cocktails and secondary screening over individual hits for the LARG/RhoA complex, while one of them was also identified from the screening for RhoA alone. HSQC titration of the two hits against RhoA and LARG alone identified one compound that binds RhoA·GDP with 0.11 mM affinity and perturbs residues in the switch II region of RhoA. This hit blocked formation of the LARG/RhoA complex, as validated by native gel electrophoresis and by titration of RhoA into ¹⁵N-labeled LARG in the absence and presence of the compound. It therefore provides a starting point toward a more potent inhibitor of RhoA activation catalyzed by LARG.

  3. Large-scale community echocardiographic screening reveals a major burden of undiagnosed valvular heart disease in older people: the OxVALVE Population Cohort Study†

    Science.gov (United States)

    d'Arcy, Joanna L.; Coffey, Sean; Loudon, Margaret A.; Kennedy, Andrew; Pearson-Stuttard, Jonathan; Birks, Jacqueline; Frangou, Eleni; Farmer, Andrew J.; Mant, David; Wilson, Jo; Myerson, Saul G.; Prendergast, Bernard D.

    2016-01-01

    Background Valvular heart disease (VHD) is expected to become more common as the population ages. However, current estimates of its natural history and prevalence are based on historical studies with potential sources of bias. We conducted a cross-sectional analysis of the clinical and epidemiological characteristics of VHD identified at recruitment of a large cohort of older people. Methods and results We enrolled 2500 individuals aged ≥65 years from a primary care population and screened for undiagnosed VHD using transthoracic echocardiography. Newly identified (predominantly mild) VHD was detected in 51% of participants. The most common abnormalities were aortic sclerosis (34%), mitral regurgitation (22%), and aortic regurgitation (15%). Aortic stenosis was present in 1.3%. The likelihood of undiagnosed VHD was two-fold higher in the two most deprived socioeconomic quintiles than in the most affluent quintile, and three-fold higher in individuals with atrial fibrillation. Clinically significant (moderate or severe) undiagnosed VHD was identified in 6.4%. In addition, 4.9% of the cohort had pre-existing VHD (a total prevalence of 11.3%). Projecting these findings using population data, we estimate that the prevalence of clinically significant VHD will double before 2050. Conclusions Previously undetected VHD affects 1 in 2 of the elderly population and is more common in lower socioeconomic classes. These unique data demonstrate the contemporary clinical and epidemiological characteristics of VHD in a large population-based cohort of older people and confirm the scale of the emerging epidemic of VHD, with widespread implications for clinicians and healthcare resources. PMID:27354049

  4. A simple phenotypic method for screening of MCR-1-mediated colistin resistance.

    Science.gov (United States)

    Coppi, M; Cannatelli, A; Antonelli, A; Baccani, I; Di Pilato, V; Sennati, S; Giani, T; Rossolini, G M

    2018-02-01

    To evaluate a novel method, the colistin-MAC test, for phenotypic screening of acquired colistin resistance mediated by transferable mcr-1 resistance determinants, based on colistin MIC reduction in the presence of dipicolinic acid (DPA). The colistin-MAC test consists of a broth microdilution method in which the colistin MIC is tested in the absence or presence of DPA (900 μg/mL). Overall, 74 colistin-resistant strains of Enterobacteriaceae (65 Escherichia coli and nine other species), including 61 strains carrying mcr-1-like genes and 13 strains negative for mcr genes, were evaluated with the colistin-MAC test. The presence of mcr-1-like and mcr-2-like genes was assessed by real-time PCR and end-point PCR. For 20 strains, whole-genome sequencing data were also available. A ≥8-fold reduction of colistin MIC in the presence of DPA was observed with 59 mcr-1-positive strains, including 53 E. coli of clinical origin, three E. coli transconjugants carrying MCR-1-encoding plasmids, one Enterobacter cloacae complex and two Citrobacter spp. Colistin MICs were unchanged, increased or at most reduced twofold with the 13 mcr-negative colistin-resistant strains (nine E. coli and four Klebsiella pneumoniae), but also with two mcr-1-like-positive K. pneumoniae strains. The colistin-MAC test could be a simple phenotypic test for presumptive identification of mcr-1-positive strains among isolates of colistin-resistant E. coli, based on a ≥8-fold reduction of colistin MIC in the presence of DPA. Evaluation of the test with a larger number of strains, species and mcr-type resistance determinants would be of interest. Copyright © 2017 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.
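    The decision rule described above is simple enough to state in code. This is a sketch of the ≥8-fold criterion only; the MIC values are invented examples.

```python
# Colistin-MAC decision rule from the abstract: a >= 8-fold drop in colistin
# MIC in the presence of dipicolinic acid (DPA) flags a presumptive mcr-1
# carrier. MIC values below are invented examples (µg/mL).
def mac_test_positive(mic_colistin: float, mic_colistin_dpa: float) -> bool:
    """True when DPA lowers the colistin MIC at least 8-fold."""
    return mic_colistin / mic_colistin_dpa >= 8

print(mac_test_positive(8.0, 0.5))   # 16-fold reduction -> True
print(mac_test_positive(4.0, 2.0))   # 2-fold reduction  -> False
```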

  5. Needs, opportunities, and options for large scale systems research

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, G.L.

    1984-10-01

    The Office of Energy Research was recently asked to perform a study of Large Scale Systems in order to facilitate the development of a true large systems theory. It was decided to ask experts in the fields of electrical engineering, chemical engineering and manufacturing/operations research for their ideas concerning large scale systems research. The author was asked to distribute a questionnaire among these experts to find out their opinions concerning recent accomplishments and future research directions in large scale systems research. He was also requested to convene a conference which included three experts in each area as panel members to discuss the general area of large scale systems research. The conference was held on March 26-27, 1984 in Pittsburgh with nine panel members and 15 other attendees. The present report is a summary of the ideas presented and the recommendations proposed by the attendees.

  6. Large-scale structure of the Universe

    International Nuclear Information System (INIS)

    Doroshkevich, A.G.

    1978-01-01

    The problems discussed at the ''Large-scale Structure of the Universe'' symposium are considered at a popular level. The cell structure of the galaxy distribution in the Universe and principles of mathematical modelling of the galaxy distribution are described. Images of cell structures obtained after computer reprocessing are given. Three hypotheses (vortical, entropic and adiabatic), suggesting different processes for the origin of galaxies and galaxy clusters, are discussed, and a considerable advantage of the adiabatic hypothesis is recognized. Relict radiation is considered as a method for directly studying the processes taking place in the Universe. The large-scale peculiarities and small-scale fluctuations of the relict radiation temperature enable one to estimate the properties of perturbations at the pre-galactic stage. The discussion of problems pertaining to the hot gas contained in galaxy clusters and the interactions within galaxy clusters and with the intergalactic medium is recognized as a notable contribution to the development of theoretical and observational cosmology

  7. Seismic safety in conducting large-scale blasts

    Science.gov (United States)

    Mashukov, I. V.; Chaplygin, V. V.; Domanov, V. P.; Semin, A. A.; Klimkin, M. A.

    2017-09-01

    In mining enterprises, a drilling and blasting method is used to prepare hard rock for excavation. As mining operations approach settlements, the negative effects of large-scale blasts increase. To assess the level of seismic impact of large-scale blasts, the scientific staff of Siberian State Industrial University carried out assessments for coal mines and iron ore enterprises. The magnitude of surface seismic vibrations caused by mass explosions was determined using seismic receivers and an analog-to-digital converter recording to a laptop. Results from recordings of surface seismic vibrations during more than 280 large-scale blasts at 17 mining enterprises in 22 settlements are presented. The maximum velocity values of the Earth's surface vibrations are determined. The safety of the seismic effect was evaluated against the permissible value of vibration velocity. For cases exceeding permissible values, recommendations were developed to reduce the level of seismic impact.

  8. Single-field consistency relations of large scale structure part III: test of the equivalence principle

    Energy Technology Data Exchange (ETDEWEB)

    Creminelli, Paolo [Abdus Salam International Centre for Theoretical Physics, Strada Costiera 11, Trieste, 34151 (Italy); Gleyzes, Jérôme; Vernizzi, Filippo [CEA, Institut de Physique Théorique, Gif-sur-Yvette cédex, F-91191 France (France); Hui, Lam [Physics Department and Institute for Strings, Cosmology and Astroparticle Physics, Columbia University, New York, NY, 10027 (United States); Simonović, Marko, E-mail: creminel@ictp.it, E-mail: jerome.gleyzes@cea.fr, E-mail: lhui@astro.columbia.edu, E-mail: msimonov@sissa.it, E-mail: filippo.vernizzi@cea.fr [SISSA, via Bonomea 265, Trieste, 34136 (Italy)

    2014-06-01

    The recently derived consistency relations for Large Scale Structure do not hold if the Equivalence Principle (EP) is violated. We show it explicitly in a toy model with two fluids, one of which is coupled to a fifth force. We explore the constraints that galaxy surveys can set on EP violation looking at the squeezed limit of the 3-point function involving two populations of objects. We find that one can explore EP violations of order 10⁻³ to 10⁻⁴ on cosmological scales. Chameleon models are already very constrained by the requirement of screening within the Solar System and only a very tiny region of the parameter space can be explored with this method. We show that no violation of the consistency relations is expected in Galileon models.

  9. Image-based Exploration of Large-Scale Pathline Fields

    KAUST Repository

    Nagoor, Omniah H.

    2014-05-27

    While real-time applications are nowadays routinely used in visualizing large numerical simulations and volumes, handling these large-scale datasets requires high-end graphics clusters or supercomputers to process and visualize them. However, not all users have access to powerful clusters. Therefore, it is challenging to come up with a visualization approach that provides insight into large-scale datasets on a single computer. Explorable images (EI) is one of the methods that allows users to handle large data on a single workstation. Although it is a view-dependent method, it combines both exploration and modification of visual aspects without re-accessing the original huge data. In this thesis, we propose a novel image-based method that applies the concept of EI to visualizing large flow-field pathline data. The goal of our work is to provide an optimized image-based method that scales well with dataset size. Our approach is based on constructing a per-pixel linked-list data structure in which each pixel contains a list of pathline segments. With this view-dependent method it is possible to filter, color-code and explore large-scale flow data in real-time. In addition, optimization techniques such as early-ray termination and deferred shading are applied, which further improve the performance and scalability of our approach.
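    The per-pixel linked list mentioned above can be sketched on the CPU as a head index per pixel plus flat arrays of payloads and "next" pointers. Field names here are our own; a GPU implementation would use an atomic counter and an atomic exchange of the head pointer.

```python
# Illustrative per-pixel linked list for collecting pathline segments.
class PerPixelLists:
    def __init__(self, width: int, height: int):
        self.head = [[-1] * width for _ in range(height)]  # -1 = empty list
        self.payload, self.next = [], []   # flat segment pool + next indices

    def insert(self, x: int, y: int, segment) -> None:
        """Prepend a segment to pixel (x, y)."""
        idx = len(self.payload)
        self.payload.append(segment)
        self.next.append(self.head[y][x])
        self.head[y][x] = idx

    def segments(self, x: int, y: int):
        """Walk the list for pixel (x, y), most recently inserted first."""
        idx = self.head[y][x]
        while idx != -1:
            yield self.payload[idx]
            idx = self.next[idx]

ppll = PerPixelLists(4, 4)
ppll.insert(1, 2, "pathline-A")
ppll.insert(1, 2, "pathline-B")
print(list(ppll.segments(1, 2)))  # ['pathline-B', 'pathline-A']
```

    Because every pixel's segments are retained rather than composited away, filtering and re-coloring can later be done per pixel without touching the original flow data.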

  10. COMPUTER APPROACHES TO WHEAT HIGH-THROUGHPUT PHENOTYPING

    Directory of Open Access Journals (Sweden)

    Afonnikov D.

    2012-08-01

    The growing need for rapid and accurate approaches to large-scale assessment of phenotypic characters in plants is becoming increasingly obvious in studies of the relationships between genotype and phenotype. This need is due to the advent of high-throughput methods for genome analysis. Nowadays, any genetic experiment involves data on thousands or tens of thousands of plants. Traditional ways of assessing most phenotypic characteristics (those relying on the eye, the touch, the ruler) are of little use on samples of such sizes. Modern approaches seek to take advantage of automated phenotyping, which provides much more rapid data acquisition, higher accuracy in the assessment of phenotypic features, measurement of new parameters of these features and exclusion of human subjectivity from the process. Additionally, automation allows measurement data to be loaded rapidly into computer databases, which reduces data processing time. In this work, we present the WheatPGE information system designed to solve the problem of integrating genotypic and phenotypic data and parameters of the environment, as well as to analyze the relationships between genotype and phenotype in wheat. The system is used to consolidate miscellaneous data on a plant for storing and processing various morphological traits and genotypes of wheat plants as well as data on various environmental factors. The system is available at www.wheatdb.org. Its potential in genetic experiments has been demonstrated in high-throughput phenotyping of wheat leaf pubescence.

  11. A population study comparing screening performance of prototypes for depression and anxiety with standard scales

    Directory of Open Access Journals (Sweden)

    Christensen Helen

    2011-11-01

    Background Screening instruments for mental disorders need to be short, engaging, and valid. Current screening instruments are usually questionnaire-based and may be opaque to the user. A prototype approach, where individuals identify with a description of a person with typical symptoms of depression, anxiety, social phobia or panic, may be a shorter, faster and more acceptable method of screening. The aim of the study was to evaluate the accuracy of four new prototype screeners for predicting depression and anxiety disorders and to compare their performance with existing scales. Methods Short and ultra-short prototypes were developed for Major Depressive Disorder (MDD), Generalised Anxiety Disorder (GAD), Panic Disorder (PD) and Social Phobia (SP). Prototypes were compared to typical short and ultra-short self-report screening scales, such as the Centre for Epidemiologic Studies Depression Scale (CES-D) and the GAD-7, and their short forms. The Mini International Neuropsychiatric Interview (MINI), version 6, was used as the gold standard for obtaining clinical criteria through a telephone interview. From a population sample, 225 individuals who endorsed a prototype and 101 who did not were administered the MINI. Receiver operating characteristic (ROC) curves were plotted for the short and ultra-short prototypes and for the short and ultra-short screening scales. Results The study found that the rates of endorsement of the prototypes were commensurate with prevalence estimates. The short and ultra-short scales outperformed the short and ultra-short prototypes for every disorder except GAD, where the GAD prototype outperformed the GAD-7. Conclusions The findings suggest that people may be able to self-identify generalised anxiety more accurately than depression based on a description of a prototypical case. However, levels of identification were lower than expected. Considerable benefits from this method of screening may ensue if our prototypes can be
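    The ROC analysis used to compare screeners can be reproduced in a few lines of code. The sketch below (invented scores and diagnoses, not study data) sweeps thresholds over screener scores and integrates the curve with the trapezoidal rule:

```python
# Minimal ROC / AUC computation for a screening instrument against a
# gold-standard diagnosis. Scores and labels below are invented examples.
def roc_auc(scores, labels):
    """Area under the ROC curve via threshold sweep + trapezoidal rule."""
    pairs = sorted(zip(scores, labels), reverse=True)  # descending score
    pos = sum(labels)
    neg = len(labels) - pos
    tp = fp = 0
    points = [(0.0, 0.0)]                 # (false-pos rate, true-pos rate)
    for _, y in pairs:
        if y:
            tp += 1
        else:
            fp += 1
        points.append((fp / neg, tp / pos))
    return sum((x1 - x0) * (y0 + y1) / 2
               for (x0, y0), (x1, y1) in zip(points, points[1:]))

scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3]   # screener output
labels = [1,   1,   0,   1,   0,   0]     # 1 = gold-standard diagnosis
print(roc_auc(scores, labels))            # ≈ 0.889
```

    Note this simple version assumes untied scores; grouping ties before stepping the curve would be needed for real data.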

  12. Homogenization of Large-Scale Movement Models in Ecology

    Science.gov (United States)

    Garlick, M.J.; Powell, J.A.; Hooten, M.B.; McFarlane, L.R.

    2011-01-01

    A difficulty in using diffusion models to predict large scale animal population dispersal is that individuals move differently based on local information (as opposed to gradients) in differing habitat types. This can be accommodated by using ecological diffusion. However, real environments are often spatially complex, limiting application of a direct approach. Homogenization for partial differential equations has long been applied to Fickian diffusion (in which average individual movement is organized along gradients of habitat and population density). We derive a homogenization procedure for ecological diffusion and apply it to a simple model for chronic wasting disease in mule deer. Homogenization allows us to determine the impact of small scale (10-100 m) habitat variability on large scale (10-100 km) movement. The procedure generates asymptotic equations for solutions on the large scale with parameters defined by small-scale variation. The simplicity of this homogenization procedure is striking when compared to the multi-dimensional homogenization procedure for Fickian diffusion, and the method will be equally straightforward for more complex models. © 2010 Society for Mathematical Biology.
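    The distinction between Fickian and ecological diffusion drawn in the abstract corresponds to where the motility coefficient μ(x) sits relative to the spatial derivatives (a standard formulation, not quoted from the paper):

```latex
\underbrace{\frac{\partial u}{\partial t} = \nabla \cdot \big( \mu(\mathbf{x})\, \nabla u \big)}_{\text{Fickian diffusion}}
\qquad \text{vs.} \qquad
\underbrace{\frac{\partial u}{\partial t} = \Delta \big( \mu(\mathbf{x})\, u \big)}_{\text{ecological diffusion}} .
```

    In the ecological form, individuals respond to local habitat rather than to gradients, so at steady state the population accumulates where motility μ is low.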

  13. The role of large-scale, extratropical dynamics in climate change

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, T.G. [ed.

    1994-02-01

    The climate modeling community has focused recently on improving our understanding of certain processes, such as cloud feedbacks and ocean circulation, that are deemed critical to climate-change prediction. Although attention to such processes is warranted, emphasis on these areas has diminished a general appreciation of the role played by the large-scale dynamics of the extratropical atmosphere. Lack of interest in extratropical dynamics may reflect the assumption that these dynamical processes are a non-problem as far as climate modeling is concerned, since general circulation models (GCMs) calculate motions on this scale from first principles. Nevertheless, serious shortcomings in our ability to understand and simulate large-scale dynamics exist. Partly due to a paucity of standard GCM diagnostic calculations of large-scale motions and their transports of heat, momentum, potential vorticity, and moisture, a comprehensive understanding of the role of large-scale dynamics in GCM climate simulations has not been developed. Uncertainties remain in our understanding and simulation of large-scale extratropical dynamics and their interaction with other climatic processes, such as cloud feedbacks, large-scale ocean circulation, moist convection, air-sea interaction and land-surface processes. To address some of these issues, the 17th Stanstead Seminar was convened at Bishop's University in Lennoxville, Quebec. The purpose of the Seminar was to promote discussion of the role of large-scale extratropical dynamics in global climate change. Abstracts of the talks are included in this volume. On the basis of these talks, several key issues emerged concerning large-scale extratropical dynamics and their climatic role. Individual records are indexed separately for the database.

  14. The role of large-scale, extratropical dynamics in climate change

    International Nuclear Information System (INIS)

    Shepherd, T.G.

    1994-02-01

    The climate modeling community has focused recently on improving our understanding of certain processes, such as cloud feedbacks and ocean circulation, that are deemed critical to climate-change prediction. Although attention to such processes is warranted, emphasis on these areas has diminished a general appreciation of the role played by the large-scale dynamics of the extratropical atmosphere. Lack of interest in extratropical dynamics may reflect the assumption that these dynamical processes are a non-problem as far as climate modeling is concerned, since general circulation models (GCMs) calculate motions on this scale from first principles. Nevertheless, serious shortcomings in our ability to understand and simulate large-scale dynamics exist. Partly due to a paucity of standard GCM diagnostic calculations of large-scale motions and their transports of heat, momentum, potential vorticity, and moisture, a comprehensive understanding of the role of large-scale dynamics in GCM climate simulations has not been developed. Uncertainties remain in our understanding and simulation of large-scale extratropical dynamics and their interaction with other climatic processes, such as cloud feedbacks, large-scale ocean circulation, moist convection, air-sea interaction and land-surface processes. To address some of these issues, the 17th Stanstead Seminar was convened at Bishop's University in Lennoxville, Quebec. The purpose of the Seminar was to promote discussion of the role of large-scale extratropical dynamics in global climate change. Abstracts of the talks are included in this volume. On the basis of these talks, several key issues emerged concerning large-scale extratropical dynamics and their climatic role. Individual records are indexed separately for the database

  15. Screening for depressive disorders using the MASQ anhedonic depression scale: A receiver-operator characteristic analysis

    Science.gov (United States)

    Bredemeier, Keith; Spielberg, Jeffrey M.; Silton, Rebecca Levin; Berenbaum, Howard; Heller, Wendy; Miller, Gregory A.

    2010-01-01

    The present study examined the utility of the anhedonic depression scale from the Mood and Anxiety Symptoms Questionnaire (MASQ-AD) as a way to screen for depressive disorders. Using receiver-operator characteristic analysis, the sensitivity and specificity of the full 22-item MASQ-AD scale, as well as the 8- and 14-item subscales, were examined in relation to both current and lifetime DSM-IV depressive disorder diagnoses in two nonpatient samples. As a means of comparison, the sensitivity and specificity of a measure of a relevant personality dimension, neuroticism, were also examined. Results from both samples support the clinical utility of the MASQ-AD scale as a means of screening for depressive disorders. Findings were strongest for the MASQ-AD 8-item subscale and when predicting current depression status. Furthermore, the MASQ-AD 8-item subscale outperformed the neuroticism measure under certain conditions. The overall usefulness of the MASQ-AD scale as a screening device is discussed, as well as possible cutoff scores for use in research. PMID:20822283
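
The kind of receiver-operator characteristic analysis used here can be sketched as follows. The scores and diagnoses below are hypothetical, not the study's data; the sketch simply tabulates sensitivity and specificity at each candidate cutoff.

```python
def roc_points(scores, labels):
    """Sensitivity and specificity at every cutoff (score >= cutoff flags a case)."""
    pts = []
    for cut in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s >= cut and y == 1)
        fn = sum(1 for s, y in zip(scores, labels) if s < cut and y == 1)
        tn = sum(1 for s, y in zip(scores, labels) if s < cut and y == 0)
        fp = sum(1 for s, y in zip(scores, labels) if s >= cut and y == 0)
        pts.append((cut, tp / (tp + fn), tn / (tn + fp)))
    return pts  # list of (cutoff, sensitivity, specificity)

# Hypothetical screening scores and diagnoses (1 = depressive disorder).
scores = [12, 15, 18, 22, 25, 28, 31, 35]
labels = [0, 0, 0, 1, 0, 1, 1, 1]
for cut, sens, spec in roc_points(scores, labels):
    print(cut, sens, spec)
```

A research cutoff is then chosen from this table by trading sensitivity against specificity, depending on whether the screen should favor catching cases or avoiding false positives.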

  16. Large-scale functional purification of recombinant HIV-1 capsid.

    Directory of Open Access Journals (Sweden)

    Magdeleine Hung

    Full Text Available During human immunodeficiency virus type-1 (HIV-1) virion maturation, capsid proteins undergo a major rearrangement to form a conical core that protects the viral nucleoprotein complexes. Mutations in the capsid sequence that alter the stability of the capsid core are deleterious to viral infectivity and replication. Recently, capsid assembly has become an attractive target for the development of a new generation of anti-retroviral agents. Drug screening efforts and subsequent structural and mechanistic studies require gram quantities of active, homogeneous and pure protein. Conventional means of laboratory purification of Escherichia coli expressed recombinant capsid protein rely on column chromatography steps that are not amenable to large-scale production. Here we present a function-based purification of wild-type and quadruple mutant capsid proteins, which relies on the inherent propensity of capsid protein to polymerize and depolymerize. This method does not require the packing of sizable chromatography columns and can generate double-digit gram quantities of functionally and biochemically well-behaved proteins with greater than 98% purity. We have used the purified capsid protein to characterize two known assembly inhibitors in our in-house developed polymerization assay and to measure their binding affinities. Our capsid purification procedure provides a robust method for purifying large quantities of a key protein in the HIV-1 life cycle, facilitating identification of the next generation of anti-HIV agents.

  17. Status: Large-scale subatmospheric cryogenic systems

    International Nuclear Information System (INIS)

    Peterson, T.

    1989-01-01

    In the late 1960s and early 1970s an interest in testing and operating RF cavities at 1.8K motivated the development and construction of four large (300 Watt) 1.8K refrigeration systems. In the past decade, development of successful superconducting RF cavities and interest in obtaining higher magnetic fields with the improved Niobium-Titanium superconductors has once again created interest in large-scale 1.8K refrigeration systems. The L'Air Liquide plant for Tore Supra is a recently commissioned 300 Watt 1.8K system which incorporates new technology, cold compressors, to obtain the low vapor pressure for low temperature cooling. CEBAF proposes to use cold compressors to obtain 5KW at 2.0K. Magnetic refrigerators of 10 Watt capacity or higher at 1.8K are now being developed. The state of the art of large-scale refrigeration in the range under 4K will be reviewed. 28 refs., 4 figs., 7 tabs

  18. Systematic screening for skin, hair, and nail abnormalities in a large-scale knockout mouse program.

    Directory of Open Access Journals (Sweden)

    John P Sundberg

    Full Text Available The International Knockout Mouse Consortium was formed in 2007 to inactivate ("knock out") all protein-coding genes in the mouse genome in embryonic stem cells. Production and characterization of these mice, now underway, has generated and phenotyped 3,100 strains with knockout alleles. Skin and adnexa diseases are best defined at the gross clinical level and by histopathology. Representative retired breeders had skin collected from the back, abdomen, eyelids, muzzle, ears, tail, and lower limbs including the nails. To date, 169 novel mutant lines were reviewed and of these, only one was found to have a relatively minor sebaceous gland abnormality associated with follicular dystrophy. The B6N(Cg)-Far2tm2b(KOMP)Wtsi/2J strain had lesions affecting sebaceous glands with what appeared to be a secondary follicular dystrophy. A second line, B6N(Cg)-Ppp1r9btm1.1(KOMP)Vlcg/J, had follicular dystrophy limited to many but not all mystacial vibrissae in heterozygous but not homozygous mutant mice, suggesting that this was a nonspecific background lesion. We discuss potential reasons for the low frequency of skin and adnexal phenotypes in mice from this project in comparison to those seen in human Mendelian diseases, and suggest alternative approaches to identification of human disease-relevant models.

  19. Large-scale weakly supervised object localization via latent category learning.

    Science.gov (United States)

    Chong Wang; Kaiqi Huang; Weiqiang Ren; Junge Zhang; Maybank, Steve

    2015-04-01

    Localizing objects in cluttered backgrounds is challenging under large-scale weakly supervised conditions. Due to the cluttered image condition, objects are easily confused with backgrounds. In addition, effective algorithms for large-scale weakly supervised localization in cluttered backgrounds are lacking. However, backgrounds contain useful latent information, e.g., the sky in the aeroplane class. If this latent information can be learned, object-background ambiguity can be largely reduced and background can be suppressed effectively. In this paper, we propose latent category learning (LCL) for large-scale cluttered conditions. LCL is an unsupervised learning method which requires only image-level class labels. First, we use latent semantic analysis with a semantic object representation to learn the latent categories, which represent objects, object parts or backgrounds. Second, to determine which category contains the target object, we propose a category selection strategy that evaluates each category's discrimination. Finally, we propose an online LCL for use in large-scale conditions. Evaluation on the challenging PASCAL Visual Object Classes (VOC) 2007 and the ImageNet Large Scale Visual Recognition Challenge (ILSVRC) 2013 detection data sets shows that the method can improve the annotation precision by 10% over previous methods. More importantly, we achieve a detection precision which outperforms previous results by a large margin and is competitive with the supervised deformable part model 5.0 baseline on both data sets.
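
The latent-category step rests on latent semantic analysis, i.e., a truncated SVD of a count matrix. A minimal sketch, with a hypothetical word-by-image count matrix standing in for the paper's semantic object representation:

```python
import numpy as np

# Toy word-by-image count matrix (hypothetical values): rows are visual
# "words", columns are images. The SVD factorization recovers latent
# categories, which may correspond to objects, parts, or backgrounds.
X = np.array([[2.0, 0.0, 1.0],
              [1.0, 0.0, 0.0],
              [0.0, 3.0, 2.0],
              [0.0, 2.0, 3.0]])

U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2  # number of latent categories to keep
image_topics = (np.diag(s[:k]) @ Vt[:k]).T  # each image expressed in latent space
print(image_topics.shape)  # one k-dimensional row per image
```

In the method's terms, each latent dimension is a candidate category, and a discrimination score over these dimensions decides which one carries the target object.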

  20. Large-scale networks in engineering and life sciences

    CERN Document Server

    Findeisen, Rolf; Flockerzi, Dietrich; Reichl, Udo; Sundmacher, Kai

    2014-01-01

    This edited volume provides insights into and tools for the modeling, analysis, optimization, and control of large-scale networks in the life sciences and in engineering. Large-scale systems are often the result of networked interactions between a large number of subsystems, and their analysis and control are becoming increasingly important. The chapters of this book present the basic concepts and theoretical foundations of network theory and discuss its applications in different scientific areas such as biochemical reactions, chemical production processes, systems biology, electrical circuits, and mobile agents. The aim is to identify common concepts, to understand the underlying mathematical ideas, and to inspire discussions across the borders of the various disciplines.  The book originates from the interdisciplinary summer school “Large Scale Networks in Engineering and Life Sciences” hosted by the International Max Planck Research School Magdeburg, September 26-30, 2011, and will therefore be of int...

  1. Evaluating usability of the Halden Reactor Large Screen Display. Is the Information Rich Design concept suitable for real-world installations?

    International Nuclear Information System (INIS)

    Braseth, Alf Ove

    2013-01-01

    Large Screen Displays (LSDs) are beginning to supplement desktop displays in modern control rooms, having the potential to display the big picture of complex processes. Information Rich Design (IRD) is an LSD concept used in many real-life installations in the petroleum domain, and more recently in nuclear research applications. The objectives of IRD are to provide the big picture, avoiding keyhole-related problems while supporting fast visual perception of larger data sets. Two LSDs based on the IRD concept have been developed for large-scale nuclear simulators for research purposes; they have, however, suffered from an unsatisfying user experience. The new Halden Reactor LSD, used to monitor a nuclear research reactor, was designed according to recently proposed Design Principles compiled in this paper to mitigate previously experienced problems. This paper evaluates the usability of the Halden Reactor LSD, comparing usability data with the replaced analogue panel, and data for an older IRD large screen display. The results suggest that the IRD concept is suitable for use in real-life applications from a user experience point of view, and that the recently proposed Design Principles have had a positive effect on usability. (author)

  2. A phenotypic profile of the Candida albicans regulatory network.

    Directory of Open Access Journals (Sweden)

    Oliver R Homann

    2009-12-01

    Full Text Available Candida albicans is a normal resident of the gastrointestinal tract and also the most prevalent fungal pathogen of humans. It last shared a common ancestor with the model yeast Saccharomyces cerevisiae over 300 million years ago. We describe a collection of 143 genetically matched strains of C. albicans, each of which has been deleted for a specific transcriptional regulator. This collection represents a large fraction of the non-essential transcription circuitry. A phenotypic profile for each mutant was developed using a screen of 55 growth conditions. The results identify the biological roles of many individual transcriptional regulators; for many, this work represents the first description of their functions. For example, a quarter of the strains showed altered colony formation, a phenotype reflecting transitions among yeast, pseudohyphal, and hyphal cell forms. These transitions, which have been closely linked to pathogenesis, have been extensively studied, yet our work nearly doubles the number of transcriptional regulators known to influence them. As a second example, nearly a quarter of the knockout strains affected sensitivity to commonly used antifungal drugs; although a few transcriptional regulators have previously been implicated in susceptibility to these drugs, our work indicates many additional mechanisms of sensitivity and resistance. Finally, our results inform how transcriptional networks evolve. Comparison with the existing S. cerevisiae data (supplemented by additional S. cerevisiae experiments reported here) allows the first systematic analysis of phenotypic conservation by orthologous transcriptional regulators over a large evolutionary distance. We find that, despite the many specific wiring changes documented between these species, the general phenotypes of orthologous transcriptional regulator knockouts are largely conserved. These observations support the idea that many wiring changes affect the detailed architecture of

  3. A phenotypic profile of the Candida albicans regulatory network.

    Science.gov (United States)

    Homann, Oliver R; Dea, Jeanselle; Noble, Suzanne M; Johnson, Alexander D

    2009-12-01

    Candida albicans is a normal resident of the gastrointestinal tract and also the most prevalent fungal pathogen of humans. It last shared a common ancestor with the model yeast Saccharomyces cerevisiae over 300 million years ago. We describe a collection of 143 genetically matched strains of C. albicans, each of which has been deleted for a specific transcriptional regulator. This collection represents a large fraction of the non-essential transcription circuitry. A phenotypic profile for each mutant was developed using a screen of 55 growth conditions. The results identify the biological roles of many individual transcriptional regulators; for many, this work represents the first description of their functions. For example, a quarter of the strains showed altered colony formation, a phenotype reflecting transitions among yeast, pseudohyphal, and hyphal cell forms. These transitions, which have been closely linked to pathogenesis, have been extensively studied, yet our work nearly doubles the number of transcriptional regulators known to influence them. As a second example, nearly a quarter of the knockout strains affected sensitivity to commonly used antifungal drugs; although a few transcriptional regulators have previously been implicated in susceptibility to these drugs, our work indicates many additional mechanisms of sensitivity and resistance. Finally, our results inform how transcriptional networks evolve. Comparison with the existing S. cerevisiae data (supplemented by additional S. cerevisiae experiments reported here) allows the first systematic analysis of phenotypic conservation by orthologous transcriptional regulators over a large evolutionary distance. We find that, despite the many specific wiring changes documented between these species, the general phenotypes of orthologous transcriptional regulator knockouts are largely conserved. These observations support the idea that many wiring changes affect the detailed architecture of the circuit, but

  4. Phenotype-genotype correlation in Wilson disease in a large Lebanese family: association of c.2299insC with hepatic and of p.Ala1003Thr with neurologic phenotype.

    Directory of Open Access Journals (Sweden)

    Julnar Usta

    Full Text Available Genotype-phenotype correlations in Wilson disease (WD) are best established in homozygous patients or in compound heterozygous patients carrying the same set of mutations. We determined the clinical phenotype of patients with WD carrying the c.2298_2299insC mutation in exon 8 (c.2299insC) or the p.Ala1003Thr missense substitution in exon 13, in the homozygous or compound heterozygous state. We investigated 76 members of a single large Lebanese family. Their genotypes were determined, and clinical assessments were carried out for affected subjects. We also performed a literature search retrieving the phenotypes of patients carrying the same mutations as our patients in the homozygous or compound heterozygous state. There were 7 consanguineous marriages in this family; the prevalence of WD was 8.9% and that of carriers of an ATP7B mutation 44.7%. WD was confirmed in 9 out of 76 subjects. All 9 had the c.2299insC mutation, 5 homozygous and 4 compound heterozygous with p.Ala1003Thr. Six of our patients had a hepatic, 2 had a neurologic and 1 had an asymptomatic phenotype. Based on our data and a literature review, clear phenotypes were reported for 38 patients worldwide carrying the c.2299insC mutation. About 53% of those have a hepatic and 29% a neurologic phenotype. Furthermore, there were 10 compound heterozygous patients carrying the p.Ala1003Thr mutation. Among those, 80% having c.2299insC as the second mutation had a hepatic phenotype, and all others had a neurologic phenotype. We hereby report an association between the c.2299insC mutation and hepatic phenotype and between the p.Ala1003Thr mutation and neurologic phenotype.

  5. A Novel Architecture of Large-scale Communication in IoT

    Science.gov (United States)

    Ma, Wubin; Deng, Su; Huang, Hongbin

    2018-03-01

    In recent years, many scholars have done a great deal of research on the development of the Internet of Things (IoT) and networked physical systems. However, few have described the large-scale communication architecture of the IoT in detail. In fact, non-uniform technologies between IPv6 and access points have prevented general principles for large-scale communication architectures from emerging. Therefore, this paper presents the Uni-IPv6 Access and Information Exchange Method (UAIEM), a new architecture and algorithm that addresses large-scale communication in the IoT.

  6. Benefits of transactive memory systems in large-scale development

    OpenAIRE

    Aivars, Sablis

    2016-01-01

    Context. Large-scale software development projects are those consisting of a large number of teams, maybe even spread across multiple locations, and working on large and complex software tasks. That means that neither a team member individually nor an entire team holds all the knowledge about the software being developed and teams have to communicate and coordinate their knowledge. Therefore, teams and team members in large-scale software development projects must acquire and manage expertise...

  7. Validation of the Arab Youth Mental Health scale as a screening tool for depression/anxiety in Lebanese children

    Directory of Open Access Journals (Sweden)

    Nakkash Rima

    2011-03-01

    Full Text Available Abstract Background Early detection of common mental disorders, such as depression and anxiety, among children and adolescents requires the use of validated, culturally sensitive, and developmentally appropriate screening instruments. The Arab region has a high proportion of youth, yet Arabic-language screening instruments for mental disorders among this age group are virtually absent. Methods We carried out construct and clinical validation on the recently-developed Arab Youth Mental Health (AYMH) scale as a screening tool for depression/anxiety. The scale was administered with 10-14 year old children attending a social service center in Beirut, Lebanon (N = 153). The clinical assessment was conducted by a child and adolescent clinical psychiatrist employing the DSM-IV criteria. We tested the scale's sensitivity, specificity, and internal consistency. Results Scale scores were generally significantly associated with how participants responded to standard questions on health, mental health, and happiness, indicating good construct validity. The results revealed that the scale exhibited good internal consistency (Cronbach's alpha = 0.86) and specificity (79%). However, it exhibited moderate sensitivity for girls (71%) and poor sensitivity for boys (50%). Conclusions The AYMH scale is useful as a screening tool for general mental health states and a valid screening instrument for common mental disorders among girls. It is not a valid instrument for detecting depression and anxiety among boys in an Arab culture.
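
The internal-consistency statistic reported above can be computed as follows. This is a generic sketch of Cronbach's alpha with toy responses, not the AYMH data.

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for a multi-item scale.

    items: one list of scores per item, aligned over the same respondents.
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)
    """
    k = len(items)
    totals = [sum(resp) for resp in zip(*items)]
    item_var = sum(pvariance(it) for it in items)
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Toy data: three respondents answering two perfectly consistent items,
# so alpha reaches its maximum of 1.0.
print(cronbach_alpha([[1, 2, 3], [1, 2, 3]]))
```

Values around 0.86, as reported for the AYMH scale, indicate that the items covary strongly enough to be summed into a single screening score.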

  8. Study of a large scale neutron measurement channel

    International Nuclear Information System (INIS)

    Amarouayache, Anissa; Ben Hadid, Hayet.

    1982-12-01

    A large scale measurement channel allows the processing of the signal coming from a single neutron sensor in three different running modes: pulse, fluctuation and current. The study described in this note includes three parts: - A theoretical study of the large scale channel and a brief description are given, together with the results obtained to date in this domain. - The fluctuation mode is thoroughly studied and the improvements to be made are defined. The study of a linear fluctuation channel with automatic scale commutation is described and the test results are given. In this large scale channel, the data processing method is analogue. - To avoid the problems generated by analogue processing of the fluctuation signal, a digital data processing method is tested and its validity is demonstrated. The results obtained on a test system built according to this method are given and a preliminary plan for further research is defined [fr

  9. Comparative analysis of machine learning methods in ligand-based virtual screening of large compound libraries.

    Science.gov (United States)

    Ma, Xiao H; Jia, Jia; Zhu, Feng; Xue, Ying; Li, Ze R; Chen, Yu Z

    2009-05-01

    Machine learning methods have been explored as ligand-based virtual screening tools for facilitating drug lead discovery. These methods predict compounds of specific pharmacodynamic, pharmacokinetic or toxicological properties based on structural and physicochemical properties derived from their structures. Increasing attention has been directed at these methods because of their capability to predict compounds of diverse structures and complex structure-activity relationships without requiring knowledge of the target 3D structure. This article reviews current progress in using machine learning methods for virtual screening of pharmacodynamically active compounds from large compound libraries, and analyzes and compares the reported performance of machine learning tools with that of structure-based and other ligand-based (such as pharmacophore and clustering) virtual screening methods. The feasibility of improving the performance of machine learning methods in screening large libraries is discussed.
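
For context on the ligand-based family compared in this review, the simplest baseline is a fingerprint similarity screen (a nearest-neighbor sketch with hypothetical fingerprints, not one of the reviewed machine learning methods):

```python
def tanimoto(a, b):
    """Tanimoto (Jaccard) coefficient between two fingerprint bit sets."""
    return len(a & b) / len(a | b)

def similarity_screen(library, actives, threshold=0.5):
    """Rank library compounds by their best similarity to any known active."""
    hits = []
    for name, fingerprint in library.items():
        score = max(tanimoto(fingerprint, act) for act in actives)
        if score >= threshold:
            hits.append((score, name))
    return sorted(hits, reverse=True)

# Hypothetical fingerprints represented as sets of "on" bit positions.
library = {"cpd1": {1, 2, 3}, "cpd2": {7, 8}}
actives = [{1, 2, 3, 4}]
print(similarity_screen(library, actives))  # [(0.75, 'cpd1')]
```

Machine learning screens replace the fixed similarity rule with a model trained on known actives and inactives, which is what lets them generalize to structurally diverse hits.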

  10. Large-scale protein-protein interaction analysis in Arabidopsis mesophyll protoplasts by split firefly luciferase complementation.

    Science.gov (United States)

    Li, Jian-Feng; Bush, Jenifer; Xiong, Yan; Li, Lei; McCormack, Matthew

    2011-01-01

    Protein-protein interactions (PPIs) constitute the regulatory network that coordinates diverse cellular functions. There are growing needs in plant research for creating protein interaction maps behind complex cellular processes and at a systems biology level. However, only a few approaches have been successfully used for large-scale surveys of PPIs in plants, each having advantages and disadvantages. Here we present split firefly luciferase complementation (SFLC) as a highly sensitive and noninvasive technique for in planta PPI investigation. In this assay, the separate halves of a firefly luciferase can come into close proximity and transiently restore its catalytic activity only when their fusion partners, namely the two proteins of interest, interact with each other. This assay was conferred with quantitativeness and high throughput potential when the Arabidopsis mesophyll protoplast system and a microplate luminometer were employed for protein expression and luciferase measurement, respectively. Using the SFLC assay, we could monitor the dynamics of rapamycin-induced and ascomycin-disrupted interaction between Arabidopsis FRB and human FKBP proteins in a near real-time manner. As a proof of concept for large-scale PPI survey, we further applied the SFLC assay to testing 132 binary PPIs among 8 auxin response factors (ARFs) and 12 Aux/IAA proteins from Arabidopsis. Our results demonstrated that the SFLC assay is ideal for in vivo quantitative PPI analysis in plant cells and is particularly powerful for large-scale binary PPI screens.

  11. Large-scale protein-protein interaction analysis in Arabidopsis mesophyll protoplasts by split firefly luciferase complementation.

    Directory of Open Access Journals (Sweden)

    Jian-Feng Li

    Full Text Available Protein-protein interactions (PPIs) constitute the regulatory network that coordinates diverse cellular functions. There are growing needs in plant research for creating protein interaction maps behind complex cellular processes and at a systems biology level. However, only a few approaches have been successfully used for large-scale surveys of PPIs in plants, each having advantages and disadvantages. Here we present split firefly luciferase complementation (SFLC) as a highly sensitive and noninvasive technique for in planta PPI investigation. In this assay, the separate halves of a firefly luciferase can come into close proximity and transiently restore its catalytic activity only when their fusion partners, namely the two proteins of interest, interact with each other. This assay was conferred with quantitativeness and high throughput potential when the Arabidopsis mesophyll protoplast system and a microplate luminometer were employed for protein expression and luciferase measurement, respectively. Using the SFLC assay, we could monitor the dynamics of rapamycin-induced and ascomycin-disrupted interaction between Arabidopsis FRB and human FKBP proteins in a near real-time manner. As a proof of concept for large-scale PPI survey, we further applied the SFLC assay to testing 132 binary PPIs among 8 auxin response factors (ARFs) and 12 Aux/IAA proteins from Arabidopsis. Our results demonstrated that the SFLC assay is ideal for in vivo quantitative PPI analysis in plant cells and is particularly powerful for large-scale binary PPI screens.

  12. Capabilities of the Large-Scale Sediment Transport Facility

    Science.gov (United States)

    2016-04-01

    ERDC/CHL CHETN-I-88, April 2016. Approved for public release; distribution is unlimited. This note describes the Large-Scale Sediment Transport Facility (LSTF) and recent upgrades to the measurement systems, including pump flow meters, sediment trap weigh tanks, and beach profiling lidar. The purpose of these upgrades was to increase ... A detailed discussion of the original LSTF features and capabilities can be ...

  13. Spatiotemporal property and predictability of large-scale human mobility

    Science.gov (United States)

    Zhang, Hai-Tao; Zhu, Tao; Fu, Dongfei; Xu, Bowen; Han, Xiao-Pu; Chen, Duxin

    2018-04-01

    Spatiotemporal characteristics of human mobility emerging from complexity on the individual scale have been extensively studied due to their application potential in human behavior prediction and recommendation, and in the control of epidemic spreading. We collect and investigate a comprehensive data set of human activities on large geographical scales, including both website browsing and mobile tower visits. Numerical results show that the degree of activity decays as a power law, indicating that human behaviors are reminiscent of scale-free random walks known as Lévy flights. More significantly, this study suggests that human activities on large geographical scales have specific non-Markovian characteristics, such as a two-segment power-law distribution of dwelling time and high predictability. Furthermore, a scale-free mobility model with two essential ingredients, i.e., preferential return and exploration, and a Gaussian distribution assumption on the exploration tendency parameter is proposed, which outperforms existing human mobility models under scenarios of large geographical scales.
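
The model's two ingredients can be sketched in simplified form. This is not the paper's model: the sketch uses a constant exploration probability (rather than a visit-count-dependent rule) and hypothetical parameters, but it shows how preferential return concentrates visits on a few familiar sites.

```python
import random

def simulate_mobility(steps, p_explore=0.2, rng=None):
    """Simplified exploration / preferential-return walk.

    With probability p_explore the walker visits a brand-new site;
    otherwise it returns to a previously visited site, chosen in
    proportion to how often that site was visited before.
    """
    rng = rng or random.Random(0)       # fixed seed for reproducibility
    visits = {0: 1}                     # site id -> visit count; start at site 0
    next_site = 1
    for _ in range(steps):
        if rng.random() < p_explore:
            visits[next_site] = 1       # exploration: new site
            next_site += 1
        else:                           # preferential return: weight by past visits
            sites = list(visits)
            chosen = rng.choices(sites, weights=[visits[s] for s in sites])[0]
            visits[chosen] += 1
    return visits

visits = simulate_mobility(1000)
print(len(visits))  # distinct sites, roughly p_explore * steps
```

Because return probability grows with past visits, a handful of sites accumulate most of the trajectory, which is the mechanism behind the high predictability reported above.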

  14. Problems of large-scale vertically-integrated aquaculture

    Energy Technology Data Exchange (ETDEWEB)

    Webber, H H; Riordan, P F

    1976-01-01

    The problems of vertically-integrated aquaculture are outlined; they are concerned with: species limitations (in the market, biological and technological); site selection, feed, manpower needs, and legal, institutional and financial requirements. The gaps in understanding of, and the constraints limiting, large-scale aquaculture are listed. Future action is recommended with respect to: types and diversity of species to be cultivated, marketing, biotechnology (seed supply, disease control, water quality and concerted effort), siting, feed, manpower, legal and institutional aids (granting of water rights, grants, tax breaks, duty-free imports, etc.), and adequate financing. The lack of hard data based on experience suggests that large-scale vertically-integrated aquaculture is a high risk enterprise, and with the high capital investment required, banks and funding institutions are wary of supporting it. Investment in pilot projects is suggested to demonstrate that large-scale aquaculture can be a fully functional and successful business. Construction and operation of such pilot farms is judged to be in the interests of both the public and private sector.

  15. Item response theory analysis of the Lichtenberg Financial Decision Screening Scale.

    Science.gov (United States)

    Teresi, Jeanne A; Ocepek-Welikson, Katja; Lichtenberg, Peter A

    2017-01-01

    The focus of these analyses was to examine the psychometric properties of the Lichtenberg Financial Decision Screening Scale (LFDSS). The purpose of the screen was to evaluate the decisional abilities and vulnerability to exploitation of older adults. Adults aged 60 and over were interviewed by social, legal, financial, or health services professionals who underwent in-person training on the administration and scoring of the scale. Professionals provided a rating of the decision-making abilities of the older adult. The analytic sample included 213 individuals with an average age of 76.9 (SD = 10.1). The majority (57%) were female. Data were analyzed using item response theory (IRT) methodology. The results supported the unidimensionality of the item set. Several IRT models were tested. Ten ordinal and binary items evidenced a slightly higher reliability estimate (0.85) than other versions and better coverage in terms of the range of reliable measurement across the continuum of financial incapacity.
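
The IRT machinery behind such analyses can be illustrated with the two-parameter logistic model for binary items. This is a generic sketch (the study's models for ordinal items, such as a graded response model, are more elaborate), showing why reliability varies across the trait continuum.

```python
import math

def p_endorse(theta, a, b):
    """Two-parameter logistic (2PL) item response function: probability of
    endorsing an item given trait level theta, discrimination a, difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information the item contributes at trait level theta;
    total information across items determines where the scale measures reliably."""
    p = p_endorse(theta, a, b)
    return a * a * p * (1.0 - p)

# An item is maximally informative where theta equals its difficulty b.
print(p_endorse(0.0, 1.5, 0.0))         # 0.5
print(item_information(0.0, 1.5, 0.0))  # 0.5625 = a^2 * 0.25
```

Summing item information over the ten retained items, at each point of the incapacity continuum, is what the phrase "range of reliable measurement" refers to.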

  16. Large-scale computing with Quantum Espresso

    International Nuclear Information System (INIS)

    Giannozzi, P.; Cavazzoni, C.

    2009-01-01

    This paper gives a short introduction to Quantum Espresso: a distribution of software for atomistic simulations in condensed-matter physics, chemical physics, materials science, and to its usage in large-scale parallel computing.

  17. The effect of Interaction Anxiousness Scale and Brief Social Phobia Scale for screening social anxiety disorder in college students: a study on discriminative validity.

    Science.gov (United States)

    Cao, Jianqin; Yang, Jinwei; Zhou, Yuqiu; Chu, Fuliu; Zhao, Xiwu; Wang, Weiren; Wang, Yunlong; Peng, Tao

    2016-12-01

    Social anxiety disorder (SAD) is one of the most prevalent mental health problems, but there is little research on effective screening instruments for use in practice. This study was designed to examine the discriminative validity of the Interaction Anxiousness Scale (IAS) and the Brief Social Phobia Scale (BSPS) for screening SAD, both separately and in combination. First, 421 Chinese undergraduates were screened with the IAS and BSPS. Second, in the follow-up stage, 248 students were interviewed using the Structured Clinical Interview for DSM-IV. Receiver operating characteristic (ROC) analysis was used, and the related psychometric characteristics were checked. The results indicated that both scales achieved satisfactory discrimination (range: 0.7-0.8). However, the highest agreement (92.17%) was obtained when a cut-off point of 50 on the IAS was combined with a cut-off point of 34 on the BSPS; this combination also yielded higher PPV, sensitivity, specificity and overall agreement than the BSPS alone, and higher PPV, specificity and overall agreement than the IAS alone. The findings indicate that the combination of these two scales is a valid general screening instrument for SAD that maximizes discriminative validity.
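The combined rule can be made concrete. A minimal sketch, assuming the study flagged possible SAD only when both scales reach their cut-offs (whether each comparison is "at or above" or strictly "above" is an assumption here, as is the fabricated mini-cohort):

```python
def combined_flag(ias, bsps, ias_cut=50, bsps_cut=34):
    """Combined screen: positive only if the IAS and the BSPS both
    reach their cut-offs (comparison direction is an assumption)."""
    return ias >= ias_cut and bsps >= bsps_cut

def overall_agreement(flags, diagnoses):
    """Overall agreement (OA): proportion of cases where the screen
    result matches the SCID diagnosis."""
    return sum(f == d for f, d in zip(flags, diagnoses)) / len(flags)

# Illustrative (fabricated) mini-cohort: (IAS, BSPS, SCID diagnosis).
cohort = [(55, 40, True), (52, 30, False), (45, 36, False), (60, 35, True)]
flags = [combined_flag(i, b) for i, b, _ in cohort]
```

Requiring both scales to be positive trades some sensitivity for specificity, which is consistent with the combined rule outperforming each scale alone on specificity and overall agreement.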

  18. Maximally efficient two-stage screening: Determining intellectual disability in Taiwanese military conscripts.

    Science.gov (United States)

    Chien, Chia-Chang; Huang, Shu-Fen; Lung, For-Wey

    2009-01-27

    The purpose of this study was to apply a two-stage screening method for the large-scale intelligence screening of military conscripts. We recruited 99 conscripted soldiers whose educational level was senior high school or lower as participants. Every participant completed both the Wisconsin Card Sorting Test (WCST) and the Wechsler Adult Intelligence Scale-Revised (WAIS-R). Logistic regression analysis showed that the conceptual level responses (CLR) index of the WCST was the most significant index for determining intellectual disability (ID; FIQ ≤ 84). We used the receiver operating characteristic curve to determine the optimum cut-off points of CLR. The optimal single cut-off point of CLR was 66; the two cut-off points were 49 and 66. Compared with the two-stage positive screening, the two-stage window screening increased both the area under the curve and the positive predictive value. Moreover, the cost of the two-stage window screening decreased by 59%. The two-stage window screening is thus more accurate and economical than the two-stage positive screening. Our results provide an example of the use of two-stage screening and suggest the possibility of the WCST replacing the WAIS-R in future large-scale screenings for ID.
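The difference between the two designs can be sketched as follows. This is a hedged illustration: the direction of each comparison and the exact referral rule are assumptions built around the cut-offs reported above (49 and 66), and the scores are fabricated:

```python
def positive_screen(clr, cutoff=66):
    """Two-stage 'positive' screening: everyone at or below the single
    WCST CLR cut-off is referred for the full WAIS-R."""
    return "refer to WAIS-R" if clr <= cutoff else "cleared"

def window_screen(clr, low=49, high=66):
    """Two-stage 'window' screening: only scores inside the window are
    referred, so far fewer WAIS-R assessments are needed."""
    if clr <= low:
        return "likely ID"          # low enough to classify directly
    if clr > high:
        return "cleared"            # high enough to rule out ID
    return "refer to WAIS-R"        # ambiguous zone -> confirmatory test

scores = [30, 55, 60, 70, 80]       # illustrative CLR scores
referred_window = sum(window_screen(s) == "refer to WAIS-R" for s in scores)
referred_positive = sum(positive_screen(s) == "refer to WAIS-R" for s in scores)
```

Because the window design sends only the ambiguous middle band to the expensive second stage, it administers fewer WAIS-R assessments, which is where the reported 59% cost reduction comes from.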

  19. VESPA: Very large-scale Evolutionary and Selective Pressure Analyses

    Directory of Open Access Journals (Sweden)

    Andrew E. Webb

    2017-06-01

    Full Text Available Background Large-scale molecular evolutionary analyses of protein-coding sequences require a number of inter-related preparatory steps, from finding gene families to generating alignments and phylogenetic trees and assessing selective pressure variation. Each phase of these analyses can present significant challenges, particularly when working with entire proteomes (all protein-coding sequences in a genome) from a large number of species. Methods We present VESPA, software capable of automating a selective pressure analysis using codeML in addition to the preparatory analyses and summary statistics. VESPA is written in Python and Perl and is designed to run within a UNIX environment. Results We have benchmarked VESPA and our results show that the method is consistent, performs well on both large-scale and smaller-scale datasets, and produces results in line with previously published datasets. Discussion Large-scale gene family identification, sequence alignment, and phylogeny reconstruction are all important aspects of large-scale molecular evolutionary analyses. VESPA provides flexible software for simplifying these processes along with downstream selective pressure variation analyses. The software automatically interprets results from codeML and produces simplified summary files to assist the user in better understanding the results. VESPA may be found at the following website: http://www.mol-evol.org/VESPA.

  20. Plant phenomics and the need for physiological phenotyping across scales to narrow the genotype-to-phenotype knowledge gap

    Czech Academy of Sciences Publication Activity Database

    Grosskinsky, D. K.; Svensgaard, J.; Christensen, S.; Roitsch, Thomas

    2015-01-01

    Roč. 66, č. 18 (2015), s. 5429-5440 ISSN 0022-0957 Institutional support: RVO:67179843 Keywords : External phenotype * genome–environment–management interaction * genome–phenome map * internal phenotype * phenomics * physiological traits * physiology * plant phenotyping * predictors Subject RIV: EH - Ecology, Behaviour Impact factor: 5.677, year: 2015

  1. Large-scale community echocardiographic screening reveals a major burden of undiagnosed valvular heart disease in older people: the OxVALVE Population Cohort Study.

    Science.gov (United States)

    d'Arcy, Joanna L; Coffey, Sean; Loudon, Margaret A; Kennedy, Andrew; Pearson-Stuttard, Jonathan; Birks, Jacqueline; Frangou, Eleni; Farmer, Andrew J; Mant, David; Wilson, Jo; Myerson, Saul G; Prendergast, Bernard D

    2016-12-14

    Valvular heart disease (VHD) is expected to become more common as the population ages. However, current estimates of its natural history and prevalence are based on historical studies with potential sources of bias. We conducted a cross-sectional analysis of the clinical and epidemiological characteristics of VHD identified at recruitment of a large cohort of older people. We enrolled 2500 individuals aged ≥65 years from a primary care population and screened for undiagnosed VHD using transthoracic echocardiography. Newly identified (predominantly mild) VHD was detected in 51% of participants. The most common abnormalities were aortic sclerosis (34%), mitral regurgitation (22%), and aortic regurgitation (15%). Aortic stenosis was present in 1.3%. The likelihood of undiagnosed VHD was two-fold higher in the two most deprived socioeconomic quintiles than in the most affluent quintile, and three-fold higher in individuals with atrial fibrillation. Clinically significant (moderate or severe) undiagnosed VHD was identified in 6.4%. In addition, 4.9% of the cohort had pre-existing VHD (a total prevalence of 11.3%). Projecting these findings using population data, we estimate that the prevalence of clinically significant VHD will double before 2050. Previously undetected VHD affects 1 in 2 of the elderly population and is more common in lower socioeconomic classes. These unique data demonstrate the contemporary clinical and epidemiological characteristics of VHD in a large population-based cohort of older people and confirm the scale of the emerging epidemic of VHD, with widespread implications for clinicians and healthcare resources. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2016. For permissions please email: journals.permissions@oup.com.

  2. RESTRUCTURING OF THE LARGE-SCALE SPRINKLERS

    Directory of Open Access Journals (Sweden)

    Paweł Kozaczyk

    2016-09-01

    Full Text Available One of the best ways for agriculture to become independent of precipitation shortages is irrigation. In the 1970s and 1980s a number of large-scale sprinkler systems were built in Wielkopolska. At the end of the 1970s, 67 sprinkler systems with a total area of 6400 ha were installed in the Poznan province; the average size of a system was 95 ha. In 1989 there were 98 systems, covering more than 10 130 ha. The study was conducted in 1986-1998 on 7 large systems with areas ranging from 230 to 520 hectares. After the introduction of the market economy in the early 1990s and the ownership changes in agriculture, the large-scale sprinkler systems underwent significant or total devastation. Land of the former State Farms was leased or sold by the State Agricultural Property Agency, and the new owners used the existing sprinklers to a very small extent. This was accompanied by changes in crop structure and demand structure and by an increase in operating costs, including a threefold increase in electricity prices. In practice, the operation of large-scale irrigation encountered barriers of all kinds: limitations of the system solutions, supply difficulties, and high equipment failure rates, none of which encouraged rational use of the available sprinklers. A field inspection of the area documented the current status of the remaining irrigation infrastructure. The scheme adopted for the restructuring of Polish agriculture was not the best solution, causing massive destruction of the assets previously invested in the sprinkler systems.

  3. Large-scale synthesis of YSZ nanopowder by Pechini method

    Indian Academy of Sciences (India)

    Administrator

    structure and chemical purity of 99.1% by inductively coupled plasma optical emission spectroscopy on a large scale. Keywords. Sol–gel; yttria-stabilized zirconia; large scale; nanopowder; Pechini method. 1. Introduction. Zirconia has attracted the attention of many scientists because of its tremendous thermal, mechanical ...

  4. Large Scale Screening of Ethnomedicinal Plants for Identification of Potential Antibacterial Compounds

    Directory of Open Access Journals (Sweden)

    Sujogya Kumar Panda

    2016-03-01

    Full Text Available The global burden of bacterial infections is very high and has been exacerbated by increasing resistance to multiple antibiotics. Antibiotic resistance leads to failed treatment of infections, which can ultimately lead to death. To overcome antibiotic resistance, it is necessary to identify new antibacterial agents. In this study, a total of 662 plant extracts (diverse parts) from 222 plant species (82 families, 177 genera) were screened for antibacterial activity using the agar cup plate method. The aqueous and methanolic extracts were prepared from diverse plant parts and screened against eight bacterial (two Gram-positive and six Gram-negative) species, most of which are involved in common infections with multiple antibiotic resistance. The methanolic extracts of several plants were shown to have zones of inhibition ≥ 12 mm against both Gram-positive and Gram-negative bacteria. The minimum inhibitory concentration was calculated only for methanolic extracts of selected plants that showed a zone of inhibition ≥ 12 mm against both Gram-positive and Gram-negative bacteria. Several extracts had minimum inhibitory concentration ≤ 1 mg/mL. Specifically Adhatoda vasica, Ageratum conyzoides, Alangium salvifolium, Alpinia galanga, Andrographis paniculata, Anogeissus latifolia, Annona squamosa, A. reticulate, Azadirachta indica, Buchanania lanzan, Cassia fistula, Celastrus paniculatus, Centella asiatica, Clausena excavate, Cleome viscosa, Cleistanthus collinus, Clerodendrum indicum, Croton roxburghii, Diospyros melanoxylon, Eleutherine bulbosa, Erycibe paniculata, Eryngium foetidum, Garcinia cowa, Helicteres isora, Hemidesmus indicus, Holarrhena antidysenterica, Lannea coromandelica, Millettia extensa, Mimusops elengi, Nyctanthes arbor-tristis, Oroxylum indicum, Paederia foetida, Pterospermum acerifolium, Punica granatum, Semecarpus anacardium, Spondias pinnata, Terminalia alata and Vitex negundo were shown to have significant antimicrobial activity.

  5. Large Scale Screening of Ethnomedicinal Plants for Identification of Potential Antibacterial Compounds.

    Science.gov (United States)

    Panda, Sujogya Kumar; Mohanta, Yugal Kishore; Padhi, Laxmipriya; Park, Young-Hwan; Mohanta, Tapan Kumar; Bae, Hanhong

    2016-03-14

    The global burden of bacterial infections is very high and has been exacerbated by increasing resistance to multiple antibiotics. Antibiotic resistance leads to failed treatment of infections, which can ultimately lead to death. To overcome antibiotic resistance, it is necessary to identify new antibacterial agents. In this study, a total of 662 plant extracts (diverse parts) from 222 plant species (82 families, 177 genera) were screened for antibacterial activity using the agar cup plate method. The aqueous and methanolic extracts were prepared from diverse plant parts and screened against eight bacterial (two Gram-positive and six Gram-negative) species, most of which are involved in common infections with multiple antibiotic resistance. The methanolic extracts of several plants were shown to have zones of inhibition ≥ 12 mm against both Gram-positive and Gram-negative bacteria. The minimum inhibitory concentration was calculated only for methanolic extracts of selected plants that showed a zone of inhibition ≥ 12 mm against both Gram-positive and Gram-negative bacteria. Several extracts had minimum inhibitory concentration ≤ 1 mg/mL. Specifically Adhatoda vasica, Ageratum conyzoides, Alangium salvifolium, Alpinia galanga, Andrographis paniculata, Anogeissus latifolia, Annona squamosa, A. reticulate, Azadirachta indica, Buchanania lanzan, Cassia fistula, Celastrus paniculatus, Centella asiatica, Clausena excavate, Cleome viscosa, Cleistanthus collinus, Clerodendrum indicum, Croton roxburghii, Diospyros melanoxylon, Eleutherine bulbosa, Erycibe paniculata, Eryngium foetidum, Garcinia cowa, Helicteres isora, Hemidesmus indicus, Holarrhena antidysenterica, Lannea coromandelica, Millettia extensa, Mimusops elengi, Nyctanthes arbor-tristis, Oroxylum indicum, Paederia foetida, Pterospermum acerifolium, Punica granatum, Semecarpus anacardium, Spondias pinnata, Terminalia alata and Vitex negundo were shown to have significant antimicrobial activity. The species

  6. The Phoenix series large scale LNG pool fire experiments.

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, Richard B.; Jensen, Richard Pearson; Demosthenous, Byron; Luketa, Anay Josephine; Ricks, Allen Joseph; Hightower, Marion Michael; Blanchat, Thomas K.; Helmick, Paul H.; Tieszen, Sheldon Robert; Deola, Regina Anne; Mercier, Jeffrey Alan; Suo-Anttila, Jill Marie; Miller, Timothy J.

    2010-12-01

    The increasing demand for natural gas could increase the number and frequency of Liquefied Natural Gas (LNG) tanker deliveries to ports across the United States. Because of the increasing number of shipments and the number of possible new facilities, concerns about the safety of the public and property in the event of accidental, and even more importantly, intentional spills have increased. While improvements have been made over the past decade in assessing hazards from LNG spills, the existing experimental data cover much smaller sizes and scales than many postulated large accidental and intentional spills. Since the physics and hazards of a fire change with fire size, there are concerns about the adequacy of current hazard prediction techniques for large LNG spills and fires. To address these concerns, Congress funded the Department of Energy (DOE) in 2008 to conduct a series of laboratory and large-scale LNG pool fire experiments at Sandia National Laboratories (Sandia) in Albuquerque, New Mexico. This report presents the test data and results of both sets of fire experiments. A series of five reduced-scale (gas burner) tests (yielding 27 sets of data) were conducted in 2007 and 2008 at Sandia's Thermal Test Complex (TTC) to assess flame height to fire diameter ratios as a function of nondimensional heat release rate, for extrapolation to large-scale LNG fires. The large-scale LNG pool fire experiments were conducted in a 120 m diameter pond specially designed and constructed in Sandia's Area III large-scale test complex. Two fire tests of LNG spills 21 and 81 m in diameter were conducted in 2009 to improve the understanding of flame height, smoke production, and burn rate, and therefore the physics and hazards of large LNG spills and fires.
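The flame height to diameter scaling that the reduced-scale burner tests probe is often summarized by Heskestad-type correlations. As a hedged illustration (this is the standard textbook correlation, not necessarily the form fitted to the Phoenix data):

```python
def heskestad_flame_height(q_kw, d_m):
    """Heskestad's mean flame height correlation:
    L = 0.235 * Q^(2/5) - 1.02 * D, where Q is the total heat release
    rate in kW and D is the fire (pool) diameter in m; L is in m."""
    return 0.235 * q_kw ** 0.4 - 1.02 * d_m
```

Because L grows like Q^(2/5) while the 1.02·D term grows linearly with pool size, L/D falls as fires get very large, which is one reason small-scale correlations cannot simply be extrapolated to 100-m-class LNG pool fires without experiments like these.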

  7. Examination of the lumbar vertebral column using large-screen image intensifier photofluorography

    Energy Technology Data Exchange (ETDEWEB)

    Soimakallio, S.; Manninen, H.; Mahlamaeki, S.

    1985-01-01

    The OPTILUX 57 device with its large image intensifying screen is very efficient in visualizing the lumbar vertebrae. The article explains the techniques and summarizes results obtained in the examination of young sportsmen.

  8. Phenotypic high-throughput screening elucidates target pathway in breast cancer stem cell-like cells.

    Science.gov (United States)

    Carmody, Leigh C; Germain, Andrew R; VerPlank, Lynn; Nag, Partha P; Muñoz, Benito; Perez, Jose R; Palmer, Michelle A J

    2012-10-01

    Cancer stem cells (CSCs) are resistant to standard cancer treatments and are likely responsible for cancer recurrence, but few therapies target this subpopulation. Due to the difficulty in propagating CSCs outside of the tumor environment, previous work identified CSC-like cells by inducing human breast epithelial cells into an epithelial-to-mesenchymal transdifferentiated state (HMLE_sh_ECad). A phenotypic screen was conducted against HMLE_sh_ECad with 300,718 compounds from the Molecular Libraries Small Molecule Repository to identify selective inhibitors of CSC growth. The screen yielded 2244 hits that were evaluated for toxicity and selectivity toward an isogenic control cell line. An acyl hydrazone scaffold emerged as a potent and selective scaffold targeting HMLE_sh_ECad. Fifty-three analogues were acquired and tested; compounds ranged in potency from 790 nM to inactive against HMLE_sh_ECad. Of the analogues, ML239 was best-in-class with an IC50 = 1.18 µM against HMLE_sh_ECad, demonstrated a >23-fold selectivity over the control line, and was toxic to another CSC-like line, HMLE_shTwist, and a breast carcinoma cell line, MDA-MB-231. Gene expression studies conducted with ML239-treated cells showed altered gene expression in the NF-κB pathway in the HMLE_sh_ECad line but not in the isogenic control line. Future studies will be directed toward the identification of ML239 target(s).

  9. Geospatial Optimization of Siting Large-Scale Solar Projects

    Energy Technology Data Exchange (ETDEWEB)

    Macknick, Jordan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Quinby, Ted [National Renewable Energy Lab. (NREL), Golden, CO (United States); Caulfield, Emmet [Stanford Univ., CA (United States); Gerritsen, Margot [Stanford Univ., CA (United States); Diffendorfer, Jay [U.S. Geological Survey, Boulder, CO (United States); Haines, Seth [U.S. Geological Survey, Boulder, CO (United States)

    2014-03-01

    Recent policy and economic conditions have encouraged a renewed interest in developing large-scale solar projects in the U.S. Southwest. However, siting large-scale solar projects is complex. In addition to the quality of the solar resource, solar developers must take into consideration many environmental, social, and economic factors when evaluating a potential site. This report describes a proof-of-concept, Web-based Geographical Information Systems (GIS) tool that evaluates multiple user-defined criteria in an optimization algorithm to inform discussions and decisions regarding the locations of utility-scale solar projects. Existing siting recommendations for large-scale solar projects from governmental and non-governmental organizations are not consistent with each other, are often not transparent in methods, and do not take into consideration the differing priorities of stakeholders. The siting assistance GIS tool we have developed improves upon the existing siting guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.
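The multi-criteria evaluation at the heart of such a siting tool is commonly a weighted overlay of normalized criterion rasters. A minimal sketch, assuming min-max normalization and a weighted linear combination (the criteria, weights, and tiny rasters below are illustrative assumptions, not the tool's actual algorithm or data):

```python
import numpy as np

def site_suitability(layers, weights):
    """Weighted linear combination of criterion rasters: each layer is
    min-max normalized to [0, 1], then combined with weights scaled to
    sum to one, yielding a suitability score per grid cell."""
    total = sum(weights.values())
    score = np.zeros_like(next(iter(layers.values())), dtype=float)
    for name, layer in layers.items():
        layer = np.asarray(layer, dtype=float)
        lo, hi = layer.min(), layer.max()
        norm = (layer - lo) / (hi - lo) if hi > lo else np.zeros_like(layer)
        score += (weights[name] / total) * norm
    return score

# Tiny illustrative rasters: solar resource and grid proximity.
layers = {"solar": np.array([[0.0, 1.0], [0.5, 0.2]]),
          "grid":  np.array([[1.0, 0.0], [0.5, 0.8]])}
score = site_suitability(layers, {"solar": 2.0, "grid": 1.0})
```

Exposing the weights as user inputs is what makes such a tool "user-driven": different stakeholders can rerun the same overlay with their own priorities and compare the resulting suitability surfaces.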

  10. State-of-the-Art in GPU-Based Large-Scale Volume Visualization

    KAUST Repository

    Beyer, Johanna

    2015-05-01

    This survey gives an overview of the current state of the art in GPU techniques for interactive large-scale volume visualization. Modern techniques in this field have brought about a sea change in how interactive visualization and analysis of giga-, tera- and petabytes of volume data can be enabled on GPUs. In addition to combining the parallel processing power of GPUs with out-of-core methods and data streaming, a major enabler for interactivity is making both the computational and the visualization effort proportional to the amount and resolution of data that is actually visible on screen, i.e. 'output-sensitive' algorithms and system designs. This leads to recent output-sensitive approaches that are 'ray-guided', 'visualization-driven' or 'display-aware'. In this survey, we focus on these characteristics and propose a new categorization of GPU-based large-scale volume visualization techniques based on the notions of actual output-resolution visibility and the current working set of volume bricks-the current subset of data that is minimally required to produce an output image of the desired display resolution. Furthermore, we discuss the differences and similarities of different rendering and data traversal strategies in volume rendering by putting them into a common context-the notion of address translation. For our purposes here, we view parallel (distributed) visualization using clusters as an orthogonal set of techniques that we do not discuss in detail but that can be used in conjunction with what we present in this survey. © 2015 The Eurographics Association and John Wiley & Sons Ltd.
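The "current working set of volume bricks" and the address-translation idea can be modeled on the CPU as a small page-table-plus-pool structure. A toy sketch (names and the LRU eviction policy are assumptions; the systems surveyed implement the analogous structure on the GPU):

```python
from collections import OrderedDict

class BrickCache:
    """Toy model of the brick working set: maps virtual brick
    coordinates to slots in a fixed-size brick pool, evicting the
    least-recently-used brick when the pool is full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.slots = OrderedDict()  # virtual coord -> pool slot index

    def request(self, coord):
        """Return (slot, was_resident): a page-table lookup followed,
        on a miss, by an on-demand upload into a (possibly evicted) slot."""
        if coord in self.slots:
            self.slots.move_to_end(coord)          # refresh LRU position
            return self.slots[coord], True
        if len(self.slots) >= self.capacity:
            _, slot = self.slots.popitem(last=False)  # evict LRU brick
        else:
            slot = len(self.slots)
        self.slots[coord] = slot
        return slot, False
```

Output sensitivity enters through which bricks are requested at all: a ray-guided renderer only calls `request` for bricks that visible rays actually touch, at the resolution the display needs, so the working set stays far smaller than the full volume.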

  11. State-of-the-Art in GPU-Based Large-Scale Volume Visualization

    KAUST Repository

    Beyer, Johanna; Hadwiger, Markus; Pfister, Hanspeter

    2015-01-01

    This survey gives an overview of the current state of the art in GPU techniques for interactive large-scale volume visualization. Modern techniques in this field have brought about a sea change in how interactive visualization and analysis of giga-, tera- and petabytes of volume data can be enabled on GPUs. In addition to combining the parallel processing power of GPUs with out-of-core methods and data streaming, a major enabler for interactivity is making both the computational and the visualization effort proportional to the amount and resolution of data that is actually visible on screen, i.e. 'output-sensitive' algorithms and system designs. This leads to recent output-sensitive approaches that are 'ray-guided', 'visualization-driven' or 'display-aware'. In this survey, we focus on these characteristics and propose a new categorization of GPU-based large-scale volume visualization techniques based on the notions of actual output-resolution visibility and the current working set of volume bricks-the current subset of data that is minimally required to produce an output image of the desired display resolution. Furthermore, we discuss the differences and similarities of different rendering and data traversal strategies in volume rendering by putting them into a common context-the notion of address translation. For our purposes here, we view parallel (distributed) visualization using clusters as an orthogonal set of techniques that we do not discuss in detail but that can be used in conjunction with what we present in this survey. © 2015 The Eurographics Association and John Wiley & Sons Ltd.

  12. Large-scale Agricultural Land Acquisitions in West Africa | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    This project will examine large-scale agricultural land acquisitions in nine West African countries -Burkina Faso, Guinea-Bissau, Guinea, Benin, Mali, Togo, Senegal, Niger, and Côte d'Ivoire. ... They will use the results to increase public awareness and knowledge about the consequences of large-scale land acquisitions.

  13. Large-scale motions in the universe: a review

    International Nuclear Information System (INIS)

    Burstein, D.

    1990-01-01

    The expansion of the universe can be retarded in localised regions within the universe both by the presence of gravity and by non-gravitational motions generated in the post-recombination universe. The motions of galaxies thus generated are called 'peculiar motions', and the amplitudes, size scales and coherence of these peculiar motions are among the most direct records of the structure of the universe. As such, measurements of these properties of the present-day universe provide some of the severest tests of cosmological theories. This is a review of the current evidence for large-scale motions of galaxies out to a distance of ∼5000 km s -1 (in an expanding universe, distance is proportional to radial velocity). 'Large-scale' in this context refers to motions that are correlated over size scales larger than the typical sizes of groups of galaxies, up to and including the size of the volume surveyed. To orient the reader into this relatively new field of study, a short modern history is given together with an explanation of the terminology. Careful consideration is given to the data used to measure the distances, and hence the peculiar motions, of galaxies. The evidence for large-scale motions is presented in a graphical fashion, using only the most reliable data for galaxies spanning a wide range in optical properties and over the complete range of galactic environments. The kinds of systematic errors that can affect this analysis are discussed, and the reliability of these motions is assessed. The predictions of two models of large-scale motion are compared to the observations, and special emphasis is placed on those motions in which our own Galaxy directly partakes. (author)

  14. State of the Art in Large-Scale Soil Moisture Monitoring

    Science.gov (United States)

    Ochsner, Tyson E.; Cosh, Michael Harold; Cuenca, Richard H.; Dorigo, Wouter; Draper, Clara S.; Hagimoto, Yutaka; Kerr, Yan H.; Larson, Kristine M.; Njoku, Eni Gerald; Small, Eric E.; et al.

    2013-01-01

    Soil moisture is an essential climate variable influencing land atmosphere interactions, an essential hydrologic variable impacting rainfall runoff processes, an essential ecological variable regulating net ecosystem exchange, and an essential agricultural variable constraining food security. Large-scale soil moisture monitoring has advanced in recent years creating opportunities to transform scientific understanding of soil moisture and related processes. These advances are being driven by researchers from a broad range of disciplines, but this complicates collaboration and communication. For some applications, the science required to utilize large-scale soil moisture data is poorly developed. In this review, we describe the state of the art in large-scale soil moisture monitoring and identify some critical needs for research to optimize the use of increasingly available soil moisture data. We review representative examples of 1) emerging in situ and proximal sensing techniques, 2) dedicated soil moisture remote sensing missions, 3) soil moisture monitoring networks, and 4) applications of large-scale soil moisture measurements. Significant near-term progress seems possible in the use of large-scale soil moisture data for drought monitoring. Assimilation of soil moisture data for meteorological or hydrologic forecasting also shows promise, but significant challenges related to model structures and model errors remain. Little progress has been made yet in the use of large-scale soil moisture observations within the context of ecological or agricultural modeling. Opportunities abound to advance the science and practice of large-scale soil moisture monitoring for the sake of improved Earth system monitoring, modeling, and forecasting.

  15. Tracking and tracing of participants in two large cancer screening trials.

    Science.gov (United States)

    Marcus, Pamela M; Childs, Jeffery; Gahagan, Betsy; Gren, Lisa H

    2012-07-01

    Many clinical trials rely on participant report to first learn about study events. It is therefore important to have current contact information and the ability to locate participants should information become outdated. The Prostate, Lung, Colorectal and Ovarian Cancer Screening Trial (PLCO) and the Lung Screening Study (LSS) component of the National Lung Screening Trial, two large randomized cancer screening trials, enrolled almost 190,000 participants on whom annual contact was necessary. Ten screening centers participated in both trials. Centers developed methods to track participants and trace them when necessary. We describe the methods used to keep track of participants and trace them when lost, and the extent to which each method was used. Screening center coordinators were asked, using a self-administered paper questionnaire, to rate the extent to which specific tracking and tracing methods were used. Many methods were used by the screening centers, including telephone calls, mail, and internet searches. The most extensively used methods involved telephoning the participant on his or her home or cell phone, or telephoning a person identified by the participant as someone who would know about the participant's whereabouts. Internet searches were used extensively as well; these included searches on names, reverse-lookup searches (on addresses or telephone numbers) and searches of the Social Security Death Index. Over time, the percentage of participants requiring tracing decreased. Telephone communication and internet services were useful in keeping track of PLCO and LSS participants and tracing them when contact information was no longer valid. Published by Elsevier Inc.

  16. Evaluation of the French version of the Vulnerability to Abuse Screening Scale (VASS), an elder abuse screening tool.

    Science.gov (United States)

    Grenier, Florian; Capriz, Françoise; Lacroix-Hugues, Virginie; Paysant, François; Pradier, Christian; Franco, Alain

    2016-06-01

    Elder abuse is a major public health problem: worldwide, an estimated 4 to 10% of people over 65 experience abuse, yet general practitioners report only 2% of elder abuse cases. Furthermore, published evaluations of elder abuse screening tests have been unsatisfactory. The aim was to evaluate the elder abuse screening performance of the Vulnerability to Abuse Screening Scale (VASS) so that it can be proposed to physicians. The VASS was translated into French. In this quantitative, prospective study, the answers of people over 65 were analysed and compared, in a blinded manner, with the assessments of social workers. 200 patients were included between March and May 2012 at the CHU of Cimiez, Nice. The screen identified 104 patients at risk of abuse; the social workers (the gold standard) confirmed 40 cases of abuse, i.e. 20% of those screened. At a test score of 1, this corresponds to a sensitivity of 90.9%, a specificity of 49.7% and a negative predictive value of 96.1%. The VASS proved useful for detecting elderly people at risk of abuse, but it is poorly discriminating and not suited to patients with cognitive disorders. It is a usable screening tool by default, and more sensitive than other tests in the literature. However, these results raise the question of the usefulness of such screening tools compared with the education of physicians, which has proved successful in this area.
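The figures reported in screening studies like this one all derive from a 2×2 table of screen result against gold standard. A minimal sketch (the counts below are illustrative round numbers, not a reconstruction of the study's data):

```python
def screen_metrics(tp, fp, fn, tn):
    """Standard screening-test metrics from a 2x2 confusion matrix
    (screen result vs. gold-standard assessment)."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
        "prevalence": (tp + fn) / (tp + fp + fn + tn),
    }

# Illustrative counts only: 200 screened, 40 truly abused.
m = screen_metrics(tp=36, fp=68, fn=4, tn=92)
```

A high NPV combined with modest specificity is the typical profile of a sensitive rule-out screen: few true cases slip through, at the cost of many false positives requiring follow-up, which matches the pattern reported above.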

  17. Examination of the lumbar vertebral column using large-screen image intensifier photofluorography

    International Nuclear Information System (INIS)

    Soimakallio, S.; Manninen, H.; Mahlamaeki, S.; Kuopio Central Hospital

    1985-01-01

    The OPTILUX 57 device with its large image intensifying screen is very efficient in visualizing the lumbar vertebrae. The article explains the techniques and summarizes results obtained in the examination of young sportsmen. (orig.) [de

  18. A route to explosive large-scale magnetic reconnection in a super-ion-scale current sheet

    Directory of Open Access Journals (Sweden)

    K. G. Tanaka

    2009-01-01

    Full Text Available How to trigger magnetic reconnection is one of the most interesting and important problems in space plasma physics. Recently, electron temperature anisotropy (αeo=Te⊥/Te||) at the center of a current sheet and the non-local effect of the lower-hybrid drift instability (LHDI) that develops at the current sheet edges have attracted attention in this context. In addition to these effects, here we also study the effects of ion temperature anisotropy (αio=Ti⊥/Ti||). Electron anisotropy effects are known to be ineffective in a current sheet whose thickness is of ion scale. In this range of current sheet thickness, the LHDI effects are shown to weaken substantially with a small increase in thickness, and the saturation level obtained is too low for large-scale reconnection to be achieved. We then investigate whether introducing electron and ion temperature anisotropies in the initial stage would couple with the LHDI effects to revive quick triggering of large-scale reconnection in a super-ion-scale current sheet. The results are as follows. (1) The initial electron temperature anisotropy is consumed very quickly when a number of minuscule magnetic islands (each with a lateral length 1.5~3 times the ion inertial length) form. These minuscule islands do not coalesce into a large-scale island that would enable large-scale reconnection. (2) The subsequent LHDI effects disturb the current sheet filled with the small islands. This substantially accelerates the triggering time scale but does not enhance the saturation level of reconnected flux. (3) When the ion temperature anisotropy is added, it survives through the small-island formation stage and leads to even quicker triggering once the LHDI effects set in. Furthermore, the saturation level is elevated by a factor of ~2, and large-scale reconnection is achieved only in this case. Comparison with two-dimensional simulations that exclude the LHDI effects confirms that the saturation level

  19. Large-scale Labeled Datasets to Fuel Earth Science Deep Learning Applications

    Science.gov (United States)

    Maskey, M.; Ramachandran, R.; Miller, J.

    2017-12-01

    Deep learning has revolutionized computer vision and natural language processing with various algorithms scaled using high-performance computing. However, generic large-scale labeled datasets such as the ImageNet are the fuel that drives the impressive accuracy of deep learning results. Large-scale labeled datasets already exist in domains such as medical science, but creating them in the Earth science domain is a challenge. While there are ways to apply deep learning using limited labeled datasets, there is a need in the Earth sciences for creating large-scale labeled datasets for benchmarking and scaling deep learning applications. At the NASA Marshall Space Flight Center, we are using deep learning for a variety of Earth science applications where we have encountered the need for large-scale labeled datasets. We will discuss our approaches for creating such datasets and why these datasets are just as valuable as deep learning algorithms. We will also describe successful usage of these large-scale labeled datasets with our deep learning based applications.

  20. Mobile Phenotyping System Using an Aeromotively Stabilized Cable-Driven Robot

    Science.gov (United States)

    Newman, M. B.; Zygielbaum, A. I.

    2017-12-01

    Agricultural researchers are constantly attempting to generate superior agricultural crops. Whether this means creating crops with greater yield, crops that are more resilient to disease, or crops that can tolerate harsh environments with fewer failures, test plots of these experimental crops must be studied in real-world environments with minimal invasion to determine how they will perform in full-scale agricultural settings. To monitor these crops without interfering with their natural growth, a noninvasive sensor system has been implemented. This system, instituted by the College of Agricultural Sciences and Natural Resources at the University of Nebraska - Lincoln (UNL), uses a system of poles, cables, and winches to support and maneuver a sensor platform above the crops at an outdoor phenotyping site. In this work, we improve upon the UNL outdoor phenotyping system by presenting a concept design for a mobile, cable-driven phenotyping system, as opposed to a permanent phenotyping facility. One major challenge in large-scale, cable-driven robots is stability of the end-effector. To address this, the mobile system uses a novel method of end-effector stabilization based on an onboard rotor drive system, herein referred to as the Instrument Platform Aeromotive Stabilization System (IPASS). A prototype system is developed and analyzed to determine the viability of IPASS.

  1. Methodology for high-throughput field phenotyping of canopy temperature using airborne thermography

    Directory of Open Access Journals (Sweden)

    David Matthew Deery

    2016-12-01

    Full Text Available Lower canopy temperature (CT), resulting from increased stomatal conductance, has been associated with increased yield in wheat. Historically, CT has been measured with hand-held infrared thermometers. Using the hand-held CT method on large field trials is problematic, mostly because measurements are confounded by temporal weather changes during the time required to measure all plots. The hand-held CT method is laborious, and yet the resulting heritability is low, thereby reducing confidence in selection in large-scale breeding endeavours. We have developed a reliable and scalable crop phenotyping method for assessing CT in large field experiments. The method involves airborne thermography from a manned helicopter using a radiometrically-calibrated thermal camera. Thermal image data are acquired from large experiments in the order of seconds, thereby enabling simultaneous measurement of CT on potentially 1,000s of plots. Effects of temporal weather variation when phenotyping large experiments using hand-held infrared thermometers are therefore reduced. The method is designed for cost-effective, large-scale use by the non-technical user and includes custom-developed software for data processing to obtain CT data on a single-plot basis for analysis. Broad-sense heritability was routinely greater than 0.50, and as high as 0.79, for airborne thermography CT measured near anthesis on a wheat experiment comprising 768 plots of size 2 x 6 m. Image analysis based on the frequency distribution of temperature pixels to remove the possible influence of background soil did not improve broad-sense heritability. Total image acquisition and processing time was ca. 25 min and required only one person (excluding the helicopter pilot). The results indicate the potential to phenotype CT on large populations in genetics studies or for selection within a plant breeding program.
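    Broad-sense heritability of this kind is conventionally estimated on an entry-mean basis from one-way ANOVA variance components. The sketch below illustrates that standard calculation in plain Python; the genotype names, canopy temperatures and balanced three-replicate design are invented for illustration and are not data from the study.

```python
from statistics import mean

def broad_sense_heritability(plot_values):
    """Entry-mean broad-sense heritability H^2 = Vg / (Vg + Ve / r),
    with variance components taken from a balanced one-way ANOVA.
    plot_values: dict mapping genotype -> list of r replicate plot values."""
    genotypes = list(plot_values)
    g = len(genotypes)                           # number of genotypes
    r = len(plot_values[genotypes[0]])           # replicates per genotype
    grand = mean(v for reps in plot_values.values() for v in reps)
    # Between-genotype and residual sums of squares
    ss_g = r * sum((mean(reps) - grand) ** 2 for reps in plot_values.values())
    ss_e = sum((v - mean(reps)) ** 2 for reps in plot_values.values() for v in reps)
    ms_g = ss_g / (g - 1)                        # genotype mean square
    ms_e = ss_e / (g * (r - 1))                  # residual mean square
    var_g = max((ms_g - ms_e) / r, 0.0)          # genotypic variance component
    return var_g / (var_g + ms_e / r)

# Illustrative canopy temperatures (degrees C) for 3 genotypes x 3 plots
ct = {"G1": [20.0, 20.5, 20.2], "G2": [22.0, 22.4, 22.1], "G3": [24.0, 24.3, 23.8]}
h2 = broad_sense_heritability(ct)
```

    Heritabilities in the 0.50-0.79 range reported by the authors correspond to genotypic and plot-level error variances of broadly comparable magnitude.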

  2. A Class of Diacylglycerol Acyltransferase 1 Inhibitors Identified by a Combination of Phenotypic High-throughput Screening, Genomics, and Genetics

    Directory of Open Access Journals (Sweden)

    Kirsten Tschapalda

    2016-06-01

    Full Text Available Excess lipid storage is an epidemic problem in human populations. Thus, the identification of small molecules to treat or prevent lipid storage-related metabolic complications is of great interest. Here we screened >320,000 compounds for their ability to prevent a cellular lipid accumulation phenotype. We used fly cells because the multifarious tools available for this organism should facilitate unraveling the mechanism of action of active small molecules. Of the several hundred lipid storage inhibitors identified in the primary screen, we concentrated on three structurally diverse and potent compound classes that were active in cells of multiple species (including human) and showed negligible cytotoxicity. Together with Drosophila in vivo epistasis experiments, RNA-Seq expression profiles suggested that the target of one of the small molecules was diacylglycerol acyltransferase 1 (DGAT1), a key enzyme in the production of triacylglycerols and a prominent human drug target. We confirmed this prediction by biochemical and enzymatic activity tests.

  3. Large-scale structure observables in general relativity

    International Nuclear Information System (INIS)

    Jeong, Donghui; Schmidt, Fabian

    2015-01-01

    We review recent studies that rigorously define several key observables of the large-scale structure of the Universe in a general relativistic context. Specifically, we consider (i) redshift perturbation of cosmic clock events; (ii) distortion of cosmic rulers, including weak lensing shear and magnification; and (iii) observed number density of tracers of the large-scale structure. We provide covariant and gauge-invariant expressions of these observables. Our expressions are given for a linearly perturbed flat Friedmann–Robertson–Walker metric including scalar, vector, and tensor metric perturbations. While we restrict ourselves to linear order in perturbation theory, the approach can be straightforwardly generalized to higher order. (paper)

  4. Fatigue Analysis of Large-scale Wind turbine

    Directory of Open Access Journals (Sweden)

    Zhu Yongli

    2017-01-01

    Full Text Available This paper investigates fatigue damage of the top flange of a large-scale wind turbine generator. A finite element model of the top flange connection system is established with the finite element analysis software MSC.Marc/Mentat and its fatigue strain is analysed. Fatigue load cases for the flange are simulated with the Bladed software, and the flange fatigue load spectrum is obtained with the rain-flow counting method. Finally, fatigue analysis of the top flange is carried out with the fatigue analysis software MSC.Fatigue and the Palmgren-Miner linear cumulative damage theory. The results provide a new approach to flange fatigue analysis for large-scale wind turbine generators and have practical engineering value.
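    The Palmgren-Miner theory mentioned in the abstract simply sums the fractional damage contributed by each bin of the rain-flow-counted load spectrum. A minimal sketch follows; the Basquin-type S-N curve and its constants C and m are illustrative placeholders, not values from the paper.

```python
def miner_damage(load_spectrum, cycles_to_failure):
    """Palmgren-Miner linear cumulative damage: D = sum(n_i / N_i).
    load_spectrum: list of (stress_range, counted_cycles) pairs, e.g.
    produced by rain-flow counting. Failure is conventionally predicted at D >= 1."""
    return sum(n / cycles_to_failure(s) for s, n in load_spectrum)

def basquin_sn(stress_range, C=1e12, m=3.0):
    """Hypothetical Basquin-type S-N curve: N(S) = C / S^m."""
    return C / stress_range ** m

# Two illustrative spectrum bins: (stress range in MPa, cycle count)
spectrum = [(100.0, 1e5), (50.0, 1e6)]
damage = miner_damage(spectrum, basquin_sn)   # 0.1 + 0.125 = 0.225
```

    In practice the counted cycles come from the rain-flow algorithm applied to the simulated load time history, and the S-N curve from the flange material and weld class.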

  5. Real-time simulation of large-scale floods

    Science.gov (United States)

    Liu, Q.; Qin, Y.; Li, G. D.; Liu, Z.; Cheng, D. J.; Zhao, Y. H.

    2016-08-01

    Given the complexity of real-time water conditions, real-time simulation of large-scale floods is very important for flood prevention practice. Model robustness and running efficiency are two critical factors in successful real-time flood simulation. This paper proposes a robust, two-dimensional, shallow water model based on the unstructured Godunov-type finite volume method. A robust wet/dry front method is used to enhance numerical stability, and an adaptive method is proposed to improve running efficiency. The proposed model is used for large-scale flood simulation on real topography. Results compared to those of MIKE21 show the strong performance of the proposed model.

  6. [Adverse Effect Predictions Based on Computational Toxicology Techniques and Large-scale Databases].

    Science.gov (United States)

    Uesawa, Yoshihiro

    2018-01-01

    Understanding the features of chemical structures related to the adverse effects of drugs is useful for identifying potential adverse effects of new drugs. This can be based on the limited information available from post-marketing surveillance, assessment of the potential toxicities of metabolites and illegal drugs with unclear characteristics, screening of lead compounds at the drug discovery stage, and identification of leads for the discovery of new pharmacological mechanisms. The present paper describes techniques used in computational toxicology to investigate the content of large-scale spontaneous report databases of adverse effects, illustrated with examples. Furthermore, volcano plotting, a new visualization method for clarifying the relationships between drugs and adverse effects via comprehensive analyses, is introduced. These analyses may produce a great amount of data that can be applied to drug repositioning.
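    A widely used signal metric in disproportionality analyses of such spontaneous-report databases is the reporting odds ratio (ROR) computed from a 2x2 contingency table; volcano-style plots typically place an effect size such as log(ROR) against a significance measure. The sketch below shows the generic ROR calculation with invented counts; the paper's specific method may differ.

```python
import math

def reporting_odds_ratio(a, b, c, d):
    """2x2 contingency counts from a spontaneous-report database:
    a = reports with the drug AND the adverse event
    b = reports with the drug, without the event
    c = reports with the event, without the drug
    d = reports with neither
    Returns (ROR, log ROR, standard error of log ROR)."""
    ror = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return ror, math.log(ror), se

# Illustrative counts only
ror, log_ror, se = reporting_odds_ratio(10, 90, 100, 9900)
```

    Each drug-event pair then contributes one point (log ROR, significance) to the volcano plot, making disproportionate pairs stand out visually.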

  7. Large-scale numerical simulations of plasmas

    International Nuclear Information System (INIS)

    Hamaguchi, Satoshi

    2004-01-01

    Recent trends in large-scale simulations of fusion and processing plasmas are briefly summarized. Many advanced simulation techniques have been developed for fusion plasmas, and some of these techniques are now applied to analyses of processing plasmas. (author)

  8. Breast cancer screening halves the risk of breast cancer death: a case-referent study

    NARCIS (Netherlands)

    Paap, Ellen; Verbeek, André L. M.; Botterweck, Anita A. M.; van Doorne-Nagtegaal, Heidi J.; Imhof-Tas, Mechli; de Koning, Harry J.; Otto, Suzie J.; de Munck, Linda; van der Steen, Annemieke; Holland, Roland; den Heeten, Gerard J.; Broeders, Mireille J. M.

    2014-01-01

    Large-scale epidemiologic studies have consistently demonstrated the effectiveness of mammographic screening programs, however the benefits are still subject to debate. We estimated the effect of the Dutch screening program on breast cancer mortality. In a large multi-region case-referent study, we

  9. Nearly incompressible fluids: Hydrodynamics and large scale inhomogeneity

    International Nuclear Information System (INIS)

    Hunana, P.; Zank, G. P.; Shaikh, D.

    2006-01-01

    A system of hydrodynamic equations in the presence of large-scale inhomogeneities for a high plasma beta solar wind is derived. The theory is derived under the assumption of low turbulent Mach number and is developed for the flows where the usual incompressible description is not satisfactory and a full compressible treatment is too complex for any analytical studies. When the effects of compressibility are incorporated only weakly, a new description, referred to as 'nearly incompressible hydrodynamics', is obtained. The nearly incompressible theory was originally applied to homogeneous flows. However, large-scale gradients in density, pressure, temperature, etc., are typical in the solar wind and it was unclear how inhomogeneities would affect the usual incompressible and nearly incompressible descriptions. In the homogeneous case, the lowest order expansion of the fully compressible equations leads to the usual incompressible equations, followed at higher orders by the nearly incompressible equations, as introduced by Zank and Matthaeus. With this work we show that the inclusion of large-scale inhomogeneities (in this case a time-independent and radially symmetric background solar wind) modifies the leading-order incompressible description of solar wind flow. We find, for example, that the divergence of velocity fluctuations is nonsolenoidal and that density fluctuations can be described to leading order as a passive scalar. Locally (for small lengthscales), this system of equations converges to the usual incompressible equations, and we therefore use the term 'locally incompressible' to describe the equations. This term should be distinguished from the term 'nearly incompressible', which is reserved for higher-order corrections. Furthermore, we find that density fluctuations scale linearly with Mach number, in contrast to the original homogeneous nearly incompressible theory, in which density fluctuations scale with the square of the Mach number. Inhomogeneous nearly

  10. The Microphenotron: a robotic miniaturized plant phenotyping platform with diverse applications in chemical biology

    KAUST Repository

    Burrell, Thomas

    2017-03-01

    Background Chemical genetics provides a powerful alternative to conventional genetics for understanding gene function. However, its application to plants has been limited by the lack of a technology that allows detailed phenotyping of whole-seedling development in the context of a high-throughput chemical screen. We have therefore sought to develop an automated micro-phenotyping platform that would allow both root and shoot development to be monitored under conditions where the phenotypic effects of large numbers of small molecules can be assessed. Results The ‘Microphenotron’ platform uses 96-well microtitre plates to deliver chemical treatments to seedlings of Arabidopsis thaliana L. and is based around four components: (a) the ‘Phytostrip’, a novel seedling growth device that enables chemical treatments to be combined with the automated capture of images of developing roots and shoots; (b) an illuminated robotic platform that uses a commercially available robotic manipulator to capture images of developing shoots and roots; (c) software to control the sequence of robotic movements and integrate these with the image capture process; (d) purpose-made image analysis software for automated extraction of quantitative phenotypic data. Imaging of each plate (representing 80 separate assays) takes 4 min and can easily be performed daily for time-course studies. As currently configured, the Microphenotron has a capacity of 54 microtitre plates in a growth room footprint of 2.1 m², giving a potential throughput of up to 4320 chemical treatments in a typical 10-day experiment. The Microphenotron has been validated by using it to screen a collection of 800 natural compounds for qualitative effects on root development and to perform a quantitative analysis of the effects of a range of concentrations of nitrate and ammonium on seedling development. Conclusions The Microphenotron is an automated screening platform that for the first time is able to combine large

  11. Performance Health Monitoring of Large-Scale Systems

    Energy Technology Data Exchange (ETDEWEB)

    Rajamony, Ram [IBM Research, Austin, TX (United States)

    2014-11-20

    This report details the progress made on the ASCR-funded project Performance Health Monitoring for Large Scale Systems. A large-scale application may not achieve its full performance potential due to degraded performance of even a single subsystem. Detecting performance faults, isolating them, and taking remedial action is critical for the scale of systems on the horizon. PHM aims to develop techniques and tools that can be used to identify and mitigate such performance problems. We accomplish this through two main aspects. The PHM framework encompasses diagnostics, system monitoring, fault isolation, and performance evaluation capabilities that indicate when a performance fault has been detected, either due to an anomaly present in the system itself or due to contention for shared resources between concurrently executing jobs. Software components called the PHM Control system then build upon the capabilities provided by the PHM framework to mitigate degradation caused by performance problems.

  12. The impact of new forms of large-scale general practice provider collaborations on England's NHS: a systematic review.

    Science.gov (United States)

    Pettigrew, Luisa M; Kumpunen, Stephanie; Mays, Nicholas; Rosen, Rebecca; Posaner, Rachel

    2018-03-01

    Over the past decade, collaboration between general practices in England to form new provider networks and large-scale organisations has been driven largely by grassroots action among GPs. However, it is now being increasingly advocated for by national policymakers. Expectations of what scaling up general practice in England will achieve are significant. To review the evidence of the impact of new forms of large-scale general practice provider collaborations in England. Systematic review. Embase, MEDLINE, Health Management Information Consortium, and Social Sciences Citation Index were searched for studies reporting the impact on clinical processes and outcomes, patient experience, workforce satisfaction, or costs of new forms of provider collaborations between general practices in England. A total of 1782 publications were screened. Five studies met the inclusion criteria and four examined the same general practice networks, limiting generalisability. Substantial financial investment was required to establish the networks and the associated interventions that were targeted at four clinical areas. Quality improvements were achieved through standardised processes, incentives at network level, information technology-enabled performance dashboards, and local network management. The fifth study of a large-scale multisite general practice organisation showed that it may be better placed to implement safety and quality processes than conventional practices. However, unintended consequences may arise, such as perceptions of disenfranchisement among staff and reductions in continuity of care. Good-quality evidence of the impacts of scaling up general practice provider organisations in England is scarce. As more general practice collaborations emerge, evaluation of their impacts will be important to understand which work, in which settings, how, and why. © British Journal of General Practice 2018.

  13. Data management in large-scale collaborative toxicity studies: how to file experimental data for automated statistical analysis.

    Science.gov (United States)

    Stanzel, Sven; Weimer, Marc; Kopp-Schneider, Annette

    2013-06-01

    High-throughput screening approaches are carried out for the toxicity assessment of a large number of chemical compounds. In such large-scale in vitro toxicity studies several hundred or thousand concentration-response experiments are conducted. The automated evaluation of concentration-response data using statistical analysis scripts saves time and yields more consistent results in comparison to data analysis performed by the use of menu-driven statistical software. Automated statistical analysis requires that concentration-response data are available in a standardised data format across all compounds. To obtain consistent data formats, a standardised data management workflow must be established, including guidelines for data storage, data handling and data extraction. In this paper two procedures for data management within large-scale toxicological projects are proposed. Both procedures are based on Microsoft Excel files as the researcher's primary data format and use a computer programme to automate the handling of data files. The first procedure assumes that data collection has not yet started whereas the second procedure can be used when data files already exist. Successful implementation of the two approaches into the European project ACuteTox is illustrated. Copyright © 2012 Elsevier Ltd. All rights reserved.
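    The key step the authors describe is enforcing one standardised data format across all compounds before scripted analysis can run. A minimal stdlib sketch of such a validation-and-extraction step is shown below; the column names and CSV input are hypothetical, since the ACuteTox workflow itself was built around Microsoft Excel files handled by a dedicated computer programme.

```python
import csv
import io

REQUIRED_COLUMNS = ("compound", "concentration", "response")

def standardise(csv_text):
    """Validate one exported concentration-response file and return rows
    with numeric fields parsed, ready for pooled automated analysis."""
    reader = csv.DictReader(io.StringIO(csv_text))
    missing = [c for c in REQUIRED_COLUMNS if c not in (reader.fieldnames or [])]
    if missing:
        raise ValueError(f"non-standard file, missing columns: {missing}")
    return [
        {"compound": row["compound"],
         "concentration": float(row["concentration"]),
         "response": float(row["response"])}
        for row in reader
    ]
```

    Rejecting non-conforming files at ingestion time, rather than during analysis, is what makes fully automated statistical scripts feasible across hundreds of experiments.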

  14. Large scale phenotyping and data analysis of pepper genotypes in the EU-SPICY project

    NARCIS (Netherlands)

    Dieleman, J.A.; Magán, J.J.; Wubs, A.M.; Palloix, A.; Lenk, S.; Glasbey, C.; Eeuwijk, van F.A.

    2012-01-01

    In breeding the best genotypes for diverse conditions, ideally the breeder should test all his crossings under all these conditions. Especially with complex physiological traits like yield, which exhibit large variation, this would require many expensive and large field trials. The EU project “Smart

  15. Large-scale chondroitin sulfate proteoglycan digestion with chondroitinase gene therapy leads to reduced pathology and modulates macrophage phenotype following spinal cord contusion injury

    NARCIS (Netherlands)

    Bartus, Katalin; James, Nicholas D; Didangelos, Athanasios; Bosch, Karen D; Verhaagen, J.; Yáñez-Muñoz, Rafael J; Rogers, John H; Schneider, Bernard L; Muir, Elizabeth M; Bradbury, Elizabeth J

    2014-01-01

    Chondroitin sulfate proteoglycans (CSPGs) inhibit repair following spinal cord injury. Here we use mammalian-compatible engineered chondroitinase ABC (ChABC) delivered via lentiviral vector (LV-ChABC) to explore the consequences of large-scale CSPG digestion for spinal cord repair. We demonstrate

  16. Learning from large scale neural simulations

    DEFF Research Database (Denmark)

    Serban, Maria

    2017-01-01

    Large-scale neural simulations have the marks of a distinct methodology which can be fruitfully deployed to advance scientific understanding of the human brain. Computer simulation studies can be used to produce surrogate observational data for better conceptual models and new how...

  17. Phenomenology of two-dimensional stably stratified turbulence under large-scale forcing

    KAUST Repository

    Kumar, Abhishek; Verma, Mahendra K.; Sukhatme, Jai

    2017-01-01

    In this paper, we characterise the scaling of energy spectra, and the interscale transfer of energy and enstrophy, for strongly, moderately and weakly stably stratified two-dimensional (2D) turbulence, restricted in a vertical plane, under large-scale random forcing. In the strongly stratified case, a large-scale vertically sheared horizontal flow (VSHF) coexists with small scale turbulence. The VSHF consists of internal gravity waves and the turbulent flow has a kinetic energy (KE) spectrum that follows an approximate k−3 scaling with zero KE flux and a robust positive enstrophy flux. The spectrum of the turbulent potential energy (PE) also approximately follows a k−3 power-law and its flux is directed to small scales. For moderate stratification, there is no VSHF and the KE of the turbulent flow exhibits Bolgiano–Obukhov scaling that transitions from a shallow k−11/5 form at large scales, to a steeper approximate k−3 scaling at small scales. The entire range of scales shows a strong forward enstrophy flux, and interestingly, large (small) scales show an inverse (forward) KE flux. The PE flux in this regime is directed to small scales, and the PE spectrum is characterised by an approximate k−1.64 scaling. Finally, for weak stratification, KE is transferred upscale and its spectrum closely follows a k−2.5 scaling, while PE exhibits a forward transfer and its spectrum shows an approximate k−1.6 power-law. For all stratification strengths, the total energy always flows from large to small scales and almost all the spectral indices are well explained by accounting for the scale-dependent nature of the corresponding flux.

  18. Phenomenology of two-dimensional stably stratified turbulence under large-scale forcing

    KAUST Repository

    Kumar, Abhishek

    2017-01-11

    In this paper, we characterise the scaling of energy spectra, and the interscale transfer of energy and enstrophy, for strongly, moderately and weakly stably stratified two-dimensional (2D) turbulence, restricted in a vertical plane, under large-scale random forcing. In the strongly stratified case, a large-scale vertically sheared horizontal flow (VSHF) coexists with small scale turbulence. The VSHF consists of internal gravity waves and the turbulent flow has a kinetic energy (KE) spectrum that follows an approximate k−3 scaling with zero KE flux and a robust positive enstrophy flux. The spectrum of the turbulent potential energy (PE) also approximately follows a k−3 power-law and its flux is directed to small scales. For moderate stratification, there is no VSHF and the KE of the turbulent flow exhibits Bolgiano–Obukhov scaling that transitions from a shallow k−11/5 form at large scales, to a steeper approximate k−3 scaling at small scales. The entire range of scales shows a strong forward enstrophy flux, and interestingly, large (small) scales show an inverse (forward) KE flux. The PE flux in this regime is directed to small scales, and the PE spectrum is characterised by an approximate k−1.64 scaling. Finally, for weak stratification, KE is transferred upscale and its spectrum closely follows a k−2.5 scaling, while PE exhibits a forward transfer and its spectrum shows an approximate k−1.6 power-law. For all stratification strengths, the total energy always flows from large to small scales and almost all the spectral indices are well explained by accounting for the scale-dependent nature of the corresponding flux.

  19. A Metric and Workflow for Quality Control in the Analysis of Heterogeneity in Phenotypic Profiles and Screens

    Science.gov (United States)

    Gough, Albert; Shun, Tongying; Taylor, D. Lansing; Schurdak, Mark

    2016-01-01

    workflow for analysis of heterogeneity in large scale biology projects. PMID:26476369

  20. Exploring the large-scale structure of Taylor–Couette turbulence through Large-Eddy Simulations

    Science.gov (United States)

    Ostilla-Mónico, Rodolfo; Zhu, Xiaojue; Verzicco, Roberto

    2018-04-01

    Large eddy simulations (LES) of Taylor-Couette (TC) flow, the flow between two co-axial and independently rotating cylinders, are performed in an attempt to explore the large-scale axially-pinned structures seen in experiments and simulations. Both static and dynamic LES models are used. The Reynolds number is kept fixed at Re = 3.4·10^4, and the radius ratio η = r_i/r_o is set to η = 0.909, limiting the effects of curvature and resulting in frictional Reynolds numbers of around Re_τ ≈ 500. Four rotation ratios from Ro_t = -0.0909 to Ro_t = 0.3 are simulated. First, the LES of TC is benchmarked for different rotation ratios. Both the Smagorinsky model with a constant of c_s = 0.1 and the dynamic model are found to produce reasonable results for no mean rotation and cyclonic rotation, but deviations increase for increasing rotation. This is attributed to the increasing anisotropic character of the fluctuations. Second, “over-damped” LES, i.e. LES with a large Smagorinsky constant, is performed and is shown to reproduce some features of the large-scale structures, even when the near-wall region is not adequately modeled. This shows the potential for using over-damped LES for fast explorations of the parameter space where large-scale structures are found.

  1. Kernel methods for large-scale genomic data analysis

    Science.gov (United States)

    Xing, Eric P.; Schaid, Daniel J.

    2015-01-01

    Machine learning, particularly kernel methods, has been demonstrated as a promising new tool to tackle the challenges imposed by today’s explosive data growth in genomics. Kernel methods provide a practical and principled approach to learning how a large number of genetic variants are associated with complex phenotypes, helping to reveal the complexity in the relationship between genetic markers and the outcome of interest. In this review, we highlight the potential key role they will have in modern genomic data processing, especially with regard to integration with classical methods for gene prioritization, prediction and data fusion. PMID:25053743
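    As a concrete instance of the kernel idea, genomic similarity between individuals is often summarised by a linear kernel over SNP genotype vectors; this matrix then replaces explicit marker-by-marker modelling in a kernel machine such as kernel ridge regression. A pure-Python sketch follows; the 0/1/2 minor-allele coding and the 1/p normalisation are common conventions assumed here, not prescriptions from the review.

```python
def linear_kernel_matrix(genotypes):
    """genotypes: list of equal-length SNP vectors coded 0/1/2
    (minor-allele counts). Returns the n x n linear kernel with
    K[i][j] = <x_i, x_j> / p, where p is the number of markers."""
    p = len(genotypes[0])
    return [[sum(a * b for a, b in zip(xi, xj)) / p for xj in genotypes]
            for xi in genotypes]

# Three illustrative individuals typed at three markers
K = linear_kernel_matrix([[0, 1, 2], [2, 1, 0], [1, 1, 1]])
```

    Because the kernel is n x n rather than p x p, this formulation scales with the number of individuals instead of the (much larger) number of variants.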

  2. Large-scale preparation of hollow graphitic carbon nanospheres

    International Nuclear Information System (INIS)

    Feng, Jun; Li, Fu; Bai, Yu-Jun; Han, Fu-Dong; Qi, Yong-Xin; Lun, Ning; Lu, Xi-Feng

    2013-01-01

    Hollow graphitic carbon nanospheres (HGCNSs) were synthesized on a large scale by a simple reaction between glucose and Mg at 550 °C in an autoclave. Characterization by X-ray diffraction, Raman spectroscopy and transmission electron microscopy demonstrates the formation of HGCNSs with an average diameter of about 10 nm and a wall thickness of a few graphene layers. The HGCNSs exhibit a reversible capacity of 391 mAh g−1 after 60 cycles when used as anode materials for Li-ion batteries. -- Graphical abstract: Hollow graphitic carbon nanospheres could be prepared on a large scale by the simple reaction between glucose and Mg at 550 °C; they exhibit electrochemical performance superior to graphite. Highlights: ► Hollow graphitic carbon nanospheres (HGCNSs) were prepared on a large scale at 550 °C. ► The preparation is simple, effective and eco-friendly. ► The in situ yielded MgO nanocrystals promote the graphitization. ► The HGCNSs exhibit electrochemical performance superior to graphite.

  3. Accelerating large-scale phase-field simulations with GPU

    Directory of Open Access Journals (Sweden)

    Xiaoming Shi

    2017-10-01

    A new package for accelerating large-scale phase-field simulations was developed using GPUs, based on the semi-implicit Fourier method. The package can solve a variety of equilibrium equations with different inhomogeneities, including long-range elastic, magnetostatic, and electrostatic interactions. Using a dedicated algorithm written in the Compute Unified Device Architecture (CUDA), the Fourier spectral iterative perturbation method was integrated into the GPU package. The Allen-Cahn equation, the Cahn-Hilliard equation, and a phase-field model with long-range interactions were each solved on the GPU to test the performance of the package. Comparing the results of the solver executed on a single CPU with those on the GPU shows that the GPU version runs up to 50 times faster. The present study therefore contributes to the acceleration of large-scale phase-field simulations and provides guidance for experiments to design large-scale functional devices.
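As a rough illustration of the semi-implicit Fourier method the package is built on, here is a minimal CPU sketch for the Allen-Cahn equation (grid size, time step, mobility and gradient coefficient are arbitrary assumptions; the actual package runs this kind of scheme on GPU via CUDA):

```python
import numpy as np

# Allen-Cahn: d(phi)/dt = -M * (phi^3 - phi - kappa * laplacian(phi)).
# Semi-implicit Fourier scheme: nonlinear term explicit, gradient term implicit.
N, dx, dt, M, kappa = 64, 1.0, 0.1, 1.0, 1.0
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)
k2 = k[:, None] ** 2 + k[None, :] ** 2          # squared wavenumbers on the grid

rng = np.random.default_rng(1)
phi = 0.01 * rng.standard_normal((N, N))        # small random initial field

for _ in range(200):
    g_hat = np.fft.fft2(phi ** 3 - phi)         # explicit nonlinear term
    phi_hat = (np.fft.fft2(phi) - dt * M * g_hat) / (1 + dt * M * kappa * k2)
    phi = np.fft.ifft2(phi_hat).real            # implicit treatment of kappa*k^2
```

Because every update is elementwise arithmetic plus FFTs, the scheme maps naturally onto GPU kernels, which is where the reported ~50x speedup comes from.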

  4. First Mile Challenges for Large-Scale IoT

    KAUST Repository

    Bader, Ahmed

    2017-03-16

    The Internet of Things is large-scale by nature. This is not only manifested by the large number of connected devices, but also by the sheer scale of spatial traffic intensity that must be accommodated, primarily in the uplink direction. To that end, cellular networks are indeed a strong first mile candidate to accommodate the data tsunami to be generated by the IoT. However, IoT devices are required in the cellular paradigm to undergo random access procedures as a precursor to resource allocation. Such procedures impose a major bottleneck that hinders cellular networks' ability to support large-scale IoT. In this article, we shed light on the random access dilemma and present a case study based on experimental data as well as system-level simulations. Accordingly, a case is built for the latent need to revisit random access procedures. A call for action is motivated by listing a few potential remedies and recommendations.

  5. Development of an algorithm for phenotypic screening of carbapenemase-producing Enterobacteriaceae in the routine laboratory.

    Science.gov (United States)

    Robert, Jérôme; Pantel, Alix; Merens, Audrey; Meiller, Elodie; Lavigne, Jean-Philippe; Nicolas-Chanoine, Marie-Hélène

    2017-01-17

    Carbapenemase-producing Enterobacteriaceae (CPE) are difficult to identify among carbapenem non-susceptible Enterobacteriaceae (NSE). We designed phenotypic strategies giving priority to high sensitivity for screening putative CPE before further testing. The presence of carbapenemase-encoding genes in ertapenem NSE (MIC > 0.5 mg/l) consecutively isolated in 80 French laboratories between November 2011 and April 2012 was determined by the Check-MDR-CT103 array method. Using the Mueller-Hinton (MH) disk diffusion method, clinical diameter breakpoints of carbapenems other than ertapenem, piperacillin+tazobactam, ticarcillin+clavulanate and cefepime, as well as diameter cut-offs for these antibiotics and temocillin, were evaluated alone or in combination to determine their performance (sensitivity, specificity, positive and negative likelihood ratios) for identifying putative CPE among these ertapenem-NSE isolates. To increase the screening specificity, these antibiotics were also tested on cloxacillin-containing MH when carbapenem-NSE isolates belonged to species producing chromosomal cephalosporinase (AmpC) other than Escherichia coli. Of the 349 ertapenem NSE, 52 (14.9%) were CPE, including 39 producing OXA-48 group carbapenemases, eight KPC and five MBL. A screening strategy based on the following diameter cut-offs, ticarcillin+clavulanate <15 mm, temocillin <15 mm, meropenem or imipenem <22 mm, and cefepime <26 mm, showed 100% sensitivity and 68.1% specificity with the best combination of likelihood ratios. The specificity increased when a diameter cut-off <32 mm for imipenem (76.1%) or meropenem (78.8%), further tested on cloxacillin-containing MH, was added to the previous strategy for AmpC-producing isolates. The proposed strategies, which increase the likelihood of CPE among ertapenem-NSE isolates, should be considered a surrogate for carbapenemase production before further CPE confirmatory testing.
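The reported cut-offs translate into a simple decision rule. The sketch below flags an isolate if any single criterion is met; the exact logical combination used in the study is an assumption here, as are the parameter names:

```python
# Hypothetical screening rule using the diameter cut-offs from the abstract
# (inhibition-zone diameters in mm from Mueller-Hinton disk diffusion).
def flag_putative_cpe(ticarcillin_clavulanate, temocillin,
                      meropenem=None, imipenem=None, cefepime=None):
    criteria = [
        ticarcillin_clavulanate < 15,
        temocillin < 15,
        any(d is not None and d < 22 for d in (meropenem, imipenem)),
        cefepime is not None and cefepime < 26,
    ]
    # Assumption: the isolate is flagged for confirmatory testing if any
    # criterion is met, which favours sensitivity over specificity.
    return any(criteria)

flag_putative_cpe(20, 10, meropenem=25, cefepime=30)  # True: temocillin < 15 mm
```

A sensitivity-first rule like this matches the stated design goal: no CPE should be missed at the screening stage, at the cost of some false positives.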

  6. Influence of Speed and Rainfall on Large-Scale Wheat Lodging from 2007 to 2014 in China.

    Directory of Open Access Journals (Sweden)

    Liyuan Niu

    Strong wind and heavy rain remain the two most important causes of large-acreage wheat (Triticum aestivum L.) lodging in China. To study the influence of wind speed and rainfall, separately as well as together, on the extent and degree of lodging, five levels of lodging severity were defined based on a combination of the lodging area and the degree of tilting. Detailed meteorological information was studied for 52 instances of large-scale lodging that occurred from 2007 to 2014. The results showed that lodging caused by strong wind alone accounted for 8% of the instances studied, lodging caused by continuous rainfall alone for 19%, and lodging caused by strong wind combined with heavy rainfall for 73%. The minimum instantaneous wind speed that could cause large-scale lodging was closely related to rainfall. Without rainfall, the wind speeds that resulted in lodging ranging in severity from slight to severe (Level 2 to Level 5) were 14.9 m/s, 19.3 m/s, 21.5 m/s, and 26.5 m/s, respectively; when accompanied by rainfall, the wind speed that resulted in lodging of the same severity decreased linearly with increasing rainfall. These results will be particularly useful in preventing and alleviating wheat lodging as well as in screening wheat varieties for good lodging resistance.
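The dry-weather thresholds reported above translate directly into a severity classifier. This is a sketch for the no-rainfall case only; the reported linear decrease of thresholds with rainfall is not modelled:

```python
# Severity levels from the abstract: without rainfall, Level 2-5 lodging
# required instantaneous wind speeds of at least 14.9, 19.3, 21.5, 26.5 m/s.
def lodging_level_no_rain(wind_speed_ms):
    thresholds = [(26.5, 5), (21.5, 4), (19.3, 3), (14.9, 2)]
    for speed, level in thresholds:
        if wind_speed_ms >= speed:
            return level
    return 0  # below the minimum speed observed to cause large-scale lodging

lodging_level_no_rain(20.0)  # 3
```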

  7. A probabilistic model to predict clinical phenotypic traits from genome sequencing.

    Science.gov (United States)

    Chen, Yun-Ching; Douville, Christopher; Wang, Cheng; Niknafs, Noushin; Yeo, Grace; Beleva-Guthrie, Violeta; Carter, Hannah; Stenson, Peter D; Cooper, David N; Li, Biao; Mooney, Sean; Karchin, Rachel

    2014-09-01

    Genetic screening is becoming possible on an unprecedented scale. However, its utility remains controversial. Although most variant genotypes cannot be easily interpreted, many individuals nevertheless attempt to interpret their genetic information. Initiatives such as the Personal Genome Project (PGP) and Illumina's Understand Your Genome are sequencing thousands of adults, collecting phenotypic information and developing computational pipelines to identify the most important variant genotypes harbored by each individual. These pipelines consider database and allele frequency annotations and bioinformatics classifications. We propose that the next step will be to integrate these different sources of information to estimate the probability that a given individual has specific phenotypes of clinical interest. To this end, we have designed a Bayesian probabilistic model to predict the probability of dichotomous phenotypes. When applied to a cohort from PGP, predictions of Gilbert syndrome, Graves' disease, non-Hodgkin lymphoma, and various blood groups were accurate, as individuals manifesting the phenotype in question exhibited the highest, or among the highest, predicted probabilities. Thirty-eight PGP phenotypes (26%) were predicted with area-under-the-ROC curve (AUC)>0.7, and 23 (15.8%) of these were statistically significant, based on permutation tests. Moreover, in a Critical Assessment of Genome Interpretation (CAGI) blinded prediction experiment, the models were used to match 77 PGP genomes to phenotypic profiles, generating the most accurate prediction of 16 submissions, according to an independent assessor. Although the models are currently insufficiently accurate for diagnostic utility, we expect their performance to improve with growth of publicly available genomics data and model refinement by domain experts.
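A heavily simplified sketch of the core idea, predicting a dichotomous phenotype probabilistically from variant calls, is a Bernoulli naive-Bayes model. The synthetic data and the independence assumption are illustrative only; the paper's actual model additionally integrates database, allele-frequency and bioinformatics annotations:

```python
import numpy as np

# Hypothetical data: 200 genomes x 30 binary variant calls; the phenotype is
# driven by variants 0 and 1 (an invented rule, for illustration).
rng = np.random.default_rng(2)
X = rng.integers(0, 2, size=(200, 30))
y = (X[:, 0] | X[:, 1]).astype(int)

def fit(X, y, eps=1.0):
    # Smoothed per-variant frequencies in cases (y=1) and non-cases (y=0)
    p1 = (X[y == 1].sum(0) + eps) / (len(X[y == 1]) + 2 * eps)
    p0 = (X[y == 0].sum(0) + eps) / (len(X[y == 0]) + 2 * eps)
    return p0, p1, y.mean()

def predict_proba(x, p0, p1, prior):
    # Posterior log-odds of the phenotype under the naive independence model
    log_odds = np.log(prior / (1 - prior))
    log_odds += np.sum(x * np.log(p1 / p0) + (1 - x) * np.log((1 - p1) / (1 - p0)))
    log_odds = np.clip(log_odds, -500, 500)     # avoid overflow in exp
    return 1 / (1 + np.exp(-log_odds))

p0, p1, prior = fit(X, y)
probs = np.array([predict_proba(x, p0, p1, prior) for x in X])
```

Ranking individuals by these posterior probabilities mirrors the paper's evaluation, where phenotype-positive individuals received the highest predicted probabilities.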

  8. The immature electrophysiological phenotype of iPSC-CMs still hampers in vitro drug screening: Special focus on IK1.

    Science.gov (United States)

    Goversen, Birgit; van der Heyden, Marcel A G; van Veen, Toon A B; de Boer, Teun P

    2018-03-01

    Preclinical drug screens are not based on human physiology, possibly complicating predictions of cardiotoxicity. Drug screening can be humanised with in vitro assays using human induced pluripotent stem cell-derived cardiomyocytes (iPSC-CMs). However, in contrast to adult ventricular cardiomyocytes, iPSC-CMs beat spontaneously due to the presence of the pacemaking current If and reduced densities of the hyperpolarising current IK1. In adult cardiomyocytes, IK1 finalises repolarisation by stabilising the resting membrane potential while also maintaining excitability. The reduced IK1 density contributes to proarrhythmic traits in iPSC-CMs, which leads to an electrophysiological phenotype that might bias drug responses. The proarrhythmic traits can be suppressed by increasing IK1 in a balanced manner. We systematically evaluated all studies that report strategies to mature iPSC-CMs and found that only a few report IK1 current densities. Furthermore, these studies did not succeed in establishing sufficient IK1 levels, as they added either too little or too much IK1. We conclude that reduced densities of IK1 remain a major flaw in iPSC-CMs, which hampers their use for in vitro drug screening. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  9. Thermal power generation projects "Large Scale Solar Heating" (EU Thermie projects "Large Scale Solar Heating")

    Energy Technology Data Exchange (ETDEWEB)

    Kuebler, R.; Fisch, M.N. [Steinbeis-Transferzentrum Energie-, Gebaeude- und Solartechnik, Stuttgart (Germany)

    1998-12-31

    The aim of this project is the preparation of the "Large Scale Solar Heating" programme for Europe-wide development of this technology. The demonstration programme developed from it was judged favourably by the reviewers but was not immediately (1996) accepted for funding. In November 1997 the EU Commission provided 1.5 million ECU, which allowed an updated project proposal to be realised. By mid-1997 a smaller project had already been approved, led by Chalmers Industriteknik (CIT) in Sweden and carried out mainly for technology transfer. (orig.)

  10. Cyclebase 3.0: a multi-organism database on cell-cycle regulation and phenotypes.

    Science.gov (United States)

    Santos, Alberto; Wernersson, Rasmus; Jensen, Lars Juhl

    2015-01-01

    The eukaryotic cell division cycle is a highly regulated process that consists of a complex series of events and involves thousands of proteins. Researchers have studied the regulation of the cell cycle in several organisms, employing a wide range of high-throughput technologies, such as microarray-based mRNA expression profiling and quantitative proteomics. Due to its complexity, the cell cycle can also fail or otherwise change in many different ways if important genes are knocked out, which has been studied in several microscopy-based knockdown screens. The data from these many large-scale efforts are not easily accessed, analyzed and combined due to their inherent heterogeneity. To address this, we have created Cyclebase, available at http://www.cyclebase.org, an online database that allows users to easily visualize and download results from genome-wide cell-cycle-related experiments. In Cyclebase version 3.0, we have updated the content of the database to reflect changes to genome annotation, added new mRNA and protein expression data, and integrated cell-cycle phenotype information from high-content screens and model-organism databases. The new version of Cyclebase also features a new web interface, designed around an overview figure that summarizes all the cell-cycle-related data for a gene. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  11. Large-scale retrieval for medical image analytics: A comprehensive review.

    Science.gov (United States)

    Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting

    2018-01-01

    Over the past decades, medical image analytics has been greatly facilitated by the explosion of digital imaging techniques, with huge amounts of medical images produced at ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they cannot cope with such huge amounts of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics at large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to the major processes in the pipeline, including feature representation, feature indexing and searching. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, covering a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
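The retrieval step of such a pipeline reduces, at its core, to nearest-neighbour search over feature vectors. Below is a brute-force cosine-similarity sketch on hypothetical descriptors; real large-scale systems replace the linear scan with approximate indexing such as hashing or inverted files:

```python
import numpy as np

# Hypothetical database of 1000 images, each represented by a 128-d descriptor,
# L2-normalised so that a dot product equals cosine similarity.
rng = np.random.default_rng(3)
db_features = rng.standard_normal((1000, 128))
db_features /= np.linalg.norm(db_features, axis=1, keepdims=True)

def retrieve(query, k=5):
    # Return indices of the k database images most similar to the query
    q = query / np.linalg.norm(query)
    sims = db_features @ q
    return np.argsort(-sims)[:k]

top = retrieve(db_features[42])   # querying with a database image
```

Querying with an image already in the database should return that image first, a cheap sanity check used when validating retrieval indexes.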

  12. Genomic screening for dissection of a complex disease: The multiple sclerosis phenotype

    Energy Technology Data Exchange (ETDEWEB)

    Haines, J.L.; Bazyk, A.; Gusella, J.F. [Massachusetts General Hospital, Boston, MA (United States)] [and others]

    1994-09-01

    Application of positional cloning to diseases with a complex etiology is fraught with problems. These include undefined modes of inheritance, heterogeneity, and epistasis. Although microsatellite markers now make genotyping the genome a straightforward task, no single analytical method is available to efficiently and accurately use these data for a complex disease. We have developed a multi-stage genomic screening strategy which uses a combination of non-parametric approaches (Affected Pedigree Member (APM) linkage analysis and robust sib pair analysis (SP)), and the parametric lod score approach (using four different genetic models). To warrant follow-up, a marker must have two or more of: a nominal P value of 0.05 or less on the non-parametric tests, or a lod score greater than 1.0 for any model. Two adjacent markers each fulfilling one criterion are also considered for follow-up. These criteria were determined both by simulation studies and our empirical experience in screening a large number of other disorders. We applied this approach to multiple sclerosis (MS), a complex neurological disorder with a strong but ill-defined genetic component. Analysis of the first 91 markers from our screen of 55 multiplex families found 5 markers which met the SP criteria, 13 markers which met the APM criteria, and 8 markers which met the lod score criteria. Five regions (on chromosomes 2, 4, 7, 14, and 19) met our overall criteria. However, no single method identified all of these regions, suggesting that each method is sensitive to various (unknown) influences. The chromosome 14 results were not supported by follow-up typing and analysis of markers in that region, but the chromosome 19 results remain well supported. Updated screening results will be presented.
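The stated follow-up rule can be expressed as a small predicate. How the lod-score criterion is counted against the two non-parametric criteria is an assumption in this sketch, as the abstract leaves the exact tally ambiguous:

```python
# Hypothetical encoding of the multi-stage screening rule: a marker warrants
# follow-up if it meets two or more criteria among nominal P <= 0.05 on the
# APM test, P <= 0.05 on the sib pair test, or lod > 1.0 under any of the
# four parametric models.
def warrants_followup(p_apm, p_sp, lod_scores):
    n_met = sum([
        p_apm <= 0.05,
        p_sp <= 0.05,
        any(lod > 1.0 for lod in lod_scores),
    ])
    return n_met >= 2

warrants_followup(0.03, 0.20, [0.4, 1.2, 0.1, 0.8])  # True: APM P and one lod
```

The abstract also allows two adjacent markers each meeting one criterion to trigger follow-up; that neighbourhood rule is omitted here.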

  13. [Psychometric attributes of the Spanish version of A-TAC screening scale for autism spectrum disorders].

    Science.gov (United States)

    Cubo, E; Sáez Velasco, S; Delgado Benito, V; Ausín Villaverde, V; García Soto, X R; Trejo Gabriel Y Galán, J M; Martín Santidrián, A; Macarrón, J V; Cordero Guevara, J; Benito-León, J; Louis, E D

    2011-07-01

    As there are no biological markers for autism spectrum disorders (ASD), screening must focus on behaviour and on the presence of markedly abnormal development or a deficiency in verbal and non-verbal social interaction and communication. To evaluate the psychometric attributes of a Spanish version of the autism domain of the Autism-Tics, AD/HD and other Comorbidities Inventory (A-TAC) scale for ASD screening. A total of 140 subjects (43% male, 57% female) aged 6-16, with ASD (n=15), mental retardation (n=40), psychiatric illness (n=22), tics (n=12) and controls (n=51), were included for ASD screening. The predictive validity, acceptability, scale assumptions, internal consistency, and precision were analysed. The internal consistency was high (α=0.93), and the standard error was adequate (1.13 [95% CI, -1.08 to 3.34]). The mean scores of the autism module were higher in patients diagnosed with ASD and mental disability compared with the rest of the patients (P<.001). The area under the curve was 0.96 for the ASD group. The autism domain of the A-TAC scale appears to be a reliable, valid and precise tool for ASD screening in the Spanish school population. Copyright © 2010 Asociación Española de Pediatría. Published by Elsevier España. All rights reserved.
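The internal-consistency figure (α=0.93) is Cronbach's alpha, which can be computed from the item-score matrix as follows. The item scores below are simulated purely for illustration, not derived from the A-TAC data:

```python
import numpy as np

# Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances)/variance(total))
def cronbach_alpha(items):
    # items: subjects x items matrix of scores
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Simulated data: 140 subjects, 17 items all loading on one latent trait,
# so the items are strongly intercorrelated and alpha comes out high.
rng = np.random.default_rng(4)
latent = rng.standard_normal((140, 1))
scores = latent + 0.5 * rng.standard_normal((140, 17))
alpha = cronbach_alpha(scores)
```

High alpha indicates the items measure a common construct, which is what the reported 0.93 supports for the autism domain.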

  14. Clustering patterns of LOD scores for asthma-related phenotypes revealed by a genome-wide screen in 295 French EGEA families.

    Science.gov (United States)

    Bouzigon, Emmanuelle; Dizier, Marie-Hélène; Krähenbühl, Christine; Lemainque, Arnaud; Annesi-Maesano, Isabella; Betard, Christine; Bousquet, Jean; Charpin, Denis; Gormand, Frédéric; Guilloud-Bataille, Michel; Just, Jocelyne; Le Moual, Nicole; Maccario, Jean; Matran, Régis; Neukirch, Françoise; Oryszczyn, Marie-Pierre; Paty, Evelyne; Pin, Isabelle; Rosenberg-Bourgin, Myriam; Vervloet, Daniel; Kauffmann, Francine; Lathrop, Mark; Demenais, Florence

    2004-12-15

    A genome-wide scan for asthma phenotypes was conducted in the whole sample of 295 EGEA families selected through at least one asthmatic subject. In addition to asthma, seven phenotypes involved in the main asthma physiopathological pathways were considered: SPT (positive skin prick test response to at least one of 11 allergens), the SPTQ score (number of positive skin test responses to 11 allergens), Phadiatop (positive specific IgE response to a mixture of allergens), total IgE levels, eosinophils, bronchial responsiveness (BR) to methacholine challenge and % predicted FEV1. Four regions showed evidence for linkage; compared with other genome screens, 6q14 appears to be a new region potentially linked to %FEV1. To determine which of these various asthma phenotypes are more likely to share common genetic determinants, a principal component analysis was applied to the genome-wide LOD scores. This analysis revealed clustering of LODs for asthma, SPT and Phadiatop on one axis and clustering of LODs for %FEV1, BR and SPTQ on the other, while LODs for IgE and eosinophils appeared to be independent of all other LODs. These results provide new insights into the potential sharing of genetic determinants by asthma-related phenotypes.

  15. Photorealistic large-scale urban city model reconstruction.

    Science.gov (United States)

    Poullis, Charalambos; You, Suya

    2009-01-01

    The rapid and efficient creation of virtual environments has become a crucial part of virtual reality applications. In particular, civil and defense applications often require and employ detailed models of operations areas for training, simulations of different scenarios, planning for natural or man-made events, monitoring, surveillance, games, and films. A realistic representation of the large-scale environments is therefore imperative for the success of such applications since it increases the immersive experience of its users and helps reduce the difference between physical and virtual reality. However, the task of creating such large-scale virtual environments still remains a time-consuming and manual work. In this work, we propose a novel method for the rapid reconstruction of photorealistic large-scale virtual environments. First, a novel, extendible, parameterized geometric primitive is presented for the automatic building identification and reconstruction of building structures. In addition, buildings with complex roofs containing complex linear and nonlinear surfaces are reconstructed interactively using a linear polygonal and a nonlinear primitive, respectively. Second, we present a rendering pipeline for the composition of photorealistic textures, which unlike existing techniques, can recover missing or occluded texture information by integrating multiple information captured from different optical sensors (ground, aerial, and satellite).

  16. Prototype Vector Machine for Large Scale Semi-Supervised Learning

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Kai; Kwok, James T.; Parvin, Bahram

    2009-04-29

    Practical data mining rarely falls exactly into the supervised learning scenario. Rather, the growing amount of unlabeled data poses a big challenge to large-scale semi-supervised learning (SSL). We note that the computational intensiveness of graph-based SSL arises largely from the manifold or graph regularization, which in turn leads to large models that are difficult to handle. To alleviate this, we propose the prototype vector machine (PVM), a highly scalable, graph-based algorithm for large-scale SSL. Our key innovation is the use of "prototype vectors" for efficient approximation of both the graph-based regularizer and the model representation. The choice of prototypes is grounded upon two important criteria: they not only perform effective low-rank approximation of the kernel matrix, but also span a model suffering the minimum information loss compared with the complete model. We demonstrate encouraging performance and appealing scaling properties of the PVM on a number of machine learning benchmark data sets.
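The prototype idea, approximating the full kernel matrix from a small set of landmark points, can be sketched with a Nystroem-style low-rank approximation. The data, kernel parameters and prototype count below are illustrative assumptions, not the PVM's actual prototype-selection criteria:

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.standard_normal((500, 10))       # hypothetical unlabeled data

def rbf(A, B, gamma=0.02):
    # Gaussian kernel Gram matrix between rows of A and B
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

# Pick 50 prototypes at random (the PVM chooses them more carefully)
prototypes = X[rng.choice(len(X), size=50, replace=False)]
C = rbf(X, prototypes)                   # n x m cross-kernel
W = rbf(prototypes, prototypes)          # m x m prototype kernel
K_approx = C @ np.linalg.pinv(W) @ C.T   # rank-m approximation of the n x n K

K_full = rbf(X, X)
err = np.linalg.norm(K_full - K_approx) / np.linalg.norm(K_full)
```

Working with the n x m matrix C instead of the full n x n Gram matrix is what buys the scalability: downstream regularizers and predictors only ever touch the prototype block.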

  17. COLA with scale-dependent growth: applications to screened modified gravity models

    Energy Technology Data Exchange (ETDEWEB)

    Winther, Hans A.; Koyama, Kazuya; Wright, Bill S. [Institute of Cosmology and Gravitation, University of Portsmouth, Dennis Sciama Building, Burnaby Road, Portsmouth, PO1 3FX (United Kingdom); Manera, Marc [Centre for Theoretical Cosmology, Department of Applied Mathematics and Theoretical Physics, University of Cambridge, Wilberforce Road, Cambridge CB3 0WA (United Kingdom); Zhao, Gong-Bo, E-mail: hans.a.winther@gmail.com, E-mail: kazuya.koyama@port.ac.uk, E-mail: manera.work@gmail.com, E-mail: bill.wright@port.ac.uk, E-mail: gong-bo.Zhao@port.ac.uk [National Astronomy Observatories, Chinese Academy of Science, Beijing, 100012 (China)

    2017-08-01

    We present a general parallelized and easy-to-use code to perform numerical simulations of structure formation using the COLA (COmoving Lagrangian Acceleration) method for cosmological models that exhibit scale-dependent growth at the level of first and second order Lagrangian perturbation theory. For modified gravity theories we also include screening using a fast approximate method that covers all the main examples of screening mechanisms in the literature. We test the code by comparing it to full simulations of two popular modified gravity models, namely f(R) gravity and nDGP, and find good agreement in the modified gravity boost-factors relative to ΛCDM even when using a fairly small number of COLA time steps.

  18. What’s old is new again: yeast mutant screens in the era of pooled segregant analysis by genome sequencing

    Directory of Open Access Journals (Sweden)

    Chris Curtin

    2016-04-01

    While once de rigueur for identifying genes involved in biological processes, screening of chemically induced mutant populations is an approach that has largely been superseded for model organisms such as Saccharomyces cerevisiae. The availability of single-gene deletion/overexpression libraries and combinatorial synthetic genetic arrays gives yeast researchers more structured ways to probe genetic networks. Furthermore, in the age of inexpensive DNA sequencing, methodologies such as mapping of quantitative trait loci (QTL) by pooled segregant analysis and genome-wide association enable the identification of multiple naturally occurring allelic variants that contribute to polygenic phenotypes of interest. This is, however, contingent on the capacity to screen large numbers of individuals and the existence of sufficient natural phenotypic variation within the available population. The latter cannot be guaranteed, and non-selectable, industrially relevant phenotypes, such as production of volatile aroma compounds, pose severe limitations on the use of modern genetic techniques due to expensive and time-consuming downstream analyses. An interesting approach to overcoming these issues can be found in Den Abt et al. [1] (this issue of Microbial Cell), where a combination of repeated rounds of chemical mutagenesis and pooled segregant analysis by whole-genome sequencing was applied to identify genes involved in ethyl acetate formation, demonstrating a new path for industrial yeast strain development and bringing classical mutant screens into the 21st century.

  19. Integration of curated databases to identify genotype-phenotype associations

    Directory of Open Access Journals (Sweden)

    Li Jianrong

    2006-10-01

    Abstract Background The ability to rapidly characterize an unknown microorganism is critical in both responding to infectious disease and biodefense. To do this, we need some way of anticipating an organism's phenotype based on the molecules encoded by its genome. However, the link between molecular composition (i.e. genotype) and phenotype for microbes is not obvious. While there have been several studies that address this challenge, none have yet proposed a large-scale method integrating curated biological information. Here we utilize a systematic approach to discover genotype-phenotype associations that combines phenotypic information from a biomedical informatics database, GIDEON, with the molecular information contained in the National Center for Biotechnology Information's Clusters of Orthologous Groups database (NCBI COGs). Results Integrating the information in the two databases, we are able to correlate the presence or absence of a given protein in a microbe with its phenotype as measured by certain morphological characteristics or survival in a particular growth medium. With a 0.8 correlation score threshold, 66% of the associations found were confirmed by the literature, and at a 0.9 correlation threshold, 86% were positively verified. Conclusion Our results suggest possible phenotypic manifestations for proteins biochemically associated with sugar metabolism and electron transport. Moreover, we believe our approach can be extended to linking pathogenic phenotypes with functionally related proteins.
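The correlation-threshold step can be sketched on synthetic presence/absence data. The 0.8 and 0.9 thresholds come from the abstract; the data, the choice of Pearson correlation on binary vectors (the phi coefficient), and every name below are illustrative assumptions:

```python
import numpy as np

# Hypothetical data: 60 microbes x 100 orthologous groups (presence/absence),
# with a phenotype almost perfectly linked to COG 7.
rng = np.random.default_rng(6)
cogs = rng.integers(0, 2, size=(60, 100))
phenotype = cogs[:, 7].copy()
phenotype[:3] ^= 1                       # a few exceptions break the perfect link

def associated_cogs(cogs, phenotype, threshold=0.8):
    # Correlate each COG column with the phenotype; keep those clearing the bar
    r = np.array([np.corrcoef(cogs[:, j], phenotype)[0, 1]
                  for j in range(cogs.shape[1])])
    return np.flatnonzero(np.abs(r) >= threshold)

hits = associated_cogs(cogs, phenotype)  # COG 7 should clear the 0.8 threshold
```

Raising the threshold to 0.9 trades recall for precision, which mirrors the reported jump from 66% to 86% literature confirmation.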

  20. Patient phenotypes in fibromyalgia comorbid with systemic sclerosis or rheumatoid arthritis: influence of diagnostic and screening tests. Screening with the FiRST questionnaire, diagnosis with the ACR 1990 and revised ACR 2010 criteria.

    Science.gov (United States)

    Perrot, Serge; Peixoto, Mariana; Dieudé, Philippe; Hachulla, Eric; Avouac, Jerome; Ottaviani, Sebastien; Allanore, Yannick

    2017-01-01

    Fibromyalgia (FM) may occur with rheumatoid arthritis (RA) and systemic sclerosis (SSc), and debate remains about its diagnosis. We aimed to use three FM tools (the FiRST screening tool and the ACR 1990 and revised ACR 2010 diagnostic criteria) to compare FM prevalence between RA and SSc patients, to describe the phenotypes of patients with comorbid FM, and to analyse links between FM and secondary Sjögren's syndrome (SS). Consecutive adult patients with confirmed RA or SSc from four university hospitals were tested with the three FM tools. FiRST detected FM in 22.6% of the 172 RA patients, with confirmation in 22.1% (ACR 1990) and 19.1% (ACR 2010). ACR 1990 FM+ RA patients had more diffuse pain, whereas ACR 2010 FM+ RA patients had higher BMI and pain intensity, more diffuse pain, active disease, disability, and associated SS. FiRST detected FM in 27.8% of the 122 SSc patients, with confirmation in 30.3% (ACR 1990) and 23.7% (ACR 2010). ACR 1990 FM+ SSc patients had greater disability and pain intensity and more diffuse pain, whereas ACR 2010 FM+ SSc patients had higher BMI and pain intensity, more disability and diffuse pain, and associated SS. Correlations between the FM diagnostic and screening tool results were modest in both conditions. Secondary SS was associated with comorbid FM. The prevalence of FM is high in SSc and RA, whatever FM diagnostic tool is used. Secondary SS is associated with FM in both RA and SSc. The revised ACR 2010 FM criteria and the FiRST screening tool reveal specific phenotypes potentially useful for improving disease management.

  1. HIV coreceptor phenotyping in the clinical setting.

    Science.gov (United States)

    Low, Andrew J; Swenson, Luke C; Harrigan, P Richard

    2008-01-01

    The introduction of CCR5 antagonists increases the options available for constructing antiretroviral regimens. However, this option is coupled with the caveat that patients should be tested for HIV coreceptor tropism prior to initiating CCR5 antagonist-based therapy. Failure to screen for CXCR4 usage increases the risk of using an ineffective drug, thus reducing the likelihood of viral suppression and increasing the risk of developing antiretroviral resistance. This review discusses current and future methods of determining HIV tropism, with a focus on their utility in the clinical setting for screening purposes. These methods include recombinant phenotypic tests, such as the Monogram Trofile assay, as well as genotype-based predictors, heteroduplex tracking assays, and flow-cytometry-based methods. Currently, the best evidence supports the use of phenotypic methods, although other methods of screening for HIV coreceptor usage prior to the administration of CCR5 antagonists may reduce costs and shorten turnaround times compared with phenotypic methods. The presence of low levels of X4 virus is a challenge to all assay methods, resulting in reduced sensitivity in clinical, patient-derived samples when compared with clonally derived samples. Gaining a better understanding of the output of these assays and correlating it with clinical progression and therapy response will provide some indication of how both genotype-based and phenotypic assays for determining HIV coreceptor usage can be improved. In addition, leveraging new technologies capable of detecting low-level minority species may provide the most significant advances in ensuring that individuals with low levels of dual/mixed tropic virus are not inadvertently prescribed CCR5 antagonists.

  2. Screening older adults at risk of falling with the Tinetti balance scale.

    Science.gov (United States)

    Raîche, M; Hébert, R; Prince, F; Corriveau, H

    2000-09-16

    In a prospective study of 225 community dwelling people 75 years and older, we tested the validity of the Tinetti balance scale to predict individuals who will fall at least once during the following year. A score of 36 or less identified 7 of 10 fallers with 70% sensitivity and 52% specificity. With this cut-off score, 53% of the individuals were screened positive and presented a two-fold risk of falling. These characteristics support the use of this test to screen older people at risk of falling in order to include them in a preventive intervention.
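The reported screening arithmetic (70% sensitivity, 52% specificity at a cut-off of 36 or less) can be reproduced from follow-up counts. The counts below are hypothetical, chosen only to match the reported figures, not the study's actual contingency table:

```python
# Sensitivity = TP/(TP+FN), specificity = TN/(TN+FP) from per-participant
# (screened_positive, fell) outcomes.
def screen_performance(outcomes):
    tp = sum(pos and fell for pos, fell in outcomes)
    fn = sum(not pos and fell for pos, fell in outcomes)
    tn = sum(not pos and not fell for pos, fell in outcomes)
    fp = sum(pos and not fell for pos, fell in outcomes)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical cohort: 10 fallers of whom 7 screened positive, and 100
# non-fallers of whom 52 screened negative.
sens, spec = screen_performance(
    [(True, True)] * 7 + [(False, True)] * 3 +
    [(True, False)] * 48 + [(False, False)] * 52)
# sens = 0.7, spec = 0.52
```

Numbers like these make the trade-off explicit: a lenient cut-off catches most future fallers at the cost of flagging roughly half of the cohort for preventive intervention.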

  3. Disrupted coupling of large-scale networks is associated with relapse behaviour in heroin-dependent men

    Science.gov (United States)

    Li, Qiang; Liu, Jierong; Wang, Wei; Wang, Yarong; Li, Wei; Chen, Jiajie; Zhu, Jia; Yan, Xuejiao; Li, Yongbin; Li, Zhe; Ye, Jianjun; Wang, Wei

    2018-01-01

    Background: It is unknown whether impaired coupling among 3 core large-scale brain networks (salience [SN], default mode [DMN] and executive control [ECN] networks) is associated with relapse behaviour in treated heroin-dependent patients. Methods: We conducted a prospective resting-state functional MRI study comparing functional connectivity strength among healthy controls and heroin-dependent men who had either relapsed or were in early remission, classified on the basis of urine drug screens during a 3-month follow-up period. We also examined how the coupling of large-scale networks correlated with relapse behaviour among heroin-dependent men. Results: We included 20 controls and 50 heroin-dependent men (26 relapsed, 24 in early remission) in our analyses. The relapsed men showed greater connectivity than the early remission and control groups between the dorsal anterior cingulate cortex (a key node of the SN) and the dorsomedial prefrontal cortex (part of the DMN). The relapsed men and controls showed lower connectivity than the early remission group between the left dorsolateral prefrontal cortex (a key node of the left ECN) and the dorsomedial prefrontal cortex. The percentage of positive urine drug screens correlated positively with the coupling between the dorsal anterior cingulate cortex and dorsomedial prefrontal cortex, and negatively with the coupling between the left dorsolateral prefrontal cortex and dorsomedial prefrontal cortex. Limitations: We examined deficits in only 3 core networks leading to relapse behaviour; other networks may also contribute to relapse. Conclusion: Greater coupling between the SN and DMN and lower coupling between the left ECN and DMN are associated with relapse behaviour. These findings may shed light on the development of new treatments for heroin addiction. PMID: 29252165

  4. Accelerating Relevance Vector Machine for Large-Scale Data on Spark

    Directory of Open Access Journals (Sweden)

    Liu Fang

    2017-01-01

    Relevance vector machine (RVM) is a machine learning algorithm based on a sparse Bayesian framework, which performs well on classification and regression tasks over small-scale datasets. However, RVM has drawbacks that restrict its practical application: (1) a slow training process and (2) poor performance on large-scale datasets. To solve these problems, we first propose Discrete AdaBoost RVM (DAB-RVM), which incorporates ensemble learning into RVM. This method performs well on large-scale, low-dimensional datasets; however, as the number of features increases, the training time of DAB-RVM increases as well. To avoid this, we exploit the abundant training samples of large-scale datasets and propose all-features-boosting RVM (AFB-RVM), which modifies the way weak classifiers are obtained. In our experiments we study the differences between various boosting techniques combined with RVM and demonstrate the performance of the proposed approaches on Spark. The result is two Spark-based approaches, each suited to a different type of large-scale dataset.
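
    The Discrete AdaBoost wrapper described above re-weights training samples after each weak learner is fitted. As an illustrative sketch only (decision stumps stand in for the paper's RVM weak learners, and this runs on a single machine rather than Spark), the boosting loop looks like:

```python
import numpy as np

def stump_train(X, y, w):
    # Exhaustively pick (feature, threshold, polarity) minimizing weighted error.
    best = (None, None, 1, np.inf)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - t) > 0, 1, -1)
                err = w[pred != y].sum()
                if err < best[3]:
                    best = (j, t, pol, err)
    return best

def adaboost(X, y, rounds=10):
    """Discrete AdaBoost with stumps; y must be in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(rounds):
        j, t, pol, err = stump_train(X, y, w)
        err = max(err, 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)  # weak-classifier weight
        pred = np.where(pol * (X[:, j] - t) > 0, 1, -1)
        w *= np.exp(-alpha * y * pred)         # up-weight misclassified samples
        w /= w.sum()
        ensemble.append(((j, t, pol), alpha))
    return ensemble

def predict(ensemble, X):
    score = sum(a * np.where(p * (X[:, j] - t) > 0, 1, -1)
                for (j, t, p), a in ensemble)
    return np.where(score > 0, 1, -1)

X = np.array([[0.0], [1.0], [2.0], [3.0], [4.0], [5.0]])
y = np.array([-1, -1, -1, 1, 1, 1])
ens = adaboost(X, y, rounds=5)
print(predict(ens, X))
```

    DAB-RVM would fit an RVM where the stump is trained here, and AFB-RVM changes how these weak classifiers are obtained; the sample re-weighting logic is the shared core.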

  5. A combined disc method with resazurin agar plate assay for early phenotypic screening of KPC, MBL and OXA-48 carbapenemases among Enterobacteriaceae.

    Science.gov (United States)

    Teethaisong, Y; Eumkeb, G; Nakouti, I; Evans, K; Hobbs, G

    2016-08-01

    To validate a combined disc method, used alongside resazurin chromogenic agar, for early screening and differentiation of Klebsiella pneumoniae carbapenemase-, metallo-β-lactamase- and OXA-48 carbapenemase-producing Enterobacteriaceae. The combined disc test, comprising meropenem alone, meropenem with EDTA, with phenylboronic acid, or with both EDTA and phenylboronic acid, plus temocillin alone, was evaluated with the resazurin chromogenic agar plate assay against a total of 86 molecularly confirmed Enterobacteriaceae clinical isolates (11 metallo-β-lactamase, eight Kl. pneumoniae carbapenemase, 11 OXA-48, 32 AmpC and 15 extended-spectrum-β-lactamase producers, and nine co-producers of extended-spectrum β-lactamase and AmpC). The inhibition zone diameters were measured and interpreted at 7 h for the presence of carbapenemase. All carbapenemase producers were phenotypically distinguished by this assay with 100% sensitivity and specificity. This early phenotypic method is simple, inexpensive and reliable for the detection and differentiation of carbapenemase-producing Enterobacteriaceae, and could be adopted by any microbiological laboratory for diagnosis of these recalcitrant bacteria. The assay shows excellent performance in discriminating Kl. pneumoniae carbapenemase, metallo-β-lactamase and OXA-48 carbapenemases within 7 h, much faster than conventional disc diffusion methods. Such rapid detection could help clinicians screen patients, control infection and provide epidemiological surveillance. © 2016 The Society for Applied Microbiology.

  6. Bayesian hierarchical model for large-scale covariance matrix estimation.

    Science.gov (United States)

    Zhu, Dongxiao; Hero, Alfred O

    2007-12-01

    Many bioinformatics problems implicitly depend on estimating a large-scale covariance matrix. Traditional approaches tend to suffer from high variance and low accuracy due to "overfitting." We cast large-scale covariance matrix estimation into a Bayesian hierarchical model framework and introduce dependency between the covariance parameters. We demonstrate the advantages of our approach over traditional approaches using simulations and OMICS data analysis.
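
    The abstract does not give the model's details; as a minimal illustration of why regularizing a large covariance estimate matters, the sketch below applies linear shrinkage toward a scaled identity, a Ledoit-Wolf-style fixed-weight stand-in for the Bayesian hierarchical shrinkage, with an assumed shrinkage weight.

```python
import numpy as np

def shrunk_covariance(X, alpha=0.2):
    """Shrink the sample covariance toward a scaled-identity target.

    alpha is fixed here for illustration; in practice it would be
    chosen from the data (e.g. Ledoit-Wolf) or, as in the record
    above, implied by a Bayesian hierarchical prior.
    """
    S = np.cov(X, rowvar=False)                 # p x p sample covariance
    p = S.shape[0]
    target = (np.trace(S) / p) * np.eye(p)      # average variance on the diagonal
    return (1 - alpha) * S + alpha * target

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 50))       # n << p: the sample covariance is singular
S_shrunk = shrunk_covariance(X)
# Shrinkage restores positive-definiteness (all eigenvalues > 0):
print(np.linalg.eigvalsh(S_shrunk).min() > 0)
```

    With 20 samples of 50 variables the raw sample covariance has rank at most 19; the convex combination with a positive-definite target makes the estimate invertible and lowers its variance, which is the same failure mode the hierarchical model addresses.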

  7. Concurrent Driving Method with Fast Scan Rate for Large Mutual Capacitance Touch Screens

    Directory of Open Access Journals (Sweden)

    Mohamed Gamal Ahmed Mohamed

    2015-01-01

    A novel touch screen control technique is introduced, which scans each frame in two steps of concurrent multichannel driving and differential sensing. The proposed technique substantially increases the scan rate and effectively suppresses ambient noise. It also extends to a multichip architecture to support very large touch screens with a further scan-rate improvement. The method has been implemented in a 0.18 μm TowerJazz CMOS process and tested with an FPGA and AFE board connected to a 23-inch touch screen. Experimental results show a scan-rate improvement of up to 23.8 times and an SNR improvement of 24.6 dB over the conventional method.
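
    The abstract does not specify the drive patterns; as a generic illustration of why concurrent driving raises the scan rate, the sketch below drives all TX lines simultaneously with orthogonal (Hadamard) patterns and recovers every mutual capacitance from one frame of superposed measurements. This code-division scheme, the panel size and the capacitance values are all illustrative assumptions, not the circuit in the paper.

```python
import numpy as np

def hadamard(n):
    # Sylvester construction; n must be a power of two.
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

# Hypothetical 8-TX panel: c[i] is the mutual capacitance (arbitrary
# units) between TX line i and one RX line.
rng = np.random.default_rng(1)
c = 1.0 + 0.1 * rng.random(8)
c[3] -= 0.2            # a touch reduces mutual capacitance at TX line 3

D = hadamard(8)        # each row = one concurrent drive pattern
measured = D @ c       # superposed RX charge, one value per drive step
recovered = np.linalg.solve(D, measured)

print(np.allclose(recovered, c))   # exact recovery in the noiseless case
print(int(np.argmin(recovered)))   # the touched TX line
```

    Sequential scanning needs one settling period per TX line per measurement; driving all lines at once with mutually orthogonal patterns collects energy from every line in every step, which is the basic reason concurrent schemes gain both scan rate and SNR.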

  8. Development of Type 2 Diabetes Mellitus Phenotyping Framework Using Expert Knowledge and Machine Learning Approach.

    Science.gov (United States)

    Kagawa, Rina; Kawazoe, Yoshimasa; Ida, Yusuke; Shinohara, Emiko; Tanaka, Katsuya; Imai, Takeshi; Ohe, Kazuhiko

    2017-07-01

    Phenotyping is an automated technique for distinguishing patients on the basis of electronic health records. The demand for type 2 diabetes mellitus (T2DM) phenotyping has been increasing, both to improve the quality of medical care and to advance T2DM research. Some existing phenotyping algorithms are not sufficiently accurate for screening or for identifying clinical research subjects. We propose a practical phenotyping framework that uses both expert knowledge and a machine learning approach to develop two phenotyping algorithms: one for screening, the other for identifying research subjects. We employ expert knowledge as rules to exclude obvious control patients, and machine learning to increase accuracy for complicated patients. We developed phenotyping algorithms on the basis of our framework and performed binary classification to determine whether a patient has T2DM. To facilitate development of practical phenotyping algorithms, this study introduces new evaluation metrics: the area under the precision-sensitivity curve (AUPS) at high sensitivity, and the AUPS at high positive predictive value. The phenotyping algorithms based on our framework show higher performance than baseline algorithms, and the framework yields either type of algorithm depending on the tuning approach. The framework is easily implemented on the basis of evaluation metrics chosen in accordance with users' objectives, and the resulting algorithms are useful for extracting T2DM patients in retrospective studies.
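
    The AUPS metric is the area under the precision-sensitivity (precision-recall) curve. A minimal sketch of the unrestricted version, computed by sweeping the decision threshold over the sorted scores (the paper evaluates it at high sensitivity or high PPV; that restriction is omitted here):

```python
def area_under_precision_sensitivity(y_true, scores):
    """Area under the precision-sensitivity (precision-recall) curve,
    using step-wise integration over descending score thresholds."""
    pairs = sorted(zip(scores, y_true), reverse=True)
    n_pos = sum(y_true)
    tp = fp = 0
    prev_recall, area = 0.0, 0.0
    for score, label in pairs:
        tp += label
        fp += 1 - label
        precision = tp / (tp + fp)
        recall = tp / n_pos                          # sensitivity
        area += precision * (recall - prev_recall)   # step-wise integration
        prev_recall = recall
    return area

# Toy labels and classifier scores (hypothetical):
y = [1, 1, 0, 1, 0, 0, 0, 1]
s = [0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2]
print(round(area_under_precision_sensitivity(y, s), 4))  # → 0.8125
```

    A perfect ranking (all positives scored above all negatives) gives an area of 1.0; restricting the integral to the high-sensitivity or high-PPV region, as the paper's two metrics do, amounts to limiting the recall (or precision) range swept here.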

  9. Dissecting the Contributions of Cooperating Gene Mutations to Cancer Phenotypes and Drug Responses with Patient-Derived iPSCs

    Directory of Open Access Journals (Sweden)

    Chan-Jung Chang

    2018-05-01

    Summary: Connecting specific cancer genotypes with phenotypes and drug responses constitutes the central premise of precision oncology but is hindered by the genetic complexity and heterogeneity of primary cancer cells. Here, we use patient-derived induced pluripotent stem cells (iPSCs) and CRISPR/Cas9 genome editing to dissect the individual contributions of two recurrent genetic lesions, the splicing factor SRSF2 P95L mutation and the chromosome 7q deletion, to the development of myeloid malignancy. Using a comprehensive panel of isogenic iPSCs (with none, one, or both genetic lesions), we characterize their relative phenotypic contributions and identify drug sensitivities specific to each one through a candidate drug approach and an unbiased large-scale small-molecule screen. To facilitate drug testing and discovery, we also derive SRSF2-mutant and isogenic normal expandable hematopoietic progenitor cells. We thus describe an approach to dissect the individual effects of two cooperating mutations on clinically relevant features of malignant disease. Papapetrou and colleagues develop a comprehensive panel of isogenic iPSC lines with the SRSF2 P95L mutation and chr7q deletion. They use these cells to identify cellular phenotypes contributed by each genetic lesion and therapeutic vulnerabilities specific to each one, and they develop expandable hematopoietic progenitor cell lines to facilitate drug discovery. Keywords: induced pluripotent stem cells, myelodysplastic syndrome, CRISPR/Cas9, gene editing, mutational cooperation, splicing factor mutations, spliceosomal mutations, SRSF2, chr7q deletion

  10. Creating Large Scale Database Servers

    International Nuclear Information System (INIS)

    Becla, Jacek

    2001-01-01

    The BaBar experiment at the Stanford Linear Accelerator Center (SLAC) is designed to perform a high precision investigation of the decays of the B-meson produced from electron-positron interactions. The experiment, started in May 1999, will generate approximately 300TB/year of data for 10 years. All of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). To date, over 70TB of data have been placed in Objectivity/DB, making it one of the largest databases in the world. Providing access to such a large quantity of data through a database server is a daunting task. A full-scale testbed environment had to be developed to tune various software parameters and a fundamental change had to occur in the AMS architecture to allow it to scale past several hundred terabytes of data. Additionally, several protocol extensions had to be implemented to provide practical access to large quantities of data. This paper will describe the design of the database and the changes that we needed to make in the AMS for scalability reasons and how the lessons we learned would be applicable to virtually any kind of database server seeking to operate in the Petabyte region

  12. Large-scale pool fires

    Directory of Open Access Journals (Sweden)

    Steinhaus Thomas

    2007-01-01

    A review of research into the burning behavior of large pool fires and fuel spill fires is presented. The features that distinguish such fires from smaller pool fires are mainly associated with the fire dynamics at low source Froude numbers and the radiative interaction with the fire source. In hydrocarbon fires, higher soot levels at increased diameters result in radiation blockage effects around the perimeter of large fire plumes; this yields lower emissive powers and a drastic reduction in the radiative loss fraction. While there are simplifying factors in these phenomena, arising from the fact that soot yield can saturate, other complications derive from the intermittency of the behavior, with luminous regions of efficient combustion appearing randomly on the outer surface of the fire according to the turbulent fluctuations in the fire plume. Knowledge of the fluid flow instabilities that lead to the formation of large eddies is also key to understanding the behavior of large-scale fires. Here, modeling tools can be effectively exploited to investigate the fluid flow phenomena, including RANS- and LES-based computational fluid dynamics codes. The latter are well suited to representing the turbulent motions, but a number of challenges remain in their practical application. Massively parallel computational resources are likely to be necessary to address the complex coupled phenomena at the level of detail that is necessary.

  13. Identifying gene-environment interactions in schizophrenia: contemporary challenges for integrated, large-scale investigations.

    Science.gov (United States)

    van Os, Jim; Rutten, Bart P; Myin-Germeys, Inez; Delespaul, Philippe; Viechtbauer, Wolfgang; van Zelst, Catherine; Bruggeman, Richard; Reininghaus, Ulrich; Morgan, Craig; Murray, Robin M; Di Forti, Marta; McGuire, Philip; Valmaggia, Lucia R; Kempton, Matthew J; Gayer-Anderson, Charlotte; Hubbard, Kathryn; Beards, Stephanie; Stilo, Simona A; Onyejiaka, Adanna; Bourque, Francois; Modinos, Gemma; Tognin, Stefania; Calem, Maria; O'Donovan, Michael C; Owen, Michael J; Holmans, Peter; Williams, Nigel; Craddock, Nicholas; Richards, Alexander; Humphreys, Isla; Meyer-Lindenberg, Andreas; Leweke, F Markus; Tost, Heike; Akdeniz, Ceren; Rohleder, Cathrin; Bumb, J Malte; Schwarz, Emanuel; Alptekin, Köksal; Üçok, Alp; Saka, Meram Can; Atbaşoğlu, E Cem; Gülöksüz, Sinan; Gumus-Akay, Guvem; Cihan, Burçin; Karadağ, Hasan; Soygür, Haldan; Cankurtaran, Eylem Şahin; Ulusoy, Semra; Akdede, Berna; Binbay, Tolga; Ayer, Ahmet; Noyan, Handan; Karadayı, Gülşah; Akturan, Elçin; Ulaş, Halis; Arango, Celso; Parellada, Mara; Bernardo, Miguel; Sanjuán, Julio; Bobes, Julio; Arrojo, Manuel; Santos, Jose Luis; Cuadrado, Pedro; Rodríguez Solano, José Juan; Carracedo, Angel; García Bernardo, Enrique; Roldán, Laura; López, Gonzalo; Cabrera, Bibiana; Cruz, Sabrina; Díaz Mesa, Eva Ma; Pouso, María; Jiménez, Estela; Sánchez, Teresa; Rapado, Marta; González, Emiliano; Martínez, Covadonga; Sánchez, Emilio; Olmeda, Ma Soledad; de Haan, Lieuwe; Velthorst, Eva; van der Gaag, Mark; Selten, Jean-Paul; van Dam, Daniella; van der Ven, Elsje; van der Meer, Floor; Messchaert, Elles; Kraan, Tamar; Burger, Nadine; Leboyer, Marion; Szoke, Andrei; Schürhoff, Franck; Llorca, Pierre-Michel; Jamain, Stéphane; Tortelli, Andrea; Frijda, Flora; Vilain, Jeanne; Galliot, Anne-Marie; Baudin, Grégoire; Ferchiou, Aziz; Richard, Jean-Romain; Bulzacka, Ewa; Charpeaud, Thomas; Tronche, Anne-Marie; De Hert, Marc; van Winkel, Ruud; Decoster, Jeroen; Derom, Catherine; Thiery, Evert; Stefanis, Nikos C; Sachs, Gabriele; Aschauer, 
Harald; Lasser, Iris; Winklbaur, Bernadette; Schlögelhofer, Monika; Riecher-Rössler, Anita; Borgwardt, Stefan; Walter, Anna; Harrisberger, Fabienne; Smieskova, Renata; Rapp, Charlotte; Ittig, Sarah; Soguel-dit-Piquard, Fabienne; Studerus, Erich; Klosterkötter, Joachim; Ruhrmann, Stephan; Paruch, Julia; Julkowski, Dominika; Hilboll, Desiree; Sham, Pak C; Cherny, Stacey S; Chen, Eric Y H; Campbell, Desmond D; Li, Miaoxin; Romeo-Casabona, Carlos María; Emaldi Cirión, Aitziber; Urruela Mora, Asier; Jones, Peter; Kirkbride, James; Cannon, Mary; Rujescu, Dan; Tarricone, Ilaria; Berardi, Domenico; Bonora, Elena; Seri, Marco; Marcacci, Thomas; Chiri, Luigi; Chierzi, Federico; Storbini, Viviana; Braca, Mauro; Minenna, Maria Gabriella; Donegani, Ivonne; Fioritti, Angelo; La Barbera, Daniele; La Cascia, Caterina Erika; Mulè, Alice; Sideli, Lucia; Sartorio, Rachele; Ferraro, Laura; Tripoli, Giada; Seminerio, Fabio; Marinaro, Anna Maria; McGorry, Patrick; Nelson, Barnaby; Amminger, G Paul; Pantelis, Christos; Menezes, Paulo R; Del-Ben, Cristina M; Gallo Tenan, Silvia H; Shuhama, Rosana; Ruggeri, Mirella; Tosato, Sarah; Lasalvia, Antonio; Bonetto, Chiara; Ira, Elisa; Nordentoft, Merete; Krebs, Marie-Odile; Barrantes-Vidal, Neus; Cristóbal, Paula; Kwapil, Thomas R; Brietzke, Elisa; Bressan, Rodrigo A; Gadelha, Ary; Maric, Nadja P; Andric, Sanja; Mihaljevic, Marina; Mirjanic, Tijana

    2014-07-01

    Recent years have seen considerable progress in epidemiological and molecular genetic research into environmental and genetic factors in schizophrenia, but methodological uncertainties remain with regard to validating environmental exposures, and the population risk conferred by individual molecular genetic variants is small. There are now also a limited number of studies that have investigated molecular genetic candidate gene-environment interactions (G × E); however, thorough replication of findings is so far rare, and G × E research still faces several conceptual and methodological challenges. In this article, we aim to review these recent developments and illustrate how integrated, large-scale investigations may overcome contemporary challenges in G × E research, drawing on the example of a large, international, multi-center study into the identification and translational application of G × E in schizophrenia. While such investigations are now well underway, new challenges emerge for G × E research from late-breaking evidence that genetic variation and environmental exposures are, to a significant degree, shared across a range of psychiatric disorders, with potential overlap in phenotype. © The Author 2014. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  14. Decentralised stabilising controllers for a class of large-scale linear ...

    Indian Academy of Sciences (India)

    subsystems resulting from a new aggregation-decomposition technique. The method has been illustrated through a numerical example of a large-scale linear system consisting of three subsystems each of the fourth order. Keywords. Decentralised stabilisation; large-scale linear systems; optimal feedback control; algebraic ...

  15. Large Scale Survey Data in Career Development Research

    Science.gov (United States)

    Diemer, Matthew A.

    2008-01-01

    Large scale survey datasets have been underutilized but offer numerous advantages for career development scholars, as they contain numerous career development constructs with large and diverse samples that are followed longitudinally. Constructs such as work salience, vocational expectations, educational expectations, work satisfaction, and…

  16. Maximally efficient two-stage screening: Determining intellectual disability in Taiwanese military conscripts

    Directory of Open Access Journals (Sweden)

    Chia-Chang Chien

    2009-01-01

    Chia-Chang Chien1, Shu-Fen Huang1,2,3,4, For-Wey Lung1,2,3,4. Affiliations: 1Department of Psychiatry, Kaohsiung Armed Forces General Hospital, Kaohsiung, Taiwan; 2Graduate Institute of Behavioral Sciences, Kaohsiung Medical University, Kaohsiung, Taiwan; 3Department of Psychiatry, National Defense Medical Center, Taipei, Taiwan; 4Calo Psychiatric Center, Pingtung County, Taiwan. Objective: The purpose of this study was to apply a two-stage screening method to the large-scale intelligence screening of military conscripts. Methods: We recruited 99 conscripted soldiers with educational levels of senior high school or lower. Every participant took the Wisconsin Card Sorting Test (WCST) and the Wechsler Adult Intelligence Scale-Revised (WAIS-R). Results: Logistic regression analysis showed that the conceptual level responses (CLR) index of the WCST was the most significant index for determining intellectual disability (ID; FIQ ≤ 84). We used the receiver operating characteristic curve to determine the optimum cut-off point of the CLR. The optimum single cut-off point was 66; the two cut-off points were 49 and 66. Compared with two-stage positive screening, two-stage window screening increased the area under the curve and the positive predictive value, while its cost decreased by 59%. Conclusion: Two-stage window screening is more accurate and economical than two-stage positive screening. Our results provide an example of the use of two-stage screening and suggest that the WCST could replace the WAIS-R in future large-scale screening for ID. Keywords: intellectual disability, intelligence screening, two-stage positive screening, Wisconsin Card Sorting Test, Wechsler Adult Intelligence Scale-Revised
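
    The study picks the optimum CLR cut-off from the ROC curve but does not state the criterion; a common choice, assumed here purely for illustration, is maximizing Youden's J (sensitivity + specificity - 1). The data below are toy values, not the study's.

```python
def youden_cutoff(values, has_condition):
    """Pick the cutoff maximizing Youden's J = sensitivity + specificity - 1.

    Lower values are treated as indicating the condition, as with the
    CLR index and intellectual disability in the record above.
    """
    best_cut, best_j = None, -1.0
    n_pos = sum(has_condition)
    n_neg = len(has_condition) - n_pos
    for cut in sorted(set(values)):
        tp = sum(1 for v, y in zip(values, has_condition) if y and v <= cut)
        fp = sum(1 for v, y in zip(values, has_condition) if not y and v <= cut)
        j = tp / n_pos + (n_neg - fp) / n_neg - 1
        if j > best_j:
            best_cut, best_j = cut, j
    return best_cut, best_j

# Toy data: low CLR scores come mostly from the ID group.
clr = [40, 45, 50, 60, 62, 65, 70, 75, 80, 85]
idg = [1,  1,  1,  1,  0,  1,  0,  0,  0,  0]
cut, j = youden_cutoff(clr, idg)
print(cut, round(j, 2))
```

    A two-stage window design would use two such cut-offs: scores below the lower one are called positive outright, scores above the upper one negative, and only the window in between goes on to the second-stage (WAIS-R) assessment.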

  17. Moving into a new era of periodontal genetic studies: relevance of large case-control samples using severe phenotypes for genome-wide association studies.

    Science.gov (United States)

    Vaithilingam, R D; Safii, S H; Baharuddin, N A; Ng, C C; Cheong, S C; Bartold, P M; Schaefer, A S; Loos, B G

    2014-12-01

    Studies to elucidate the role of genetics as a risk factor for periodontal disease have gone through various phases. In the majority of cases, the initial 'hypothesis-dependent' candidate-gene polymorphism studies did not report valid genetic risk loci. Following a large-scale replication study, these initially positive results are believed to be caused by type 1 errors. However, susceptibility genes, such as CDKN2BAS (Cyclin Dependent KiNase 2B AntiSense RNA; alias ANRIL [ANtisense Rna In the INK4 Locus]), glycosyltransferase 6 domain containing 1 (GLT6D1) and cyclooxygenase 2 (COX2), have been reported as conclusive risk loci of periodontitis. The search for genetic risk factors accelerated with the advent of 'hypothesis-free' genome-wide association studies (GWAS). However, despite many different GWAS being performed for almost all human diseases, only three GWAS on periodontitis have been published - one reported genome-wide association of GLT6D1 with aggressive periodontitis (a severe phenotype of periodontitis), whereas the remaining two, which were performed on patients with chronic periodontitis, were not able to find significant associations. This review discusses the problems faced and the lessons learned from the search for genetic risk variants of periodontitis. Current and future strategies for identifying genetic variance in periodontitis, and the importance of planning a well-designed genetic study with large and sufficiently powered case-control samples of severe phenotypes, are also discussed. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  18. Similitude and scaling of large structural elements: Case study

    Directory of Open Access Journals (Sweden)

    M. Shehadeh

    2015-06-01

    Scaled-down models are widely used for experimental investigations of large structures, owing to the limited capacities of testing facilities and the expense of experimentation. Modeling accuracy depends upon the model material properties, fabrication accuracy and loading techniques. In the present work the Buckingham π theorem is used to develop the relations (i.e., geometry, loading and properties) between a model and a large structural element such as those present in huge existing petroleum oil drilling rigs. The model is designed, loaded and treated according to a set of similitude requirements that relate it to the large structural element. Three independent scale factors, representing the three fundamental dimensions of mass, length and time, need to be selected for designing the scaled-down model. Numerical predictions of the stress distribution within the model and its elastic deformation under steady loading are made, and the results are compared with those obtained from numerical computations on the full-scale structure. The effect of scaled-down model size and material on the accuracy of the modeling technique is thoroughly examined.
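
    Once the three fundamental scale factors are fixed, the scale factor of any derived quantity follows directly from its dimensions, which is what the Buckingham π theorem guarantees. A minimal sketch, in which the 1:10 ratio and the Froude-type time scaling are illustrative assumptions rather than values from the study:

```python
def scale_factor(lam_M, lam_L, lam_T, a, b, c):
    """Scale factor (model/prototype) for a quantity of dimensions
    M^a L^b T^c, given the three fundamental scale factors."""
    return lam_M**a * lam_L**b * lam_T**c

# Illustrative (hypothetical) choice: a 1:10 geometric model of the
# same material, so density scales by 1 => lam_M = lam_L**3, with
# lam_T = lam_L**0.5 (Froude-type scaling).
lam_L = 0.1
lam_M = lam_L**3
lam_T = lam_L**0.5

force  = scale_factor(lam_M, lam_L, lam_T, 1, 1, -2)   # F     = M L T^-2
stress = scale_factor(lam_M, lam_L, lam_T, 1, -1, -2)  # sigma = M L^-1 T^-2
print(force, stress)
```

    Under these assumptions forces on the model are 1/1000 of the prototype's while stresses are 1/10 of the prototype's, so measured model stresses must be multiplied by 10 before comparison with the full-scale computation.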

  19. Collaborative Work without Large, Shared Displays: Looking for “the Big Picture” on a Small Screen?

    DEFF Research Database (Denmark)

    Hertzum, Morten

    2017-01-01

    Large, shared displays – such as electronic whiteboards – have proven successful in supporting actors in forming and maintaining an overview of tightly coupled collaborative activities. However, in many developing countries the technology of choice is the mobile phone, which has neither a large nor a shared screen. It therefore appears relevant to ask: How may mobile devices with small screens support, or fail to support, actors in forming and maintaining an overview of their collaborative activities?

  20. Large-scale preparation of hollow graphitic carbon nanospheres

    Energy Technology Data Exchange (ETDEWEB)

    Feng, Jun; Li, Fu [Key Laboratory for Liquid-Solid Structural Evolution and Processing of Materials, Ministry of Education, Shandong University, Jinan 250061 (China); Bai, Yu-Jun, E-mail: byj97@126.com [Key Laboratory for Liquid-Solid Structural Evolution and Processing of Materials, Ministry of Education, Shandong University, Jinan 250061 (China); State Key laboratory of Crystal Materials, Shandong University, Jinan 250100 (China); Han, Fu-Dong; Qi, Yong-Xin; Lun, Ning [Key Laboratory for Liquid-Solid Structural Evolution and Processing of Materials, Ministry of Education, Shandong University, Jinan 250061 (China); Lu, Xi-Feng [Lunan Institute of Coal Chemical Engineering, Jining 272000 (China)

    2013-01-15

    Hollow graphitic carbon nanospheres (HGCNSs) were synthesized on a large scale by a simple reaction between glucose and Mg at 550 °C in an autoclave. Characterization by X-ray diffraction, Raman spectroscopy and transmission electron microscopy demonstrates the formation of HGCNSs with an average diameter of about 10 nm and a wall thickness of a few graphene layers. The HGCNSs exhibit a reversible capacity of 391 mAh g⁻¹ after 60 cycles when used as anode materials for Li-ion batteries. Graphical abstract: Hollow graphitic carbon nanospheres could be prepared on a large scale by the simple reaction between glucose and Mg at 550 °C, and they exhibit electrochemical performance superior to graphite. Highlights: • HGCNSs were prepared on a large scale at 550 °C. • The preparation is simple, effective and eco-friendly. • The in situ yielded MgO nanocrystals promote the graphitization. • The HGCNSs exhibit electrochemical performance superior to graphite.

  1. Large-scale impact cratering on the terrestrial planets

    International Nuclear Information System (INIS)

    Grieve, R.A.F.

    1982-01-01

    The crater densities on the earth and moon form the basis for a standard flux-time curve that can be used in dating unsampled planetary surfaces and constraining the temporal history of endogenic geologic processes. There is abundant evidence not only that impact cratering was an important surface process in planetary history but also that large impact events produced effects that were crustal in scale. By way of example, the formation of multiring basins on the early moon was as important in defining the planetary tectonic framework as plate tectonics is on the earth. Evidence from several planets suggests that the effects of very-large-scale impacts go beyond the simple formation of an impact structure and serve to localize increased endogenic activity over an extended period of geologic time. Though no longer occurring with the frequency and magnitude of early solar system history, large-scale impact events continue to affect the local geology of the planets. 92 references

  2. Optical interconnect for large-scale systems

    Science.gov (United States)

    Dress, William

    2013-02-01

    This paper presents a switchless, optical interconnect module that serves as a node in a network of identical distribution modules for large-scale systems. Thousands to millions of hosts or endpoints may be interconnected by a network of such modules, avoiding the need for multi-level switches. Several common network topologies are reviewed and their scaling properties assessed. The concept of message-flow routing is discussed in conjunction with the unique properties enabled by the optical distribution module where it is shown how top-down software control (global routing tables, spanning-tree algorithms) may be avoided.

  3. ScreenBEAM: a novel meta-analysis algorithm for functional genomics screens via Bayesian hierarchical modeling.

    Science.gov (United States)

    Yu, Jiyang; Silva, Jose; Califano, Andrea

    2016-01-15

    Functional genomics (FG) screens, using RNAi or CRISPR technology, have become a standard tool for systematic, genome-wide loss-of-function studies for therapeutic target discovery. As in many large-scale assays, however, off-target effects, variable reagent potency and experimental noise must be appropriately accounted for to control false positives. Indeed, rigorous statistical analysis of high-throughput FG screening data remains challenging, particularly when integrative analyses are used to combine multiple sh/sgRNAs targeting the same gene in the library. We use large, publicly available RNAi and CRISPR repositories to evaluate a novel meta-analysis approach for FG screens via Bayesian hierarchical modeling, the Screening Bayesian Evaluation and Analysis Method (ScreenBEAM). Results from our analysis show that the proposed strategy, which seamlessly combines all available data, robustly outperforms classical algorithms developed for microarray data sets as well as recent approaches designed for next-generation sequencing technologies. Remarkably, the ScreenBEAM algorithm works well even when the quality of FG screens is relatively low, which accounts for about 80-95% of the public datasets. Availability: R package and source code are available at https://github.com/jyyu/ScreenBEAM. Contact: ac2248@columbia.edu, jose.silva@mssm.edu, yujiyang@gmail.com. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
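
    ScreenBEAM's full Bayesian hierarchical model is not reproduced here; as a simplified stand-in for combining multiple sh/sgRNAs per gene, the sketch below pools per-reagent effect sizes by inverse-variance weighting, so that noisy reagents contribute less to the gene-level call. The effect sizes and variances are hypothetical.

```python
import math

def gene_level_effect(effects, variances):
    """Combine per-reagent (sh/sgRNA) effect sizes into one gene-level
    estimate by inverse-variance weighting - a fixed-effect stand-in
    for ScreenBEAM's Bayesian hierarchical pooling."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_var = 1.0 / sum(weights)
    z = pooled / math.sqrt(pooled_var)   # gene-level z-score
    return pooled, z

# Five hypothetical shRNAs targeting one gene: four show depletion,
# one (an off-target or weak reagent) does not and is also noisier.
effects   = [-1.2, -0.9, -1.1, -0.1, -1.0]
variances = [0.04, 0.05, 0.04, 0.25, 0.05]
pooled, z = gene_level_effect(effects, variances)
print(round(pooled, 2), round(z, 1))
```

    The discordant reagent barely moves the pooled estimate because its large variance gives it a small weight; the hierarchical model in the paper achieves a similar down-weighting, but learns the reagent-level variability from the data instead of taking it as given.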

  4. The Nonmydriatic Fundus Camera in Diabetic Retinopathy Screening: A Cost-Effective Study with Evaluation for Future Large-Scale Application

    Directory of Open Access Journals (Sweden)

    Giuseppe Scarpa

    2016-01-01

    Full Text Available Aims. The study aimed to present the experience of a screening programme for early detection of diabetic retinopathy (DR) using a nonmydriatic fundus camera, evaluating the feasibility in terms of validity, resource absorption, and future advantages of a potential application, in an Italian local health authority. Methods. Diabetic patients living in the town of Ponzano, Veneto Region (Northern Italy), were invited to enrol in the screening programme. The "no prevention" strategy, with the inclusion of the estimated blindness-related costs, was compared with screening costs in order to evaluate a future extensive and feasible implementation of the procedure, through a budget impact approach. Results. Of 498 eligible diabetic patients, 80% were enrolled in the screening programme. 115 patients (34%) were referred to an ophthalmologist and 9 cases required prompt treatment for either proliferative DR or macular edema. Based on the pilot data, it emerged that an extensive use of the investigated screening programme, within the Greater Treviso area, could prevent 6 cases of blindness every year, resulting in a saving of €271,543.32 (−13.71%). Conclusions. Fundus images obtained with a nonmydriatic fundus camera could be considered an effective, cost-sparing, and feasible screening tool for the early detection of DR, preventing blindness as a result of diabetes.
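    The budget impact comparison in the abstract reduces to simple scenario arithmetic, sketched below. The input figures are hypothetical placeholders, not the study's actual cost data.

```python
def budget_impact(screening_cost, treated_cost, no_screening_blindness_cost):
    """Net saving of a screening programme versus 'no prevention':
    compare total spending in the two scenarios. Positive saving means
    the programme pays for itself. All inputs are hypothetical.
    """
    with_screening = screening_cost + treated_cost
    saving = no_screening_blindness_cost - with_screening
    return saving, 100.0 * saving / no_screening_blindness_cost

# Hypothetical annual figures for one health authority.
saving, pct = budget_impact(screening_cost=250_000.0,
                            treated_cost=400_000.0,
                            no_screening_blindness_cost=750_000.0)
```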

  5. Attitude stabilization of a spacecraft equipped with large electrostatic protection screens

    Science.gov (United States)

    Nikitin, D. Yu.; Tikhonov, A. A.

    2018-05-01

    A satellite with a system of three electrostatic radiation protection (ERP) screens is under consideration. The screens are constructed as electrostatically charged toroidal shields with a characteristic size of the order of 100 m. The interaction of the electric charge with the Earth's magnetic field (EMF) gives rise to a Lorentz torque acting on the satellite attitude motion. As the ERP system is large, we derive the Lorentz torque taking into account the complex form of the ERP screens and the gradient of the EMF over the screen volume. It is assumed that the satellite center of charge coincides with the satellite mass center. The EMF is modeled as a straight magnetic dipole. In the paper we investigate the use of the Lorentz torque for passive attitude stabilization of a satellite in a circular equatorial orbit. A mathematical model for the attitude dynamics of a satellite equipped with ERP screens interacting with the EMF is derived, and a first integral of the corresponding differential equations is constructed. The straight equilibrium position of the satellite in the orbital frame is found. Sufficient conditions for stability of the satellite equilibrium position are obtained with the use of the first integral. The gravity gradient torque is taken into account. The satellite equilibrium stability domain is constructed.
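    As a hedged sketch of the torque in question (the symbols are assumptions, not the paper's notation): for a charge distribution ρ(r) moving with velocity v(r) through the geomagnetic field B, the Lorentz torque about the mass center O is

```latex
\mathbf{M}_L=\int_V \rho(\mathbf{r})\,\mathbf{r}\times\bigl(\mathbf{v}(\mathbf{r})\times\mathbf{B}(\mathbf{r})\bigr)\,dV,
\qquad
\mathbf{B}(\mathbf{r})\approx\mathbf{B}_0+(\mathbf{r}\cdot\nabla)\mathbf{B}\big|_{O}.
```

    Retaining the first-order gradient term of B over the screen volume, rather than evaluating B only at the mass center, is what distinguishes the large-screen case from a point-charge model, as the abstract emphasizes.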

  6. [A large-scale accident in Alpine terrain].

    Science.gov (United States)

    Wildner, M; Paal, P

    2015-02-01

    Due to the geographical conditions, large-scale accidents amounting to mass casualty incidents (MCI) in Alpine terrain regularly present rescue teams with huge challenges. Using an example incident, specific conditions and typical problems associated with such a situation are presented. The first rescue team members to arrive have the elementary tasks of qualified triage and communication to the control room, which is required to dispatch the necessary additional support. Only with a clear "concept", to which all have to adhere, can the subsequent chaos phase be limited. In this respect, the time factor, compounded by adverse weather conditions or darkness, creates enormous pressure. Additional hazards are frostbite and hypothermia. If priorities can be established in terms of urgency, treatment and procedure algorithms have proven successful. For evacuation of casualties, a helicopter should be sought. Due to the low density of hospitals in Alpine regions, it is often necessary to distribute the patients over a wide area. Rescue operations in Alpine terrain have to be performed according to the particular conditions and require rescue teams to have specific knowledge and expertise. The possibility of a large-scale accident should be considered when planning events. With respect to optimization of rescue measures, regular training and exercises are sensible, as is the analysis of previous large-scale Alpine accidents.

  7. Hierarchical Cantor set in the large scale structure with torus geometry

    Energy Technology Data Exchange (ETDEWEB)

    Murdzek, R. [Physics Department, ' Al. I. Cuza' University, Blvd. Carol I, Nr. 11, Iassy 700506 (Romania)], E-mail: rmurdzek@yahoo.com

    2008-12-15

    The formation of large-scale structures is considered within a model with a string on a toroidal space-time. First, the space-time geometry is presented. In this geometry, the Universe is represented by a string describing a torus surface. The large-scale structure of the Universe is then derived from the string oscillations. The results are in agreement with the cellular structure of the large-scale distribution and with the theory of a Cantorian space-time.

  8. Large-scale Motion of Solar Filaments

    Indian Academy of Sciences (India)

    Large-scale Motion of Solar Filaments. Pavel Ambrož, Astronomical Institute of the Acad. Sci. of the Czech Republic, CZ-25165 Ondrejov, The Czech Republic. e-mail: pambroz@asu.cas.cz. Alfred Schroll, Kanzelhöhe Solar Observatory of the University of Graz, A-9521 Treffen, Austria. e-mail: schroll@solobskh.ac.at.

  9. Sensitivity analysis for large-scale problems

    Science.gov (United States)

    Noor, Ahmed K.; Whitworth, Sandra L.

    1987-01-01

    The development of efficient techniques for calculating sensitivity derivatives is studied. The objective is to present a computational procedure for calculating sensitivity derivatives as part of performing structural reanalysis for large-scale problems. The scope is limited to framed type structures. Both linear static analysis and free-vibration eigenvalue problems are considered.

  10. Data-driven process decomposition and robust online distributed modelling for large-scale processes

    Science.gov (United States)

    Shu, Zhang; Lijuan, Li; Lijuan, Yao; Shipin, Yang; Tao, Zou

    2018-02-01

    With the increasing attention to networked control, system decomposition and distributed models are of significant importance in the implementation of model-based control strategies. In this paper, a data-driven system decomposition and online distributed subsystem modelling algorithm is proposed for large-scale chemical processes. The key controlled variables are first partitioned into several clusters by the affinity propagation clustering algorithm; each cluster can be regarded as a subsystem. The inputs of each subsystem are then selected by offline canonical correlation analysis between all process variables and the subsystem's controlled variables. Process decomposition is thus realised after the screening of input and output variables. Once the system decomposition is finished, online subsystem modelling can be carried out by recursively renewing the samples block-wise. The proposed algorithm was applied to the Tennessee Eastman process and its validity was verified.
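    The first step, grouping controlled variables into subsystems by affinity propagation, can be sketched with scikit-learn on synthetic data. The data, the squared-correlation similarity, and the two "process modes" are illustrative assumptions; the paper's subsequent canonical correlation step for input selection is not shown.

```python
import numpy as np
from sklearn.cluster import AffinityPropagation

rng = np.random.default_rng(0)
modes = rng.normal(size=(200, 2))  # two independent latent process modes
noise = lambda: 0.05 * rng.normal(size=200)
# Six controlled variables: y0-y2 track mode 0, y3-y5 track mode 1.
Y = np.column_stack([modes[:, 0] + noise() for _ in range(3)]
                    + [modes[:, 1] + noise() for _ in range(3)])

# Similarity between controlled variables: squared correlation.
S = np.corrcoef(Y.T) ** 2
ap = AffinityPropagation(affinity="precomputed", random_state=0).fit(S)
labels = ap.labels_  # each cluster = one candidate subsystem
```

    Each resulting cluster would then be paired with candidate inputs via offline canonical correlation analysis before the online recursive modelling stage.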

  11. Topology Optimization of Large Scale Stokes Flow Problems

    DEFF Research Database (Denmark)

    Aage, Niels; Poulsen, Thomas Harpsøe; Gersborg-Hansen, Allan

    2008-01-01

    This note considers topology optimization of large scale 2D and 3D Stokes flow problems using parallel computations. We solve problems with up to 1.125.000 elements in 2D and 128.000 elements in 3D on a shared memory computer consisting of Sun UltraSparc IV CPUs.

  12. Development of a first-principles code based on the screened KKR method for large super-cells

    International Nuclear Information System (INIS)

    Doi, S; Ogura, M; Akai, H

    2013-01-01

    The procedures for performing first-principles electronic structure calculations using the Korringa-Kohn-Rostoker (KKR) and screened KKR methods are reviewed, with emphasis on their numerical efficiency. It is shown that an iterative matrix inversion combined with a suitable preconditioning greatly improves the computational time of the screened KKR method. The method is well parallelized and also has an O(N) scaling property.
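    The idea of "iterative matrix inversion combined with a suitable preconditioning" can be sketched generically with SciPy. The matrix below is a toy sparse system standing in for the screened KKR matrix (whose screened representation is sparse/banded); the incomplete-LU preconditioner plays the role of the preconditioning the abstract credits for the speedup.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Toy sparse, diagonally dominant matrix standing in for the screened
# KKR matrix; the real matrix comes from the multiple-scattering problem.
n = 500
rng = np.random.default_rng(1)
A = sp.diags([-1.0, 4.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
A = A + sp.random(n, n, density=0.002, random_state=1, format="csc")
b = rng.normal(size=n)

# Incomplete-LU factorization as the preconditioner that accelerates
# the iterative (Krylov) inversion.
ilu = spla.spilu(A)
M = spla.LinearOperator((n, n), ilu.solve)

# Preconditioned GMRES solve instead of a dense O(N^3) inversion.
x, info = spla.gmres(A, b, M=M)
```

    Solving one column (or block) at a time this way is what makes an O(N)-scaling, well-parallelized scheme possible, since each solve touches only the sparse screened structure.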

  13. The Cosmology Large Angular Scale Surveyor

    Science.gov (United States)

    Harrington, Kathleen; Marriage, Tobias; Ali, Aamir; Appel, John; Bennett, Charles; Boone, Fletcher; Brewer, Michael; Chan, Manwei; Chuss, David T.; Colazo, Felipe; hide

    2016-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) is a four telescope array designed to characterize relic primordial gravitational waves from inflation and the optical depth to reionization through a measurement of the polarized cosmic microwave background (CMB) on the largest angular scales. The frequencies of the four CLASS telescopes, one at 38 GHz, two at 93 GHz, and one dichroic system at 145/217 GHz, are chosen to avoid spectral regions of high atmospheric emission and span the minimum of the polarized Galactic foregrounds: synchrotron emission at lower frequencies and dust emission at higher frequencies. Low-noise transition edge sensor detectors and a rapid front-end polarization modulator provide a unique combination of high sensitivity, stability, and control of systematics. The CLASS site, at 5200 m in the Chilean Atacama desert, allows for daily mapping of up to 70% of the sky and enables the characterization of CMB polarization at the largest angular scales. Using this combination of a broad frequency range, large sky coverage, control over systematics, and high sensitivity, CLASS will observe the reionization and recombination peaks of the CMB E- and B-mode power spectra. CLASS will make a cosmic variance limited measurement of the optical depth to reionization and will measure or place upper limits on the tensor-to-scalar ratio, r, down to a level of 0.01 (95% C.L.).

  14. Prehospital Acute Stroke Severity Scale to Predict Large Artery Occlusion: Design and Comparison With Other Scales.

    Science.gov (United States)

    Hastrup, Sidsel; Damgaard, Dorte; Johnsen, Søren Paaske; Andersen, Grethe

    2016-07-01

    We designed and validated a simple prehospital stroke scale to identify emergent large vessel occlusion (ELVO) in patients with acute ischemic stroke and compared the scale to other published scales for prediction of ELVO. A national historical test cohort of 3127 patients with information on intracranial vessel status (angiography) before reperfusion therapy was identified. National Institutes of Health Stroke Scale (NIHSS) items with the highest predictive value for occlusion of a large intracranial artery were identified, and the most optimal combination meeting predefined criteria to ensure usefulness in the prehospital phase was determined. The predictive performance of the Prehospital Acute Stroke Severity (PASS) scale was compared with other published scales for ELVO. The PASS scale was composed of 3 NIHSS scores: level of consciousness (month/age), gaze palsy/deviation, and arm weakness. In the derivation of PASS, 2/3 of the test cohort was used and showed accuracy (area under the curve) of 0.76 for detecting large arterial occlusion. The optimal cut point of ≥2 abnormal scores showed: sensitivity=0.66 (95% CI, 0.62-0.69), specificity=0.83 (0.81-0.85), and area under the curve=0.74 (0.72-0.76). Validation on 1/3 of the test cohort showed similar performance. Patients with a large artery occlusion on angiography with PASS ≥2 had a median NIHSS score of 17 (interquartile range=6) as opposed to PASS <2 with a median NIHSS score of 6 (interquartile range=5). The PASS scale performed comparably to other published scales predicting ELVO while being simpler. The PASS scale is simple and has promising accuracy for prediction of ELVO in the field. © 2016 American Heart Association, Inc.
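    The scale's scoring rule is simple enough to state as code. This is a minimal sketch of how the ≥2 cut point from the abstract could be applied; the exact encoding of each NIHSS item as "abnormal" is an assumption for illustration, not a clinical tool.

```python
def pass_score(loc_questions_abnormal, gaze_palsy, arm_weakness):
    """Prehospital Acute Stroke Severity (PASS) scale as described above:
    one point per abnormal NIHSS-derived item.
      loc_questions_abnormal: wrong answer on month/age questions
      gaze_palsy:             gaze palsy/deviation present
      arm_weakness:           arm weakness/drift present
    A score >= 2 flags suspected emergent large vessel occlusion (ELVO).
    """
    score = (int(bool(loc_questions_abnormal))
             + int(bool(gaze_palsy))
             + int(bool(arm_weakness)))
    return score, score >= 2

# Example: oriented patient with gaze deviation and arm weakness.
score, suspect_elvo = pass_score(False, True, True)  # → (2, True)
```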

  15. 1001 Ways to run AutoDock Vina for virtual screening

    NARCIS (Netherlands)

    Jaghoori, Mohammad Mahdi; Bleijlevens, Boris; Olabarriaga, Silvia D.

    2016-01-01

    Large-scale computing technologies have enabled high-throughput virtual screening involving thousands to millions of drug candidates. It is not trivial, however, for biochemical scientists to evaluate the technical alternatives and their implications for running such large experiments. Besides
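    One practical pattern for such high-throughput screening is generating one Vina invocation per ligand and distributing the resulting commands over a cluster. The sketch below only builds the command strings; file names and the config layout are illustrative assumptions, while the flags follow Vina's documented CLI (--receptor/--ligand/--config/--out).

```python
from pathlib import Path

def vina_commands(receptor, ligands, out_dir, config="conf.txt"):
    """Build one AutoDock Vina command line per ligand file for a
    batch virtual screening run. The commands can then be dispatched
    to whatever large-scale computing backend is available."""
    cmds = []
    for lig in ligands:
        stem = Path(lig).stem
        cmds.append(f"vina --receptor {receptor} --ligand {lig} "
                    f"--config {config} --out {out_dir}/{stem}_docked.pdbqt")
    return cmds

# Hypothetical two-ligand batch.
cmds = vina_commands("target.pdbqt", ["ZINC001.pdbqt", "ZINC002.pdbqt"],
                     "results")
```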

  16. Homogenizing bacterial cell factories: Analysis and engineering of phenotypic heterogeneity.

    Science.gov (United States)

    Binder, Dennis; Drepper, Thomas; Jaeger, Karl-Erich; Delvigne, Frank; Wiechert, Wolfgang; Kohlheyer, Dietrich; Grünberger, Alexander

    2017-07-01

    In natural habitats, microbes form multispecies communities that commonly face rapidly changing and highly competitive environments. Thus, phenotypic heterogeneity has evolved as an innate and important survival strategy to gain an overall fitness advantage over cohabiting competitors. However, in defined artificial environments such as monocultures in small- to large-scale bioreactors, cell-to-cell variations are presumed to cause reduced production yields as well as process instability. Hence, engineering microbial production toward phenotypic homogeneity is a highly promising approach for synthetic biology and bioprocess optimization. In this review, we discuss recent studies that have unraveled the cell-to-cell heterogeneity observed during bacterial gene expression and metabolite production as well as the molecular mechanisms involved. In addition, current single-cell technologies are briefly reviewed with respect to their applicability in exploring cell-to-cell variations. We highlight emerging strategies and tools to reduce phenotypic heterogeneity in biotechnological expression setups. Here, strain or inducer modifications are combined with cell physiology manipulations to achieve the ultimate goal of equalizing bacterial populations. In this way, the majority of cells can be forced into high productivity, thus reducing less productive subpopulations that tend to consume valuable resources during production. Modifications in uptake systems, inducer molecules or nutrients represent valuable tools for diminishing heterogeneity. Finally, we address the challenge of transferring homogeneously responding cells into large-scale bioprocesses. Environmental heterogeneity originating from extrinsic factors such as stirring speed and pH, oxygen, temperature or nutrient distribution can significantly influence cellular physiology. We conclude that engineering microbial populations toward phenotypic homogeneity is an increasingly important task to take biotechnological…

  17. Proportional incidence and radiological review of large (T2+) breast cancers as surrogate indicators of screening programme performance

    International Nuclear Information System (INIS)

    Ciatto, S.; Bernardi, D.; Pellegrini, M.; Borsato, G.; Peterlongo, P.; Gentilini, M.A.; Caumo, F.; Frigerio, A.; Houssami, N.

    2012-01-01

    Surrogate measures of screening performance [e.g. interval cancer (IC) proportional incidence] allow timely monitoring of sensitivity and quality. This study explored measures using large (T2+) breast cancers as potential indicators of screening performance. The proportional incidence of T2+ cancers (observed/expected cases) in a population-based screening programme (Trento, 2001-2009) was estimated. A parallel review of 'negative' preceding mammograms for screen-detected T2+ and for all ICs, using 'blinded' independent readings and case-mixes (54 T2+, 50 ICs, 170 controls) was also performed. T2+ cancers were observed in 168 screening participants: 48 at first screen, 67 at repeat screening and 53 ICs. The T2+ estimated proportional incidence was 68% (observed/expected = 168/247), corresponding to an estimated 32% reduction in the rate of T2+ cancers in screening participants relative to that expected without screening. Majority review classified 27.8% (15/54) of T2+ and 28% (14/50) of ICs as screening error (P = 0.84), with variable recall rates amongst radiologists (8.8-15.2%). T2+ review could be integrated as part of quality monitoring and potentially prove more feasible than IC review for some screening services. • Interval breast cancers, assumed as screening failures, are monitored to estimate screening performance. • Large (T2+) cancers at screening may also represent failed prior screening detection. • Analysis of T2+ lesions may be more feasible than assessing interval cancers. • Analysis of T2+ cancers is a potential further measure of screening performance. (orig.)

  18. Proportional incidence and radiological review of large (T2+) breast cancers as surrogate indicators of screening programme performance

    Energy Technology Data Exchange (ETDEWEB)

    Ciatto, S.; Bernardi, D.; Pellegrini, M.; Borsato, G.; Peterlongo, P. [APSS, U.O. Senologia Clinica e Screening Mammografico, Dipartimento di Radiodiagnostica, Trento (Italy); Gentilini, M.A. [APSS, Servizio Osservatorio Epidemiologico, Direzione promozione ed educazione alla salute, Trento (Italy); Caumo, F. [Centro di Prevenzione Senologica, Verona (Italy); Frigerio, A. [CRR, Centro di Riferimento Regionale per lo Screening Mammografico, Torino (Italy); Houssami, N. [University of Sydney, Screening and Test Evaluation Program, School of Public Health, Sydney Medical School, Sydney (Australia)

    2012-06-15

    Surrogate measures of screening performance [e.g. interval cancer (IC) proportional incidence] allow timely monitoring of sensitivity and quality. This study explored measures using large (T2+) breast cancers as potential indicators of screening performance. The proportional incidence of T2+ cancers (observed/expected cases) in a population-based screening programme (Trento, 2001-2009) was estimated. A parallel review of 'negative' preceding mammograms for screen-detected T2+ and for all ICs, using 'blinded' independent readings and case-mixes (54 T2+, 50 ICs, 170 controls) was also performed. T2+ cancers were observed in 168 screening participants: 48 at first screen, 67 at repeat screening and 53 ICs. The T2+ estimated proportional incidence was 68% (observed/expected = 168/247), corresponding to an estimated 32% reduction in the rate of T2+ cancers in screening participants relative to that expected without screening. Majority review classified 27.8% (15/54) of T2+ and 28% (14/50) of ICs as screening error (P = 0.84), with variable recall rates amongst radiologists (8.8-15.2%). T2+ review could be integrated as part of quality monitoring and potentially prove more feasible than IC review for some screening services. • Interval breast cancers, assumed as screening failures, are monitored to estimate screening performance. • Large (T2+) cancers at screening may also represent failed prior screening detection. • Analysis of T2+ lesions may be more feasible than assessing interval cancers. • Analysis of T2+ cancers is a potential further measure of screening performance. (orig.)

  19. Analysis using large-scale ringing data

    Directory of Open Access Journals (Sweden)

    Baillie, S. R.

    2004-06-01

    Full Text Available Birds are highly mobile organisms and there is increasing evidence that studies at large spatial scales are needed if we are to properly understand their population dynamics. While classical metapopulation models have rarely proved useful for birds, more general metapopulation ideas involving collections of populations interacting within spatially structured landscapes are highly relevant (Harrison, 1994). There is increasing interest in understanding patterns of synchrony, or lack of synchrony, between populations and the environmental and dispersal mechanisms that bring about these patterns (Paradis et al., 2000). To investigate these processes we need to measure abundance, demographic rates and dispersal at large spatial scales, in addition to gathering data on relevant environmental variables. There is an increasing realisation that conservation needs to address rapid declines of common and widespread species (they will not remain so if such trends continue) as well as the management of small populations that are at risk of extinction. While the knowledge needed to support the management of small populations can often be obtained from intensive studies in a few restricted areas, conservation of widespread species often requires information on population trends and processes measured at regional, national and continental scales (Baillie, 2001). While management prescriptions for widespread populations may initially be developed from a small number of local studies or experiments, there is an increasing need to understand how such results will scale up when applied across wider areas. There is also a vital role for monitoring at large spatial scales both in identifying such population declines and in assessing population recovery. Gathering data on avian abundance and demography at large spatial scales usually relies on the efforts of large numbers of skilled volunteers. Volunteer studies based on ringing (for example Constant Effort Sites [CES

  20. Clinical events in a large prospective cohort of children with sickle cell disease in Nagpur, India: evidence against a milder clinical phenotype in India.

    Science.gov (United States)

    Jain, Dipty; Arjunan, Aishwarya; Sarathi, Vijaya; Jain, Harshwardhan; Bhandarwar, Amol; Vuga, Marike; Krishnamurti, Lakshmanan

    2016-10-01

    The clinical phenotype of sickle cell disease (SCD) has been reported to be milder in India than in the United States. The objective of this large single-center study was to examine the rate of complications to define the phenotype of SCD in India. The rate of complications per 100 person-years in 833 pediatric SCD patients for 1954 person-years in Nagpur, India including those diagnosed on newborn screen (NBS) and those presenting later in childhood (non-NBS) was compared to those reported in the Cooperative Study of Sickle Cell Disease (CSSCD). Event rates were also compared between patients belonging to scheduled castes (SCs), scheduled tribes (STs), and other backward classes (OBC). Comparison of CSSCD versus Nagpur NBS versus Nagpur non-NBS for rates of pain (32.4 vs. 85.2 vs. 62.4), severe anemia (7.1 vs. 27 vs. 6.6), stroke (0.7 vs. 0.8 vs. 1.4), splenic sequestration (3.4 vs. 6.7 vs. 1.6), acute chest syndrome (24.5 vs. 23.6 vs. 1.0), and meningitis (0.8 vs. 0 vs. 0.1) revealed more frequent complications in Nagpur compared to CSSCD. Comparison of ST, SC, and OBC for rates of pain (84.6 vs. 71.9 vs. 63.5), acute chest syndrome (3.6 vs. 2.8 vs. 2.2), severe anemia (5.4 vs. 9.5 vs. 11.4), stroke (1.2 vs. 0.4 vs. 0.3), splenic sequestration (0.6 vs. 2.4 vs. 1.9), and meningitis (0.8 vs. 0 vs. 0.1) revealed significantly more frequent complications among ST. SCD-related complications are more frequent in Indian children than that observed in CSSCD. Further study is indicated to define SCD phenotype in India. © 2016 Wiley Periodicals, Inc.

  1. Large screen mimic display design research for advanced main control room in nuclear power plant

    International Nuclear Information System (INIS)

    Zheng Mingguang; Yang Yanhua; Xu Jijun; Zhang Qinshun; Ning Zhonghe

    2002-01-01

    First, the evolution of mimic diagrams and displays used in nuclear power plant main control rooms is reviewed. The functions of mimic diagrams are analyzed with respect to relieving operator psychological burden and stress, and to assisting operators with information searching, status understanding, manual actuation, correct decision making, and the safe and reliable operation of the nuclear power plant. The importance and necessity of using (large screen) mimic diagrams in the advanced main control room of a nuclear power plant, along with the design principles, design details and verification measures for the large screen mimic display, are also described.

  2. Fast Simulation of Large-Scale Floods Based on GPU Parallel Computing

    OpenAIRE

    Qiang Liu; Yi Qin; Guodong Li

    2018-01-01

    Computing speed is a significant issue in large-scale flood simulation for real-time response to disaster prevention and mitigation. Even today, most large-scale flood simulations are run on supercomputers due to the massive amounts of data and computation necessary. In this work, a two-dimensional shallow water model based on an unstructured Godunov-type finite volume scheme was proposed for flood simulation. To realize a fast simulation of large-scale floods on a personal...
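    The per-cell flux updates of such a Godunov-type finite volume scheme are exactly what maps well onto a GPU, since every cell is updated independently. As a hedged illustration (not the paper's 2D unstructured scheme), here is a minimal 1D shallow-water step with a Rusanov (local Lax-Friedrichs) flux on a periodic grid:

```python
import numpy as np

g = 9.81  # gravitational acceleration (m/s^2)

def shallow_water_step(h, hu, dx, dt):
    """One explicit finite-volume update of the 1D shallow water
    equations, U = (h, hu), with a Rusanov flux and periodic
    boundaries. Each cell update depends only on its neighbours,
    which is what makes the scheme embarrassingly parallel."""
    u = hu / h
    c = np.sqrt(g * h)
    f_h, f_hu = hu, hu * u + 0.5 * g * h**2   # physical flux F(U)
    # Left/right states at interface i+1/2 (periodic wrap via roll)
    hR, huR = np.roll(h, -1), np.roll(hu, -1)
    s = np.abs(u) + c
    a = np.maximum(s, np.roll(s, -1))          # local max wave speed
    # Rusanov numerical flux at interface i+1/2
    Fh = 0.5 * (f_h + np.roll(f_h, -1)) - 0.5 * a * (hR - h)
    Fhu = 0.5 * (f_hu + np.roll(f_hu, -1)) - 0.5 * a * (huR - hu)
    # Conservative update: U_i -= dt/dx * (F_{i+1/2} - F_{i-1/2})
    return h - dt / dx * (Fh - np.roll(Fh, 1)), \
           hu - dt / dx * (Fhu - np.roll(Fhu, 1))

# Dam-break-like initial condition on a periodic domain.
n, dx = 200, 1.0
h = np.where(np.arange(n) < n // 2, 2.0, 1.0).astype(float)
hu = np.zeros(n)
for _ in range(50):
    h, hu = shallow_water_step(h, hu, dx, dt=0.05)
```

    On a GPU, the same update would be one kernel launch per time step, with one thread per cell (or per unstructured-mesh face).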

  3. Managing Risk and Uncertainty in Large-Scale University Research Projects

    Science.gov (United States)

    Moore, Sharlissa; Shangraw, R. F., Jr.

    2011-01-01

    Both publicly and privately funded research projects managed by universities are growing in size and scope. Complex, large-scale projects (over $50 million) pose new management challenges and risks for universities. This paper explores the relationship between project success and a variety of factors in large-scale university projects. First, we…

  4. Parallel clustering algorithm for large-scale biological data sets.

    Science.gov (United States)

    Wang, Minchao; Zhang, Wu; Ding, Wang; Dai, Dongbo; Zhang, Huiran; Xie, Hao; Chen, Luonan; Guo, Yike; Xie, Jiang

    2014-01-01

    The recent explosion of biological data brings a great challenge for traditional clustering algorithms. With the increasing scale of data sets, much larger memory and longer runtime are required for cluster identification problems. The affinity propagation algorithm outperforms many other classical clustering algorithms and is widely applied in biological research. However, its time and space complexity become a great bottleneck when handling large-scale data sets. Moreover, the similarity matrix, whose construction takes a long runtime, is required before running the affinity propagation algorithm, since the algorithm clusters data sets based on the similarities between data pairs. Two types of parallel architectures are proposed in this paper to accelerate the similarity matrix construction and the affinity propagation algorithm. The shared-memory architecture is used to construct the similarity matrix, and the distributed system is used for the affinity propagation algorithm, because of its large memory size and great computing capacity. An appropriate scheme of data partition and reduction is designed in our method, in order to minimize the global communication cost among processes. A speedup of 100 is gained with 128 cores. The runtime is reduced from several hours to a few seconds, which indicates that the parallel algorithm is capable of handling large-scale data sets effectively. The parallel affinity propagation also achieves a good performance when clustering large-scale gene data (microarray) and detecting families in large protein superfamilies.
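    The similarity-matrix construction the paper parallelizes on shared memory amounts to partitioning the rows across workers. A small-scale sketch of that data-partitioning idea (threads standing in for the paper's shared-memory processes, with an illustrative negative-squared-Euclidean similarity):

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def similarity_chunk(X, rows):
    """Negative squared Euclidean similarities (the measure affinity
    propagation commonly uses) for one block of rows."""
    diff = X[rows, None, :] - X[None, :, :]
    return rows, -np.sum(diff**2, axis=-1)

def similarity_matrix(X, n_workers=4):
    """Assemble the full similarity matrix by distributing row blocks
    across workers; each block is independent, so there is no
    communication until the final write-back."""
    n = X.shape[0]
    S = np.empty((n, n))
    blocks = np.array_split(np.arange(n), n_workers)
    with ThreadPoolExecutor(max_workers=n_workers) as ex:
        for rows, block in ex.map(lambda r: similarity_chunk(X, r), blocks):
            S[rows] = block
    return S

X = np.random.default_rng(2).normal(size=(64, 8))
S = similarity_matrix(X)
```

    Because the row blocks share nothing, the same decomposition scales from threads on one node up to the distributed message-passing setting the paper uses for the propagation step itself.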

  5. Adaptive visualization for large-scale graph

    International Nuclear Information System (INIS)

    Nakamura, Hiroko; Shinano, Yuji; Ohzahata, Satoshi

    2010-01-01

    We propose an adaptive visualization technique for representing a large-scale hierarchical dataset within limited display space. A hierarchical dataset has nodes and links showing the parent-child relationships between the nodes. These nodes and links are described using graphics primitives. When the number of these primitives is large, it is difficult to recognize the structure of the hierarchical data because many primitives overlap within a limited region. To overcome this difficulty, we propose an adaptive visualization technique for hierarchical datasets. The proposed technique selects an appropriate graph style according to the nodal density in each area. (author)
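    The density-driven style selection can be sketched in a few lines. The grid resolution, the threshold, and the two style names below are illustrative assumptions, not the paper's actual parameters:

```python
import numpy as np

def pick_styles(node_xy, grid=4, threshold=10):
    """Choose a drawing style per screen-space cell from node density:
    dense cells get an aggregated glyph, sparse cells keep full
    node-link detail."""
    xy = np.asarray(node_xy, float)
    lo, hi = xy.min(0), xy.max(0)
    # Normalise coordinates and bin them into a grid x grid lattice.
    cells = np.minimum((grid * (xy - lo) / (hi - lo + 1e-12)).astype(int),
                       grid - 1)
    counts = np.zeros((grid, grid), int)
    for cx, cy in cells:
        counts[cx, cy] += 1
    return np.where(counts > threshold, "aggregated", "detailed")

# 95 nodes clustered tightly, 5 spread across the display.
rng = np.random.default_rng(3)
xy = np.vstack([rng.normal(0.1, 0.02, size=(95, 2)),
                rng.uniform(0, 1, size=(5, 2))])
styles = pick_styles(xy)
```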

  6. Stabilization Algorithms for Large-Scale Problems

    DEFF Research Database (Denmark)

    Jensen, Toke Koldborg

    2006-01-01

    The focus of the project is on stabilization of large-scale inverse problems where structured models and iterative algorithms are necessary for computing approximate solutions. For this purpose, we study various iterative Krylov methods and their abilities to produce regularized solutions. […]-curve. This heuristic is implemented as part of a larger algorithm which is developed in collaboration with G. Rodriguez and P. C. Hansen. Last, but not least, a large part of the project has, in different ways, revolved around the object-oriented Matlab toolbox MOORe Tools developed by PhD Michael Jacobsen. New...

  7. Screening for anxiety, depression, and anxious depression in primary care

    DEFF Research Database (Denmark)

    Goldberg, David P.; Reed, Geoffrey M.; Robles, Rebeca

    2017-01-01

    Background In this field study of WHO's revised classification of mental disorders for primary care settings, the ICD-11 PHC, we tested the usefulness of two five-item screening scales for anxiety and depression to be administered in primary care settings. Methods The study was conducted in primary care settings in four large middle-income countries. Primary care physicians (PCPs) referred individuals who they suspected might be psychologically distressed to the study. Screening scales as well as a structured diagnostic interview, the revised Clinical Interview Schedule (CIS-R), adapted […] in primary care settings. Conclusions The two five-item screening scales for anxiety and depression provide a practical way for PCPs to evaluate the likelihood of mood and anxiety disorders without paper and pencil measures that are not feasible in many settings. These scales may provide substantially […]

  8. Design study on sodium cooled large-scale reactor

    International Nuclear Information System (INIS)

    Murakami, Tsutomu; Hishida, Masahiko; Kisohara, Naoyuki

    2004-07-01

    In Phase 1 of the 'Feasibility Studies on Commercialized Fast Reactor Cycle Systems (F/S)', an advanced loop type reactor was selected as a promising concept for a sodium-cooled large-scale reactor with the possibility of fulfilling the design requirements of the F/S. In Phase 2, design improvement for further cost reduction and establishment of the plant concept has been performed. This report summarizes the results of the design study on the sodium-cooled large-scale reactor performed in JFY2003, the third year of Phase 2. In the JFY2003 design study, critical subjects related to safety, structural integrity and thermal hydraulics identified in the previous fiscal year were examined and the plant concept was modified accordingly. Furthermore, fundamental specifications of the main systems and components were set and the economics were evaluated. In addition, since the interim evaluation of the candidate concepts for the FBR fuel cycle is to be conducted, cost effectiveness and achievability of the development goal were evaluated and data for the three large-scale reactor candidate concepts were prepared. As a result of this study, a plant concept for the sodium-cooled large-scale reactor has been constructed which promises to satisfy the economic goal (construction cost: less than 200,000 yen/kWe, etc.) and to resolve the critical subjects. From now on, reflecting the results of elemental experiments, the preliminary conceptual design of this plant will proceed toward the narrowing down of candidate concepts at the end of Phase 2. (author)

  9. Design study on sodium-cooled large-scale reactor

    International Nuclear Information System (INIS)

    Shimakawa, Yoshio; Nibe, Nobuaki; Hori, Toru

    2002-05-01

    In Phase 1 of the 'Feasibility Study on Commercialized Fast Reactor Cycle Systems (F/S)', an advanced loop-type reactor was selected as a promising concept for a sodium-cooled large-scale reactor with the potential to fulfill the design requirements of the F/S. In Phase 2 of the F/S, it is planned to proceed with a preliminary conceptual design of a sodium-cooled large-scale reactor based on the design of the advanced loop-type reactor. Through the design study, it is intended to construct a plant concept that can demonstrate its attractiveness and competitiveness as a commercialized reactor. This report summarizes the results of the design study on the sodium-cooled large-scale reactor performed in JFY2001, the first year of Phase 2. In the JFY2001 design study, a plant concept was constructed based on the design of the advanced loop-type reactor, and fundamental specifications of the main systems and components were set. Furthermore, critical subjects related to safety, structural integrity, thermal hydraulics, operability, maintainability and economy were examined and evaluated. As a result of this study, a plant concept for the sodium-cooled large-scale reactor has been constructed which has a prospect of satisfying the economic goal (construction cost: less than 200,000 yen/kWe, etc.) and of resolving the critical subjects. From now on, reflecting the results of elemental experiments, the preliminary conceptual design of this plant will proceed toward the selection and narrowing-down of candidate concepts at the end of Phase 2. (author)

  10. Large scale CMB anomalies from thawing cosmic strings

    Energy Technology Data Exchange (ETDEWEB)

    Ringeval, Christophe [Centre for Cosmology, Particle Physics and Phenomenology, Institute of Mathematics and Physics, Louvain University, 2 Chemin du Cyclotron, 1348 Louvain-la-Neuve (Belgium); Yamauchi, Daisuke; Yokoyama, Jun' ichi [Research Center for the Early Universe (RESCEU), Graduate School of Science, The University of Tokyo, Tokyo 113-0033 (Japan); Bouchet, François R., E-mail: christophe.ringeval@uclouvain.be, E-mail: yamauchi@resceu.s.u-tokyo.ac.jp, E-mail: yokoyama@resceu.s.u-tokyo.ac.jp, E-mail: bouchet@iap.fr [Institut d' Astrophysique de Paris, UMR 7095-CNRS, Université Pierre et Marie Curie, 98bis boulevard Arago, 75014 Paris (France)

    2016-02-01

    Cosmic strings formed during inflation are expected to be either diluted over super-Hubble distances, i.e., invisible today, or to have crossed our past light cone very recently. We discuss the latter situation, in which a few strings imprint their signature in the Cosmic Microwave Background (CMB) anisotropies after recombination. Being almost frozen in the Hubble flow, these strings are quasi-static and evade almost all of the previously derived constraints on their tension while still being able to source large-scale anisotropies in the CMB sky. Using a local variance estimator on thousands of numerically simulated Nambu-Goto all-sky maps, we compute the expected signal and show that it can mimic a dipole modulation at large angular scales while being negligible at small angles. Interestingly, such a scenario generically produces one cold spot from the thawing of a cosmic string loop. Mixed with anisotropies of inflationary origin, we find that a few strings of tension GU = O(1) × 10^-6 match the amplitude of the dipole modulation reported in the Planck satellite measurements and could be at the origin of other large-scale anomalies.

  11. Penile Dysmorphic Disorder: Development of a Screening Scale.

    Science.gov (United States)

    Veale, David; Miles, Sarah; Read, Julie; Troglia, Andrea; Carmona, Lina; Fiorito, Chiara; Wells, Hannah; Wylie, Kevan; Muir, Gordon

    2015-11-01

    Penile dysmorphic disorder (PDD) is shorthand for men diagnosed with body dysmorphic disorder, in whom the size or shape of the penis is their main, if not their exclusive, preoccupation causing significant shame or handicap. There are no specific measures for identifying men with PDD compared to men who are anxious about the size of their penis but do not have PDD. Such a measure might be helpful for treatment planning, reducing unrealistic expectations, and measuring outcome after any psychological or physical intervention. Our aim was, therefore, to validate a specific measure, termed the Cosmetic Procedure Screening Scale for PDD (COPS-P). Eighty-one male participants were divided into three groups: a PDD group (n = 21), a small penis anxiety group (n = 37), and a control group (n = 23). All participants completed the COPS-P as well as standardized measures of depression, anxiety, social phobia, body image, quality of life, and erectile function. Penis size was also measured. The final COPS-P was based on nine items. The scale had good internal reliability and significant convergent validity with measures of related constructs. It discriminated between the PDD group, the small penis anxiety group, and the control group. This is the first study to develop a scale able to discriminate between those with PDD and men anxious about their size who did not have PDD. Clinicians and researchers may use the scale as part of an assessment for men presenting with anxiety about penis size and as an audit or outcome measure after any intervention for this population.
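
Internal reliability of the kind reported for the COPS-P is typically quantified with Cronbach's alpha. As a minimal illustration, alpha can be computed from an item-by-respondent score matrix; the responses below are invented example data, not data from the study.

```python
# Illustrative sketch: Cronbach's alpha for a multi-item screening scale.
# The response matrix is invented example data, not COPS-P study data.

def cronbach_alpha(scores):
    """scores: list of respondents, each a list of item scores."""
    n_items = len(scores[0])

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Variance of each item across respondents.
    item_vars = [variance([r[i] for r in scores]) for i in range(n_items)]
    # Variance of the total score per respondent.
    total_var = variance([sum(r) for r in scores])
    return (n_items / (n_items - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical responses: 4 respondents x 3 items.
responses = [
    [2, 3, 2],
    [7, 6, 8],
    [4, 4, 5],
    [1, 2, 1],
]
alpha = cronbach_alpha(responses)
print(round(alpha, 3))  # -> 0.969
```

Values above roughly 0.7 are conventionally read as acceptable internal reliability for a scale of this kind.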

  12. Exploiting multi-scale parallelism for large scale numerical modelling of laser wakefield accelerators

    International Nuclear Information System (INIS)

    Fonseca, R A; Vieira, J; Silva, L O; Fiuza, F; Davidson, A; Tsung, F S; Mori, W B

    2013-01-01

    A new generation of laser wakefield accelerators (LWFA), supported by the extreme accelerating fields generated in the interaction of PW-class lasers and underdense targets, promises the production of high-quality electron beams over short distances for multiple applications. Achieving this goal will rely heavily on numerical modelling to further understand the underlying physics and identify optimal regimes, but large-scale modelling of these scenarios is computationally heavy and requires the efficient use of state-of-the-art petascale supercomputing systems. We discuss the main difficulties involved in running these simulations and the new developments implemented in the OSIRIS framework to address these issues, ranging from multi-dimensional dynamic load balancing and hybrid distributed/shared-memory parallelism to the vectorization of the PIC algorithm. We present the results of the OASCR Joule Metric program on the issue of large-scale modelling of LWFA, demonstrating speedups of over one order of magnitude on the same hardware. Finally, scalability to over ∼10^6 cores and sustained performance over ∼2 PFlops are demonstrated, opening the way for large-scale modelling of LWFA scenarios. (paper)

  13. Balancing modern Power System with large scale of wind power

    DEFF Research Database (Denmark)

    Basit, Abdul; Altin, Müfit; Hansen, Anca Daniela

    2014-01-01

    Power system operators must ensure robust, secure and reliable power system operation even with a large scale integration of wind power. Electricity generated from the intermittent wind in large proportion may impact the control of power system balance and thus deviations in the power system...... frequency in small or islanded power systems or tie-line power flows in interconnected power systems. Therefore, the large scale integration of wind power into the power system strongly concerns secure and stable grid operation. To ensure stable power system operation, the evolving power system has...... to be analysed with improved analytical tools and techniques. This paper proposes techniques for the active power balance control in future power systems with large scale wind power integration, where the power balancing model provides the hour-ahead dispatch plan with reduced planning horizon and the real time...

  14. Immobilized metal-affinity chromatography protein-recovery screening is predictive of crystallographic structure success

    International Nuclear Information System (INIS)

    Choi, Ryan; Kelley, Angela; Leibly, David; Nakazawa Hewitt, Stephen; Napuli, Alberto; Van Voorhis, Wesley

    2011-01-01

    An overview of the methods used for high-throughput cloning and protein-expression screening of SSGCID hexahistidine recombinant proteins is provided. It is demonstrated that screening for recombinant proteins that are highly recoverable from immobilized metal-affinity chromatography improves the likelihood that a protein will produce a structure. The recombinant expression of soluble proteins in Escherichia coli continues to be a major bottleneck in structural genomics. The establishment of reliable protocols for the performance of small-scale expression and solubility testing is an essential component of structural genomic pipelines. The SSGCID Protein Production Group at the University of Washington (UW-PPG) has developed a high-throughput screening (HTS) protocol for the measurement of protein recovery from immobilized metal-affinity chromatography (IMAC) which predicts successful purification of hexahistidine-tagged proteins. The protocol is based on manual transfer of samples using multichannel pipettors and 96-well plates and does not depend on the use of robotic platforms. This protocol has been applied to evaluate the expression and solubility of more than 4000 proteins expressed in E. coli. The UW-PPG also screens large-scale preparations for recovery from IMAC prior to purification. Analysis of these results shows that our low-cost, non-automated approach is a reliable method for the HTS demands typical of large structural genomic projects. This paper provides a detailed description of these protocols and a statistical analysis of the SSGCID screening results. The results demonstrate that screening for proteins that yield high recovery after IMAC, both after small-scale and large-scale expression, improves the selection of proteins that can be successfully purified and will yield a crystal structure.
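
The recovery-based triage described above can be caricatured as a simple thresholding step: clones whose recovered yield after small-scale IMAC clears a cut-off advance to large-scale purification. The clone names, yields and threshold in this sketch are invented assumptions, not values from the published protocol.

```python
# Minimal sketch of triaging expression clones by IMAC recovery.
# Clone names, yields and the 0.5 mg threshold are illustrative only.

def triage_clones(recoveries_mg, threshold_mg=0.5):
    """Split {clone_id: recovered mg after small-scale IMAC} into
    advance / hold lists based on a yield threshold."""
    advance = sorted(c for c, mg in recoveries_mg.items() if mg >= threshold_mg)
    hold = sorted(c for c, mg in recoveries_mg.items() if mg < threshold_mg)
    return advance, hold

screen = {"cloneA": 1.2, "cloneB": 0.1, "cloneC": 0.8, "cloneD": 0.0}
advance, hold = triage_clones(screen)
print(advance)  # clones predicted to purify successfully
```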

  15. Retrospective analysis of cohort database: Phenotypic variability in a large dataset of patients confirmed to have homozygous familial hypercholesterolemia

    NARCIS (Netherlands)

    Raal, Frederick J.; Sjouke, Barbara; Hovingh, G. Kees; Isaac, Barton F.

    2016-01-01

    These data describe the phenotypic variability in a large cohort of patients confirmed to have homozygous familial hypercholesterolemia. Herein, we describe the observed relationship of treated low-density lipoprotein cholesterol with age. We also overlay the low-density lipoprotein receptor gene

  16. Large-Scale Graph Processing Using Apache Giraph

    KAUST Repository

    Sakr, Sherif

    2017-01-07

    This book takes its reader on a journey through Apache Giraph, a popular distributed graph processing platform designed to bring the power of big data processing to graph data. Designed as a step-by-step self-study guide for everyone interested in large-scale graph processing, it describes the fundamental abstractions of the system, its programming models and various techniques for using the system to process graph data at scale, including the implementation of several popular and advanced graph analytics algorithms.

  17. Large-Scale Graph Processing Using Apache Giraph

    KAUST Repository

    Sakr, Sherif; Orakzai, Faisal Moeen; Abdelaziz, Ibrahim; Khayyat, Zuhair

    2017-01-01

    This book takes its reader on a journey through Apache Giraph, a popular distributed graph processing platform designed to bring the power of big data processing to graph data. Designed as a step-by-step self-study guide for everyone interested in large-scale graph processing, it describes the fundamental abstractions of the system, its programming models and various techniques for using the system to process graph data at scale, including the implementation of several popular and advanced graph analytics algorithms.

  18. An interactive display system for large-scale 3D models

    Science.gov (United States)

    Liu, Zijian; Sun, Kun; Tao, Wenbing; Liu, Liman

    2018-04-01

    With the improvement of 3D reconstruction theory and the rapid development of computer hardware technology, reconstructed 3D models are growing in scale and complexity. Models with tens of thousands of 3D points or triangular meshes are common in practical applications. Due to storage and computing power limitations, it is difficult to achieve real-time display of and interaction with large-scale 3D models in some common 3D display software, such as MeshLab. In this paper, we propose a display system for large-scale 3D scene models. We construct the LOD (Levels of Detail) model of the reconstructed 3D scene in advance, and then use an out-of-core, view-dependent multi-resolution rendering scheme to realize real-time display of the large-scale 3D model. With the proposed method, our display system is able to render in real time while roaming in the reconstructed scene, and 3D camera poses can also be displayed. Furthermore, memory consumption can be significantly decreased via an internal and external memory exchange mechanism, so that it is possible to display a large-scale reconstructed scene with millions of 3D points or triangular meshes on a regular PC with only 4GB RAM.
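
View-dependent multi-resolution rendering of the kind described above typically picks, per scene node, the coarsest level of detail whose projected geometric error stays below a pixel tolerance. The sketch below is a generic illustration of that idea; the error model, camera parameters and doubling-per-level assumption are all illustrative, not taken from the paper.

```python
# Sketch of view-dependent LOD selection: a coarser level is acceptable
# once its geometric error, projected to screen pixels at the current
# viewing distance, stays under a pixel tolerance. Parameters are
# illustrative assumptions, not the paper's actual scheme.
import math

def select_lod(node_error_m, distance_m, fov_deg=60.0, screen_px=1080,
               max_px_error=1.0, n_levels=5):
    """Return the LOD level (0 = finest) to render, assuming each coarser
    level doubles the node's geometric error in metres."""
    px_per_m = screen_px / (2.0 * distance_m * math.tan(math.radians(fov_deg) / 2))
    for level in range(n_levels):
        if node_error_m * (2 ** level) * px_per_m > max_px_error:
            return max(level - 1, 0)
    return n_levels - 1

# Close to the camera a fine level is needed; far away, a coarse one suffices.
print(select_lod(0.01, 10.0))   # near -> finest level
print(select_lod(0.01, 100.0))  # far  -> coarser level
```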

  19. Application of unmanned aerial systems for high throughput phenotyping of large wheat breeding nurseries.

    Science.gov (United States)

    Haghighattalab, Atena; González Pérez, Lorena; Mondal, Suchismita; Singh, Daljit; Schinstock, Dale; Rutkoski, Jessica; Ortiz-Monasterio, Ivan; Singh, Ravi Prakash; Goodin, Douglas; Poland, Jesse

    2016-01-01

    Low-cost unmanned aerial systems (UAS) have great potential for rapid proximal measurements of plants in agriculture. In the context of plant breeding and genetics, current approaches for phenotyping a large number of breeding lines under field conditions require substantial investments in time, cost, and labor. For field-based high-throughput phenotyping (HTP), UAS platforms can provide high-resolution measurements for small-plot research while enabling the rapid assessment of tens of thousands of field plots. The objective of this study was to complete a baseline assessment of the utility of UAS in the assessment of field trials as commonly implemented in wheat breeding programs. We developed a semi-automated image-processing pipeline to extract plot-level data from UAS imagery. The image dataset was processed using a photogrammetric pipeline based on image orientation and radiometric calibration to produce orthomosaic images. We also examined the relationships between vegetation indices (VIs) extracted from high-spatial-resolution multispectral imagery collected with two different UAS systems (an eBee Ag carrying a MultiSpec 4C camera, and an IRIS+ quadcopter carrying a modified NIR Canon S100) and ground-truth spectral data from a hand-held spectroradiometer. We found good correlation between the VIs obtained from the UAS platforms and ground-truth measurements, and observed high broad-sense heritability for VIs. We determined that radiometric calibration methods developed for satellite imagery significantly improved the precision of VIs from the UAS. We observed that VIs extracted from calibrated images of the Canon S100 had a significantly higher correlation to the spectroradiometer (r = 0.76) than VIs from the MultiSpec 4C camera (r = 0.64). Their correlation to spectroradiometer readings was as high as or higher than that of repeated measurements with the spectroradiometer itself. The approaches described here for UAS imaging and extraction of proximal sensing data enable collection of HTP
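
As a toy illustration of the plot-level analysis described above, a vegetation index such as NDVI can be computed per plot from NIR and red reflectance and then correlated against ground-truth spectroradiometer readings. All numbers below are invented; the study worked from calibrated orthomosaics, not raw reflectance pairs.

```python
# Toy sketch: per-plot NDVI from NIR/red reflectance, correlated with
# hypothetical ground-truth values. All numbers are invented.

def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red)

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-plot (NIR, red) reflectance and ground-truth NDVI.
uas_ndvi = [ndvi(0.52, 0.08), ndvi(0.44, 0.12), ndvi(0.60, 0.05), ndvi(0.35, 0.20)]
ground = [0.70, 0.55, 0.82, 0.30]
print(pearson_r(uas_ndvi, ground) > 0.95)
```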

  20. Large-scale hydrology in Europe : observed patterns and model performance

    Energy Technology Data Exchange (ETDEWEB)

    Gudmundsson, Lukas

    2011-06-15

    In a changing climate, terrestrial water storages are of great interest as water availability impacts key aspects of ecosystem functioning. Thus, a better understanding of the variations of wet and dry periods will contribute to fully grasping processes of the earth system such as nutrient cycling and vegetation dynamics. Currently, river runoff from small, nearly natural catchments is one of the few variables of the terrestrial water balance that is regularly monitored with detailed spatial and temporal coverage on large scales. River runoff therefore provides a foundation to approach European hydrology with respect to observed patterns on large scales and the ability of models to capture them. The analysis of observed river flow from small catchments focused on the identification and description of spatial patterns of simultaneous temporal variations of runoff. These are dominated by large-scale variations of climatic variables but are also altered by catchment processes. It was shown that time series of annual low, mean and high flows follow the same atmospheric drivers. The observation that high flows are more closely coupled to large-scale atmospheric drivers than low flows indicates the increasing influence of catchment properties on runoff under dry conditions. Further, it was shown that the low-frequency variability of European runoff is dominated by two opposing centres of simultaneous variations, such that dry years in the north are accompanied by wet years in the south. Large-scale hydrological models are simplified representations of our current perception of the terrestrial water balance on large scales. Quantification of the models' strengths and weaknesses is a prerequisite for reliable interpretation of simulation results. Model evaluations may also enable the detection of shortcomings in model assumptions and thus a refinement of the current perception of hydrological systems. The ability of a multi-model ensemble of nine large-scale

  1. Detection of large scale 3' deletions in the PMS2 gene amongst Colon-CFR participants: have we been missing anything?

    Science.gov (United States)

    Clendenning, Mark; Walsh, Michael D; Gelpi, Judith Balmana; Thibodeau, Stephen N; Lindor, Noralane; Potter, John D; Newcomb, Polly; LeMarchand, Loic; Haile, Robert; Gallinger, Steve; Hopper, John L; Jenkins, Mark A; Rosty, Christophe; Young, Joanne P; Buchanan, Daniel D

    2013-09-01

    Current screening practices have been able to identify PMS2 mutations in 78 % of cases of colorectal cancer from the Colorectal Cancer Family Registry (Colon CFR) which showed solitary loss of the PMS2 protein. However the detection of large-scale deletions in the 3' end of the PMS2 gene has not been possible due to technical difficulties associated with pseudogene sequences. Here, we utilised a recently described MLPA/long-range PCR-based approach to screen the remaining 22 % (n = 16) of CRC-affected probands for mutations in the 3' end of the PMS2 gene. No deletions encompassing any or all of exons 12 through 15 were identified; therefore, our results suggest that 3' deletions in PMS2 are not a frequent occurrence in such families.

  2. Large-scale perturbations from the waterfall field in hybrid inflation

    International Nuclear Information System (INIS)

    Fonseca, José; Wands, David; Sasaki, Misao

    2010-01-01

    We estimate large-scale curvature perturbations from isocurvature fluctuations in the waterfall field during hybrid inflation, in addition to the usual inflaton field perturbations. The tachyonic instability at the end of inflation leads to an explosive growth of super-Hubble scale perturbations, but they retain the steep blue spectrum characteristic of vacuum fluctuations in a massive field during inflation. The power spectrum thus peaks around the Hubble-horizon scale at the end of inflation. We extend the usual δN formalism to include the essential role of these small fluctuations when estimating the large-scale curvature perturbation. The resulting curvature perturbation due to fluctuations in the waterfall field is second-order and the spectrum is expected to be of order 10^-54 on cosmological scales.

  3. Decoupling local mechanics from large-scale structure in modular metamaterials

    Science.gov (United States)

    Yang, Nan; Silverberg, Jesse L.

    2017-04-01

    A defining feature of mechanical metamaterials is that their properties are determined by the organization of internal structure instead of the raw fabrication materials. This shift of attention to engineering internal degrees of freedom has coaxed relatively simple materials into exhibiting a wide range of remarkable mechanical properties. For practical applications to be realized, however, this nascent understanding of metamaterial design must be translated into a capacity for engineering large-scale structures with prescribed mechanical functionality. Thus, the challenge is to systematically map desired functionality of large-scale structures backward into a design scheme while using finite parameter domains. Such “inverse design” is often complicated by the deep coupling between large-scale structure and local mechanical function, which limits the available design space. Here, we introduce a design strategy for constructing 1D, 2D, and 3D mechanical metamaterials inspired by modular origami and kirigami. Our approach is to assemble a number of modules into a voxelized large-scale structure, where the module’s design has a greater number of mechanical design parameters than the number of constraints imposed by bulk assembly. This inequality allows each voxel in the bulk structure to be uniquely assigned mechanical properties independent from its ability to connect and deform with its neighbors. In studying specific examples of large-scale metamaterial structures we show that a decoupling of global structure from local mechanical function allows for a variety of mechanically and topologically complex designs.

  4. The origin of large scale cosmic structure

    International Nuclear Information System (INIS)

    Jones, B.J.T.; Palmer, P.L.

    1985-01-01

    The paper concerns the origin of large scale cosmic structure. The evolution of density perturbations, the nonlinear regime (Zel'dovich's solution and others), the Gott and Rees clustering hierarchy, the spectrum of condensations, and biassed galaxy formation, are all discussed. (UK)

  5. A practical process for light-water detritiation at large scales

    Energy Technology Data Exchange (ETDEWEB)

    Boniface, H.A. [Atomic Energy of Canada Limited, Chalk River, ON (Canada); Robinson, J., E-mail: jr@tyne-engineering.com [Tyne Engineering, Burlington, ON (Canada); Gnanapragasam, N.V.; Castillo, I.; Suppiah, S. [Atomic Energy of Canada Limited, Chalk River, ON (Canada)

    2014-07-01

    AECL and Tyne Engineering have recently completed a preliminary engineering design for a modest-scale tritium removal plant for light water, intended for installation at AECL's Chalk River Laboratories (CRL). This plant design was based on the Combined Electrolysis and Catalytic Exchange (CECE) technology developed at CRL over many years and demonstrated there and elsewhere. The general features and capabilities of this design have been reported as well as the versatility of the design for separating any pair of the three hydrogen isotopes. The same CECE technology could be applied directly to very large-scale wastewater detritiation, such as the case at Fukushima Daiichi Nuclear Power Station. However, since the CECE process scales linearly with throughput, the required capital and operating costs are substantial for such large-scale applications. This paper discusses some options for reducing the costs of very large-scale detritiation. Options include: Reducing tritium removal effectiveness; Energy recovery; Improving the tolerance of impurities; Use of less expensive or more efficient equipment. A brief comparison with alternative processes is also presented. (author)

  6. OffshoreDC DC grids for integration of large scale wind power

    DEFF Research Database (Denmark)

    Zeni, Lorenzo; Endegnanew, Atsede Gualu; Stamatiou, Georgios

    The present report summarizes the main findings of the Nordic Energy Research project “DC grids for large scale integration of offshore wind power – OffshoreDC”. The project was funded by Nordic Energy Research through the TFI programme and was active between 2011 and 2016. The overall...... objective of the project was to drive the development of the VSC based HVDC technology for future large scale offshore grids, supporting a standardised and commercial development of the technology, and improving the opportunities for the technology to support power system integration of large scale offshore...

  7. Low-Complexity Transmit Antenna Selection and Beamforming for Large-Scale MIMO Communications

    Directory of Open Access Journals (Sweden)

    Kun Qian

    2014-01-01

    Transmit antenna selection plays an important role in large-scale multiple-input multiple-output (MIMO) communications, but optimal large-scale MIMO antenna selection is a technical challenge. Exhaustive search is often employed in antenna selection, but it cannot be efficiently implemented in large-scale MIMO communication systems due to its prohibitively high computation complexity. This paper proposes a low-complexity interactive multiple-parameter optimization method for joint transmit antenna selection and beamforming in large-scale MIMO communication systems. The objective is to jointly maximize the channel outage capacity and signal-to-noise ratio (SNR) performance and minimize the mean square error in transmit antenna selection and minimum variance distortionless response (MVDR) beamforming without exhaustive search. The effectiveness of all the proposed methods is verified by extensive simulation results. It is shown that the required antenna-selection processing time of the proposed method does not increase with the number of selected antennas, whereas the computation complexity of the conventional exhaustive search method increases significantly when large-scale antennas are employed in the system. This is particularly useful in antenna selection for large-scale MIMO communication systems.
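
The complexity gap motivating the paper can be illustrated with a deliberately simplified sum-gain criterion: exhaustive search enumerates all C(N, k) antenna subsets, while a greedy pick of the k strongest antennas scales as O(N log N). This sketch is not the paper's joint selection/MVDR method; the channel gains are invented.

```python
# Contrast exhaustive transmit-antenna selection with a greedy pick,
# under a simplified sum-gain criterion (not the paper's joint method).
from itertools import combinations

def exhaustive_select(gains, k):
    """Try all C(N, k) subsets; cost explodes as N grows."""
    best = max(combinations(range(len(gains)), k),
               key=lambda s: sum(gains[i] for i in s))
    return sorted(best)

def greedy_select(gains, k):
    """Pick the k strongest antennas: O(N log N), and optimal for
    this additive criterion."""
    return sorted(sorted(range(len(gains)), key=lambda i: -gains[i])[:k])

gains = [0.3, 1.9, 0.7, 1.2, 0.1, 1.5]  # hypothetical channel gains
print(exhaustive_select(gains, 3), greedy_select(gains, 3))
```

For realistic objectives (outage capacity, MSE after beamforming) the criterion is no longer additive, which is why low-complexity joint optimization methods such as the one proposed here are needed.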

  8. Refined Phenotyping of Modic Changes

    Science.gov (United States)

    Määttä, Juhani H.; Karppinen, Jaro; Paananen, Markus; Bow, Cora; Luk, Keith D.K.; Cheung, Kenneth M.C.; Samartzis, Dino

    2016-01-01

    Low back pain (LBP) is the world's most disabling condition. Modic changes (MC) are vertebral bone marrow changes adjacent to the endplates as noted on magnetic resonance imaging. The associations of specific MC types and patterns with prolonged, severe LBP and disability remain speculative. This study assessed the relationship of prolonged, severe LBP and back-related disability, with the presence and morphology of lumbar MC in a large cross-sectional population-based study of Southern Chinese. We addressed the topographical and morphological dimensions of MC along with other magnetic resonance imaging phenotypes (eg, disc degeneration and displacement) on the basis of axial T1 and sagittal T2-weighted imaging of L1-S1. Prolonged severe LBP was defined as LBP lasting ≥30 days during the past year, and a visual analog scale severest pain intensity of at least 6/10. An Oswestry Disability Index score of 15% was regarded as significant disability. We also assessed subject demographics, occupation, and lifestyle factors. In total, 1142 subjects (63% females, mean age 53 years) were assessed. Of these, 282 (24.7%) had MC (7.1% type I, 17.6% type II). MC subjects were older (P = 0.003), had more frequent disc displacements (P disability. The strength of the associations increased with the number of MC. This large-scale study is the first to definitively note MC types and specific morphologies to be independently associated with prolonged severe LBP and back-related disability. This proposed refined MC phenotype may have direct implications in clinical decision-making as to the development and management of LBP. Understanding of these imaging biomarkers can lead to new preventative and personalized therapeutics related to LBP. PMID:27258491

  9. The effective field theory of cosmological large scale structures

    Energy Technology Data Exchange (ETDEWEB)

    Carrasco, John Joseph M. [Stanford Univ., Stanford, CA (United States); Hertzberg, Mark P. [Stanford Univ., Stanford, CA (United States); SLAC National Accelerator Lab., Menlo Park, CA (United States); Senatore, Leonardo [Stanford Univ., Stanford, CA (United States); SLAC National Accelerator Lab., Menlo Park, CA (United States)

    2012-09-20

    Large scale structure surveys will likely become the next leading cosmological probe. In our universe, matter perturbations are large on short distances and small on long scales, i.e. strongly coupled in the UV and weakly coupled in the IR. To make precise analytical predictions on large scales, we develop an effective field theory formulated in terms of an IR effective fluid characterized by several parameters, such as speed of sound and viscosity. These parameters, determined by the UV physics described by the Boltzmann equation, are measured from N-body simulations. We find that the speed of sound of the effective fluid is c_s^2 ≈ 10^-6 c^2 and that the viscosity contributions are of the same order. The fluid describes all the relevant physics at long scales k and permits a manifestly convergent perturbative expansion in the size of the matter perturbations δ(k) for all the observables. As an example, we calculate the correction to the power spectrum at order δ(k)^4. As a result, the predictions of the effective field theory are found to be in much better agreement with observation than standard cosmological perturbation theory, already reaching percent precision at this order up to the relatively short scale k ≃ 0.24 h Mpc^-1.

  10. Temporal flexibility and careers: The role of large-scale organizations for physicians

    OpenAIRE

    Forrest Briscoe

    2006-01-01

    This study investigates how employment in large-scale organizations affects the work lives of practicing physicians. Well-established theory associates larger organizations with bureaucratic constraint, loss of workplace control, and dissatisfaction, but this author finds that large scale is also associated with greater schedule and career flexibility. Ironically, the bureaucratic p...

  11. Algorithmic Mapping and Characterization of the Drug-Induced Phenotypic-Response Space of Parasites Causing Schistosomiasis.

    Science.gov (United States)

    Singh, Rahul; Beasley, Rachel; Long, Thavy; Caffrey, Conor R

    2018-01-01

    Neglected tropical diseases, especially those caused by helminths, constitute some of the most common infections of the world's poorest people. Amongst these, schistosomiasis (bilharzia or 'snail fever'), caused by blood flukes of the genus Schistosoma, ranks second only to malaria in terms of human impact: two hundred million people are infected and close to 800 million are at risk of infection. Drug screening against helminths poses unique challenges: the parasite cannot be cloned and is difficult to target using gene knockouts or RNAi. Consequently, both lead identification and validation involve phenotypic screening, where parasites are exposed to compounds whose effects are determined through the analysis of the ensuing phenotypic responses. The efficacy of leads thus identified derives from one or more, or even unknown, molecular mechanisms of action. The two most immediate and significant challenges that confront the state of the art in this area are the development of automated and quantitative phenotypic screening techniques, and the mapping and quantitative characterization of the totality of phenotypic responses of the parasite. In this paper, we investigate and propose solutions for the latter problem in terms of the following: (1) a mathematical formulation and algorithms that allow rigorous representation of the phenotypic response space of the parasite, (2) application of graph-theoretic and network analysis techniques for quantitative modeling and characterization of the phenotypic space, and (3) application of the aforementioned methodology to analyze the phenotypic space of S. mansoni, one of the etiological agents of schistosomiasis, induced by compounds that target its polo-like kinase 1 (PLK1) gene, a recently validated drug target. In our approach, first, bio-image analysis algorithms are used to quantify the phenotypic responses of different drugs. Next, these responses are linearly mapped into a low-dimensional space using Principal
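
One simple instance of the graph-theoretic characterization mentioned above is to link phenotypic responses whose descriptor distance falls below a threshold and read connected components off as groups of similar responses. The 2-D descriptors and the threshold in this sketch are invented for illustration; the paper's actual formulation is richer.

```python
# Sketch: build a proximity graph over phenotypic response descriptors
# and extract connected components as response groups. Descriptors and
# the distance threshold are invented for illustration.

def components(points, threshold):
    """Connected components of the graph linking points closer than threshold."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    n = len(points)
    seen, comps = set(), []
    for start in range(n):
        if start in seen:
            continue
        stack, comp = [start], []
        while stack:  # depth-first traversal of one component
            i = stack.pop()
            if i in seen:
                continue
            seen.add(i)
            comp.append(i)
            stack.extend(j for j in range(n)
                         if j not in seen and dist(points[i], points[j]) < threshold)
        comps.append(sorted(comp))
    return comps

# Hypothetical 2-D phenotype descriptors (e.g. motility, shape change).
responses = [(0.1, 0.2), (0.15, 0.22), (0.9, 0.8), (0.92, 0.85), (0.5, 0.1)]
print(components(responses, 0.2))  # -> [[0, 1], [2, 3], [4]]
```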

  12. The role of large scale motions on passive scalar transport

    Science.gov (United States)

    Dharmarathne, Suranga; Araya, Guillermo; Tutkun, Murat; Leonardi, Stefano; Castillo, Luciano

    2014-11-01

    We study direct numerical simulation (DNS) of turbulent channel flow at Reτ = 394 to investigate the effect of large-scale motions on the fluctuating temperature field, which forms a passive scalar field. A statistical description of the large-scale features of the turbulent channel flow is obtained using two-point correlations of the velocity components. Two-point correlations of the fluctuating temperature field are also examined in order to identify possible similarities between the velocity and temperature fields. The two-point cross-correlations between the velocity and temperature fluctuations are further analyzed to establish connections between these two fields. In addition, we use proper orthogonal decomposition (POD) to extract the most dominant modes of the fields and discuss the coupling of the large-scale features of turbulence and the temperature field.
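    The two-point correlations used in such analyses can be sketched for a 1-D signal on a periodic domain (an illustrative toy, not the DNS data):

```python
import numpy as np

# Normalized two-point correlation of a 1-D fluctuating signal on a
# periodic domain: R(r) = <u'(x) u'(x + r)> / <u'^2>.
def two_point_corr(u, max_sep):
    up = u - u.mean()
    var = np.mean(up * up)
    return np.array([np.mean(up * np.roll(up, -r)) / var
                     for r in range(max_sep)])

# Two full periods of a sine wave stand in for a fluctuating field.
u = np.sin(np.linspace(0, 4 * np.pi, 256, endpoint=False))
R = two_point_corr(u, 64)
print(round(float(R[0]), 3))   # 1.0 at zero separation by construction
```

    A cross-correlation between velocity and temperature is the same computation with two different signals in place of `up` twice.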

  13. Signatures of non-universal large scales in conditional structure functions from various turbulent flows

    International Nuclear Information System (INIS)

    Blum, Daniel B; Voth, Greg A; Bewley, Gregory P; Bodenschatz, Eberhard; Gibert, Mathieu; Xu Haitao; Gylfason, Ármann; Mydlarski, Laurent; Yeung, P K

    2011-01-01

    We present a systematic comparison of conditional structure functions in nine turbulent flows. The flows studied include forced isotropic turbulence simulated on a periodic domain, passive grid wind tunnel turbulence in air and in pressurized SF6, active grid wind tunnel turbulence (in both synchronous and random driving modes), the flow between counter-rotating discs, oscillating grid turbulence and the flow in the Lagrangian exploration module (in both constant and random driving modes). We compare longitudinal Eulerian second-order structure functions conditioned on the instantaneous large-scale velocity in each flow to assess the ways in which the large scales affect the small scales in a variety of turbulent flows. Structure functions are shown to have larger values when the large-scale velocity significantly deviates from the mean in most flows, suggesting that dependence on the large scales is typical in many turbulent flows. The effects of the large-scale velocity on the structure functions can be quite strong, with the structure function varying by up to a factor of 2 when the large-scale velocity deviates from the mean by ±2 standard deviations. In several flows, the effects of the large-scale velocity are similar at all the length scales we measured, indicating that the large-scale effects are scale independent. In a few flows, the effects of the large-scale velocity are larger on the smallest length scales. (paper)
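    Conditioning a second-order structure function on the instantaneous large-scale velocity can be sketched as follows. The signal is synthetic 1-D noise, and the "large-scale velocity" is approximated by a moving average; both are illustrative assumptions, not the experimental procedure of the paper.

```python
import numpy as np

# D2(r) = <(u(x + r) - u(x))^2>, restricted to points selected by a
# condition on the large-scale velocity U.
def conditional_d2(u, r, cond_mask):
    du = np.roll(u, -r) - u            # velocity increment at separation r
    return float(np.mean(du[cond_mask] ** 2))

rng = np.random.default_rng(1)
u = rng.normal(size=4096)                            # synthetic signal
U = np.convolve(u, np.ones(64) / 64, mode='same')    # large-scale proxy

r = 4
high = U > U.std()                     # large-scale velocity well above mean
d2_high = conditional_d2(u, r, high)
d2_all = conditional_d2(u, r, np.ones_like(u, dtype=bool))
print(d2_high > 0, d2_all > 0)         # True True
```

    Comparing `d2_high` against `d2_all` over a range of separations is the kind of diagnostic the paper applies across its nine flows.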

  14. Cytology of DNA Replication Reveals Dynamic Plasticity of Large-Scale Chromatin Fibers.

    Science.gov (United States)

    Deng, Xiang; Zhironkina, Oxana A; Cherepanynets, Varvara D; Strelkova, Olga S; Kireev, Igor I; Belmont, Andrew S

    2016-09-26

    In higher eukaryotic interphase nuclei, the 100- to >1,000-fold linear compaction of chromatin is difficult to reconcile with its function as a template for transcription, replication, and repair. It is challenging to imagine how DNA and RNA polymerases with their associated molecular machinery would move along the DNA template without transient decondensation of observed large-scale chromatin "chromonema" fibers [1]. Transcription or "replication factory" models [2], in which polymerases remain fixed while DNA is reeled through, are similarly difficult to conceptualize without transient decondensation of these chromonema fibers. Here, we show how a dynamic plasticity of chromatin folding within large-scale chromatin fibers allows DNA replication to take place without significant changes in the global large-scale chromatin compaction or shape of these large-scale chromatin fibers. Time-lapse imaging of lac-operator-tagged chromosome regions shows no major change in the overall compaction of these chromosome regions during their DNA replication. Improved pulse-chase labeling of endogenous interphase chromosomes yields a model in which the global compaction and shape of large-Mbp chromatin domains remains largely invariant during DNA replication, with DNA within these domains undergoing significant movements and redistribution as they move into and then out of adjacent replication foci. In contrast to hierarchical folding models, this dynamic plasticity of large-scale chromatin organization explains how localized changes in DNA topology allow DNA replication to take place without an accompanying global unfolding of large-scale chromatin fibers while suggesting a possible mechanism for maintaining epigenetic programming of large-scale chromatin domains throughout DNA replication. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Evaluation of drought propagation in an ensemble mean of large-scale hydrological models

    NARCIS (Netherlands)

    Loon, van A.F.; Huijgevoort, van M.H.J.; Lanen, van H.A.J.

    2012-01-01

    Hydrological drought is increasingly studied using large-scale models. It is, however, not certain whether large-scale models reproduce the development of hydrological drought correctly. The pressing question is how well do large-scale models simulate the propagation from meteorological to hydrological

  16. Phenotypic and molecular characterizations of Yersinia pestis isolates from Kazakhstan and adjacent regions.

    Science.gov (United States)

    Lowell, Jennifer L; Zhansarina, Aigul; Yockey, Brook; Meka-Mechenko, Tatyana; Stybayeva, Gulnaz; Atshabar, Bakyt; Nekrassova, Larissa; Tashmetov, Rinat; Kenghebaeva, Kuralai; Chu, May C; Kosoy, Michael; Antolin, Michael F; Gage, Kenneth L

    2007-01-01

    Recent interest in characterizing infectious agents associated with bioterrorism has resulted in the development of effective pathogen genotyping systems, but this information is rarely combined with phenotypic data. Yersinia pestis, the aetiological agent of plague, has been well defined genotypically on local and worldwide scales using multi-locus variable number tandem repeat analysis (MLVA), with emphasis on evolutionary patterns using old isolate collections from countries where Y. pestis has existed the longest. Worldwide MLVA studies are largely based on isolates that have been in long-term laboratory culture and storage, or on field material from parts of the world where Y. pestis has potentially circulated in nature for thousands of years. Diversity in these isolates suggests that they may no longer represent the wild-type organism phenotypically, including the possibility of altered pathogenicity. This study focused on the phenotypic and genotypic properties of 48 Y. pestis isolates collected from 10 plague foci in and bordering Kazakhstan. Phenotypic characterization was based on diagnostic tests typically performed in reference laboratories working with Y. pestis. MLVA was used to define the genotypic relationships between the central-Asian isolates and a group of North American isolates, and to examine Kazakh Y. pestis diversity according to predefined plague foci and on an intermediate geographical scale. Phenotypic properties revealed that a large portion of this collection lacks one or more plasmids necessary to complete the blocked flea/mammal transmission cycle, has lost Congo red binding capabilities (Pgm-), or both. MLVA analysis classified isolates into previously identified biovars, and in some cases groups of isolates collected within the same plague focus formed a clade. Overall, MLVA did not distinguish unique phylogeographical groups of Y. pestis isolates as defined by plague foci and indicated higher genetic diversity among older biovars.

  17. Configuration management in large scale infrastructure development

    NARCIS (Netherlands)

    Rijn, T.P.J. van; Belt, H. van de; Los, R.H.

    2000-01-01

    Large Scale Infrastructure (LSI) development projects, such as the construction of roads, railways and other civil engineering (water)works, are tendered differently today than a decade ago. Traditional workflow requested quotes from construction companies for construction works where the works to be

  18. Dual Decomposition for Large-Scale Power Balancing

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus; Jørgensen, John Bagterp; Vandenberghe, Lieven

    2013-01-01

    Dual decomposition is applied to power balancing of flexible thermal storage units. The centralized large-scale problem is decomposed into smaller subproblems and solved locally by each unit in the Smart Grid. Convergence is achieved by coordinating the units' consumption through a negotiation...
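    The price-coordination scheme behind dual decomposition can be illustrated with a toy subgradient iteration. The quadratic unit costs and step size below are made up for illustration; this is a sketch of the general technique, not the paper's formulation.

```python
import numpy as np

# N units with hypothetical quadratic local costs 0.5 * a_i * p_i^2 must
# jointly meet a target demand. A coordinator broadcasts a price; each
# unit optimizes independently, which gives p_i = price / a_i.
a = np.array([1.0, 2.0, 4.0])   # made-up unit cost coefficients
demand = 9.0

price = 0.0
for _ in range(200):
    p = price / a                          # local responses, in parallel
    price += 0.2 * (demand - p.sum())      # subgradient step on the balance
print(round(float(p.sum()), 3))            # 9.0 -- supply matches demand
```

    The coordinator never sees the units' cost functions, only their aggregate response, which is what makes the decomposition attractive for large-scale balancing.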

  19. Generation of large-scale vortices in compressible helical turbulence

    International Nuclear Information System (INIS)

    Chkhetiani, O.G.; Gvaramadze, V.V.

    1989-01-01

    We consider the generation of large-scale vortices in a compressible self-gravitating turbulent medium. The closed equation describing the evolution of large-scale vortices in helical turbulence with finite correlation time is obtained. This equation has a form similar to the hydromagnetic dynamo equation, which allows us to call the vortex generation effect the vortex dynamo. It is possible that principally the same mechanism is responsible both for the amplification and maintenance of density waves and magnetic fields in the gaseous disks of spiral galaxies. (author). 29 refs

  20. Dipolar modulation of Large-Scale Structure

    Science.gov (United States)

    Yoon, Mijin

    For the last two decades, we have seen a drastic development of modern cosmology based on various observations such as the cosmic microwave background (CMB), type Ia supernovae, and baryonic acoustic oscillations (BAO). This observational evidence has led us to a broad consensus on the so-called LambdaCDM cosmological model and tight constraints on the cosmological parameters constituting it. On the other hand, the advancement of cosmology relies on the cosmological principle: the universe is isotropic and homogeneous on large scales. Testing these fundamental assumptions is crucial and will soon become possible given the planned observations ahead. Dipolar modulation is the largest angular anisotropy of the sky, quantified by its direction and amplitude. We have measured a large dipolar modulation in the CMB, which mainly originates from our solar system's motion relative to the CMB rest frame. However, we have not yet acquired consistent measurements of dipolar modulations in large-scale structure (LSS), as they require large sky coverage and a large number of well-identified objects. In this thesis, we explore the measurement of dipolar modulation in number counts of LSS objects as a test of statistical isotropy. This thesis is based on two papers that were published in peer-reviewed journals. In Chapter 2 [Yoon et al., 2014], we measured a dipolar modulation in number counts of WISE sources matched with 2MASS. In Chapter 3 [Yoon & Huterer, 2015], we investigated requirements for detection of the kinematic dipole in future surveys.

  1. Impact of large-scale tides on cosmological distortions via redshift-space power spectrum

    Science.gov (United States)

    Akitsu, Kazuyuki; Takada, Masahiro

    2018-03-01

    Although large-scale perturbations beyond a finite-volume survey region are not direct observables, these affect measurements of clustering statistics of small-scale (subsurvey) perturbations in large-scale structure, compared with the ensemble average, via the mode-coupling effect. In this paper we show that a large-scale tide induced by scalar perturbations causes apparent anisotropic distortions in the redshift-space power spectrum of galaxies in a way depending on an alignment between the tide, wave vector of small-scale modes and line-of-sight direction. Using the perturbation theory of structure formation, we derive a response function of the redshift-space power spectrum to large-scale tide. We then investigate the impact of large-scale tide on estimation of cosmological distances and the redshift-space distortion parameter via the measured redshift-space power spectrum for a hypothetical large-volume survey, based on the Fisher matrix formalism. To do this, we treat the large-scale tide as a signal, rather than an additional source of the statistical errors, and show that a degradation in the parameter is restored if we can employ the prior on the rms amplitude expected for the standard cold dark matter (CDM) model. We also discuss whether the large-scale tide can be constrained at an accuracy better than the CDM prediction, if the effects up to a larger wave number in the nonlinear regime can be included.

  2. Large-scale Intelligent Transportation Systems simulation

    Energy Technology Data Exchange (ETDEWEB)

    Ewing, T.; Canfield, T.; Hannebutte, U.; Levine, D.; Tentner, A.

    1995-06-01

    A prototype computer system has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS) capable of running on massively parallel computers and distributed (networked) computer systems. The prototype includes the modelling of instrumented "smart" vehicles with in-vehicle navigation units capable of optimal route planning and Traffic Management Centers (TMC). The TMC has probe vehicle tracking capabilities (display position and attributes of instrumented vehicles), and can provide 2-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on ANL's IBM SP-X parallel computer system for large-scale problems. A novel feature of our design is that vehicles will be represented by autonomous computer processes, each with a behavior model which performs independent route selection and reacts to external traffic events much like real vehicles. With this approach, one will be able to take advantage of emerging massively parallel processor (MPP) systems.

  3. The Hamburg large scale geostrophic ocean general circulation model. Cycle 1

    International Nuclear Information System (INIS)

    Maier-Reimer, E.; Mikolajewicz, U.

    1992-02-01

    The rationale for the Large Scale Geostrophic ocean circulation model (LSG-OGCM) is based on the observations that for a large scale ocean circulation model designed for climate studies, the relevant characteristic spatial scales are large compared with the internal Rossby radius throughout most of the ocean, while the characteristic time scales are large compared with the periods of gravity modes and barotropic Rossby wave modes. In the present version of the model, the fast modes have been filtered out by a conventional technique of integrating the full primitive equations, including all terms except the nonlinear advection of momentum, by an implicit time integration method. The free surface is also treated prognostically, without invoking a rigid lid approximation. The numerical scheme is unconditionally stable and has the additional advantage that it can be applied uniformly to the entire globe, including the equatorial and coastal current regions. (orig.)

  4. Soft X-ray Emission from Large-Scale Galactic Outflows in Seyfert Galaxies

    Science.gov (United States)

    Colbert, E. J. M.; Baum, S.; O'Dea, C.; Veilleux, S.

    1998-01-01

    Kiloparsec-scale soft X-ray nebulae extend along the galaxy minor axes in several Seyfert galaxies, including NGC 2992, NGC 4388 and NGC 5506. In these three galaxies, the extended X-ray emission observed in ROSAT HRI images has 0.2-2.4 keV X-ray luminosities of 0.4-3.5 x 10^40 erg s^-1. The X-ray nebulae are roughly co-spatial with the large-scale radio emission, suggesting that both are produced by large-scale galactic outflows. Assuming pressure balance between the radio and X-ray plasmas, the X-ray filling factor is >~ 10^4 times as large as the radio plasma filling factor, suggesting that large-scale outflows in Seyfert galaxies are predominantly winds of thermal X-ray emitting gas. We favor an interpretation in which large-scale outflows originate as AGN-driven jets that entrain and heat gas on kpc scales as they make their way out of the galaxy. AGN- and starburst-driven winds are also possible explanations if the winds are oriented along the rotation axis of the galaxy disk. Since large-scale outflows are present in at least 50 percent of Seyfert galaxies, the soft X-ray emission from the outflowing gas may, in many cases, explain the "soft excess" X-ray feature observed below 2 keV in X-ray spectra of many Seyfert 2 galaxies.

  5. Pro website development and operations streamlining DevOps for large-scale websites

    CERN Document Server

    Sacks, Matthew

    2012-01-01

    Pro Website Development and Operations gives you the experience you need to create and operate a large-scale production website. Large-scale websites have their own unique set of problems regarding their design: problems that can get worse when agile methodologies are adopted for rapid results. Managing large-scale websites, deploying applications, and ensuring they are performing well often requires a full-scale team involving the development and operations sides of the company, two departments that don't always see eye to eye. When departments struggle with each other, it adds unnecessary comp

  6. Neutrinos and large-scale structure

    International Nuclear Information System (INIS)

    Eisenstein, Daniel J.

    2015-01-01

    I review the use of cosmological large-scale structure to measure properties of neutrinos and other relic populations of light relativistic particles. With experiments to measure the anisotropies of the cosmic microwave background and the clustering of matter at low redshift, we now have securely measured a relativistic background with density appropriate to the cosmic neutrino background. Our limits on the mass of the neutrino continue to shrink. Experiments coming in the next decade will greatly improve the available precision on searches for the energy density of novel relativistic backgrounds and the mass of neutrinos

  7. Neutrinos and large-scale structure

    Energy Technology Data Exchange (ETDEWEB)

    Eisenstein, Daniel J. [Daniel J. Eisenstein, Harvard-Smithsonian Center for Astrophysics, 60 Garden St., MS #20, Cambridge, MA 02138 (United States)

    2015-07-15

    I review the use of cosmological large-scale structure to measure properties of neutrinos and other relic populations of light relativistic particles. With experiments to measure the anisotropies of the cosmic microwave background and the clustering of matter at low redshift, we now have securely measured a relativistic background with density appropriate to the cosmic neutrino background. Our limits on the mass of the neutrino continue to shrink. Experiments coming in the next decade will greatly improve the available precision on searches for the energy density of novel relativistic backgrounds and the mass of neutrinos.

  8. Evaluation of Large-scale Public Sector Reforms

    DEFF Research Database (Denmark)

    Breidahl, Karen Nielsen; Gjelstrup, Gunnar; Hansen, Hanne Foss

    2017-01-01

    and more delimited policy areas take place. In our analysis we apply four governance perspectives (rational-instrumental, rational-interest based, institutional-cultural and a chaos perspective) in a comparative analysis of the evaluations of two large-scale public sector reforms in Denmark and Norway. We...

  9. Noise Reduction in High-Throughput Gene Perturbation Screens

    Science.gov (United States)

    Motivation: Accurate interpretation of perturbation screens is essential for a successful functional investigation. However, the screened phenotypes are often distorted by noise, and their analysis requires specialized statistical analysis tools. The number and scope of statistical methods available...

  10. Highly Scalable Trip Grouping for Large Scale Collective Transportation Systems

    DEFF Research Database (Denmark)

    Gidofalvi, Gyozo; Pedersen, Torben Bach; Risch, Tore

    2008-01-01

    Transportation-related problems, like road congestion, parking, and pollution, are increasing in most cities. In order to reduce traffic, recent work has proposed methods for vehicle sharing, for example for sharing cabs by grouping "closeby" cab requests and thus minimizing transportation cost...... and utilizing cab space. However, the methods published so far do not scale to large data volumes, which is necessary to facilitate large-scale collective transportation systems, e.g., ride-sharing systems for large cities. This paper presents highly scalable trip grouping algorithms, which generalize previous...
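    The idea of grouping "close-by" cab requests can be sketched with a minimal greedy pass over sorted pickup locations. The 1-D coordinates, capacity and gap parameters are toy assumptions for illustration, not the paper's scalable algorithms.

```python
# Requests are grouped greedily by pickup location (toy 1-D coordinates):
# a group is closed when the next request is too far from the previous
# one or the cab is already full.
def group_trips(requests, capacity, max_gap):
    groups, current = [], []
    for loc in sorted(requests):
        if current and (loc - current[-1] > max_gap or len(current) == capacity):
            groups.append(current)
            current = []
        current.append(loc)
    if current:
        groups.append(current)
    return groups

print(group_trips([1, 2, 9, 10, 11, 30], capacity=3, max_gap=3))
# [[1, 2], [9, 10, 11], [30]]
```

    Real ride-sharing requests also carry destinations and time windows; scaling this grouping to large data volumes is exactly the problem the paper addresses.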

  11. Penalized Estimation in Large-Scale Generalized Linear Array Models

    DEFF Research Database (Denmark)

    Lund, Adam; Vincent, Martin; Hansen, Niels Richard

    2017-01-01

    Large-scale generalized linear array models (GLAMs) can be challenging to fit. Computation and storage of its tensor product design matrix can be impossible due to time and memory constraints, and previously considered design matrix free algorithms do not scale well with the dimension...
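    The reason the tensor-product design matrix of a GLAM need never be stored is the standard Kronecker identity (A2 ⊗ A1) vec(X) = vec(A1 X A2^T). A minimal sketch with random matrices (the dimensions are arbitrary; this illustrates the identity, not the paper's estimation algorithm):

```python
import numpy as np

# Tensor-product design matrices never need to be formed explicitly:
# (A2 kron A1) @ vec(X) equals vec(A1 @ X @ A2.T) with column-major vec.
rng = np.random.default_rng(0)
A1 = rng.normal(size=(6, 4))   # marginal design matrix, dimension 1
A2 = rng.normal(size=(5, 3))   # marginal design matrix, dimension 2
X = rng.normal(size=(4, 3))    # coefficient array

y_array = A1 @ X @ A2.T        # array algorithm: small matrices only

# Naive check, infeasible at large scale but fine for a toy problem.
y_naive = (np.kron(A2, A1) @ X.flatten('F')).reshape(6, 5, order='F')
print(np.allclose(y_array, y_naive))   # True
```

    The array form costs memory proportional to the marginal matrices rather than their Kronecker product, which is what makes large-scale fitting feasible.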

  12. Large-scale coastal impact induced by a catastrophic storm

    DEFF Research Database (Denmark)

    Fruergaard, Mikkel; Andersen, Thorbjørn Joest; Johannessen, Peter N

    breaching. Our results demonstrate that violent, millennial-scale storms can trigger significant large-scale and long-term changes on barrier coasts, and that coastal changes assumed to take place over centuries or even millennia may occur in association with a single extreme storm event....

  13. Large-eddy simulation with accurate implicit subgrid-scale diffusion

    NARCIS (Netherlands)

    B. Koren (Barry); C. Beets

    1996-01-01

    textabstractA method for large-eddy simulation is presented that does not use an explicit subgrid-scale diffusion term. Subgrid-scale effects are modelled implicitly through an appropriate monotone (in the sense of Spekreijse 1987) discretization method for the advective terms. Special attention is

  14. Challenges for Large Scale Structure Theory

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    I will describe some of the outstanding questions in Cosmology where answers could be provided by observations of the Large Scale Structure of the Universe at late times. I will discuss some of the theoretical challenges which will have to be overcome to extract this information from the observations. I will describe some of the theoretical tools that might be useful to achieve this goal.

  15. Macroecological factors explain large-scale spatial population patterns of ancient agriculturalists

    NARCIS (Netherlands)

    Xu, C.; Chen, B.; Abades, S.; Reino, L.; Teng, S.; Ljungqvist, F.C.; Huang, Z.Y.X.; Liu, X.

    2015-01-01

    Aim: It has been well demonstrated that the large-scale distribution patterns of numerous species are driven by similar macroecological factors. However, understanding of this topic remains limited when applied to our own species. Here we take a large-scale look at ancient agriculturalist

  16. Large Scale Investments in Infrastructure : Competing Policy regimes to Control Connections

    NARCIS (Netherlands)

    Otsuki, K.; Read, M.L.; Zoomers, E.B.

    2016-01-01

    This paper proposes to analyse implications of large-scale investments in physical infrastructure for social and environmental justice. While case studies on the global land rush and climate change have advanced our understanding of how large-scale investments in land, forests and water affect

  17. Rotation invariant fast features for large-scale recognition

    Science.gov (United States)

    Takacs, Gabriel; Chandrasekhar, Vijay; Tsai, Sam; Chen, David; Grzeszczuk, Radek; Girod, Bernd

    2012-10-01

    We present an end-to-end feature description pipeline which uses a novel interest point detector and Rotation-Invariant Fast Feature (RIFF) descriptors. The proposed RIFF algorithm is 15× faster than SURF while producing large-scale retrieval results that are comparable to SIFT. Such high-speed features benefit a range of applications from Mobile Augmented Reality (MAR) to web-scale image retrieval and analysis.

  18. Rapid Cellular Phenotyping of Human Pluripotent Stem Cell-Derived Cardiomyocytes using a Genetically Encoded Fluorescent Voltage Sensor

    Directory of Open Access Journals (Sweden)

    Jordan S. Leyton-Mange

    2014-02-01

    Full Text Available In addition to their promise in regenerative medicine, pluripotent stem cells have proved to be faithful models of many human diseases. In particular, patient-specific stem cell-derived cardiomyocytes recapitulate key features of several life-threatening cardiac arrhythmia syndromes. For both modeling and regenerative approaches, phenotyping of stem cell-derived tissues is critical. Cellular phenotyping has largely relied upon expression of lineage markers rather than physiologic attributes. This is especially true for cardiomyocytes, in part because electrophysiological recordings are labor intensive. Likewise, most optical voltage indicators suffer from phototoxicity, which damages cells and degrades signal quality. Here we present the use of a genetically encoded fluorescent voltage indicator, ArcLight, which we demonstrate can faithfully report transmembrane potentials in human stem cell-derived cardiomyocytes. We demonstrate the application of this fluorescent sensor in high-throughput, serial phenotyping of differentiating cardiomyocyte populations and in screening for drug-induced cardiotoxicity.

  19. Large-scale bioenergy production: how to resolve sustainability trade-offs?

    Science.gov (United States)

    Humpenöder, Florian; Popp, Alexander; Bodirsky, Benjamin Leon; Weindl, Isabelle; Biewald, Anne; Lotze-Campen, Hermann; Dietrich, Jan Philipp; Klein, David; Kreidenweis, Ulrich; Müller, Christoph; Rolinski, Susanne; Stevanovic, Miodrag

    2018-02-01

    Large-scale 2nd generation bioenergy deployment is a key element of 1.5 °C and 2 °C transformation pathways. However, large-scale bioenergy production might have negative sustainability implications and thus may conflict with the Sustainable Development Goal (SDG) agenda. Here, we carry out a multi-criteria sustainability assessment of large-scale bioenergy crop production throughout the 21st century (300 EJ in 2100) using a global land-use model. Our analysis indicates that large-scale bioenergy production without complementary measures results in negative effects on the following sustainability indicators: deforestation, CO2 emissions from land-use change, nitrogen losses, unsustainable water withdrawals and food prices. One of our main findings is that single-sector environmental protection measures next to large-scale bioenergy production are prone to involve trade-offs among these sustainability indicators—at least in the absence of more efficient land or water resource use. For instance, if bioenergy production is accompanied by forest protection, deforestation and associated emissions (SDGs 13 and 15) decline substantially whereas food prices (SDG 2) increase. However, our study also shows that this trade-off strongly depends on the development of future food demand. In contrast to environmental protection measures, we find that agricultural intensification lowers some side-effects of bioenergy production substantially (SDGs 13 and 15) without generating new trade-offs—at least among the sustainability indicators considered here. Moreover, our results indicate that a combination of forest and water protection schemes, improved fertilization efficiency, and agricultural intensification would reduce the side-effects of bioenergy production most comprehensively. However, although our study includes more sustainability indicators than previous studies on bioenergy side-effects, our study represents only a small subset of all indicators relevant for the

  20. Large-scale structure in the universe: Theory vs observations

    International Nuclear Information System (INIS)

    Kashlinsky, A.; Jones, B.J.T.

    1990-01-01

    A variety of observations constrain models of the origin of large scale cosmic structures. We review here the elements of current theories and comment in detail on which of the current observational data provide the principal constraints. We point out that enough observational data have accumulated to constrain (and perhaps determine) the power spectrum of primordial density fluctuations over a very large range of scales. We discuss the theories in the light of observational data and focus on the potential of future observations in providing even (and ever) tighter constraints. (orig.)

  1. Evaluation of drought propagation in an ensemble mean of large-scale hydrological models

    Directory of Open Access Journals (Sweden)

    A. F. Van Loon

    2012-11-01

    Full Text Available Hydrological drought is increasingly studied using large-scale models. It is, however, not certain whether large-scale models reproduce the development of hydrological drought correctly. The pressing question is how well do large-scale models simulate the propagation from meteorological to hydrological drought? To answer this question, we evaluated the simulation of drought propagation in an ensemble mean of ten large-scale models, both land-surface models and global hydrological models, that participated in the model intercomparison project of WATCH (WaterMIP). For a selection of case study areas, we studied drought characteristics (number of droughts, duration, severity), drought propagation features (pooling, attenuation, lag, lengthening), and hydrological drought typology (classical rainfall deficit drought, rain-to-snow-season drought, wet-to-dry-season drought, cold snow season drought, warm snow season drought, composite drought).

    Drought characteristics simulated by large-scale models clearly reflected drought propagation; i.e. drought events became fewer and longer when moving through the hydrological cycle. However, more differentiation was expected between fast and slowly responding systems, with slowly responding systems having fewer and longer droughts in runoff than fast responding systems. This was not found using large-scale models. Drought propagation features were poorly reproduced by the large-scale models, because runoff reacted immediately to precipitation, in all case study areas. This fast reaction to precipitation, even in cold climates in winter and in semi-arid climates in summer, also greatly influenced the hydrological drought typology as identified by the large-scale models. In general, the large-scale models had the correct representation of drought types, but the percentages of occurrence had some important mismatches, e.g. an overestimation of classical rainfall deficit droughts, and an
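    Extracting drought characteristics such as the number and duration of events can be sketched with a simple threshold method. The series and fixed threshold below are synthetic illustrations; studies like this one typically derive (possibly time-varying) thresholds from flow or precipitation percentiles.

```python
# A drought event is a maximal run of time steps with the variable below
# a threshold; events are returned as (start, end) index pairs.
def drought_events(series, thr):
    events, start = [], None
    for t, v in enumerate(series):
        if v < thr and start is None:
            start = t
        elif v >= thr and start is not None:
            events.append((start, t - 1))
            start = None
    if start is not None:
        events.append((start, len(series) - 1))
    return events

precip = [5, 1, 0, 0, 6, 7, 2, 0, 8, 9]   # synthetic forcing
runoff = [5, 4, 2, 1, 1, 3, 3, 1, 1, 4]   # synthetic lagged response
print(drought_events(precip, thr=2))   # [(1, 3), (7, 7)]
print(drought_events(runoff, thr=2))   # [(3, 4), (7, 8)]
```

    Comparing the event lists of forcing and response series is the basis for diagnosing propagation features such as lag and pooling.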

  2. High content screening in microfluidic devices

    Science.gov (United States)

    Cheong, Raymond; Paliwal, Saurabh; Levchenko, Andre

    2011-01-01

    Importance of the field: Miniaturization is key to advancing the state-of-the-art in high content screening (HCS), in order to enable dramatic cost savings through reduced usage of expensive biochemical reagents and to enable large-scale screening on primary cells. Microfluidic technology offers the potential to enable HCS to be performed with an unprecedented degree of miniaturization. Areas covered in this review: This perspective highlights a real-world example from the authors’ work of HCS assays implemented in a highly miniaturized microfluidic format. Advantages of this technology are discussed, including cost savings, high throughput screening on primary cells, improved accuracy, the ability to study complex time-varying stimuli, and ease of automation, integration, and scaling. What the reader will gain: The reader will understand the capabilities of a new microfluidics-based platform for HCS, and the advantages it provides over conventional plate-based HCS. Take home message: Microfluidics technology will drive significant advancements and broader usage and applicability of HCS in drug discovery. PMID:21852997

  3. Multiresolution comparison of precipitation datasets for large-scale models

    Science.gov (United States)

    Chun, K. P.; Sapriza Azuri, G.; Davison, B.; DeBeer, C. M.; Wheater, H. S.

    2014-12-01

    Gridded precipitation datasets are crucial for driving large-scale models used in weather forecasting and climate research. However, the quality of precipitation products is usually validated individually. Comparing gridded precipitation products against ground observations provides another avenue for investigating how precipitation uncertainty affects the performance of large-scale models. In this study, using data from a set of precipitation gauges over British Columbia and Alberta, we evaluate several widely used North American gridded products, including the Canadian Gridded Precipitation Anomalies (CANGRD), the National Center for Environmental Prediction (NCEP) reanalysis, the Water and Global Change (WATCH) project, the thin plate spline smoothing algorithms (ANUSPLIN) and the Canadian Precipitation Analysis (CaPA). Based on verification criteria for various temporal and spatial scales, the results provide an assessment of possible applications for the various precipitation datasets. For long-term climate variation studies (~100 years), CANGRD, NCEP, WATCH and ANUSPLIN have different comparative advantages in terms of their resolution and accuracy. For synoptic and mesoscale precipitation patterns, CaPA provides appealing spatial coherence. In addition to the product comparison, various downscaling methods are also surveyed to explore new verification and bias-reduction methods for improving gridded precipitation outputs for large-scale models.
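    The verification criteria mentioned above reduce, at their simplest, to station-wise skill scores. A minimal sketch of the kind of bias/RMSE/correlation comparison involved (the gauge and grid values are invented for illustration; none of the datasets named in the record are used):

```python
# Toy verification of a gridded precipitation product against gauges:
# bias, RMSE, and Pearson correlation -- the kind of criteria used to
# rank products at a given temporal scale. Data here are illustrative.
import math

def verify(product, gauges):
    """Return (bias, rmse, correlation) of a product series vs. a gauge series."""
    n = len(gauges)
    bias = sum(p - g for p, g in zip(product, gauges)) / n
    rmse = math.sqrt(sum((p - g) ** 2 for p, g in zip(product, gauges)) / n)
    mp = sum(product) / n
    mg = sum(gauges) / n
    cov = sum((p - mp) * (g - mg) for p, g in zip(product, gauges))
    sp = math.sqrt(sum((p - mp) ** 2 for p in product))
    sg = math.sqrt(sum((g - mg) ** 2 for g in gauges))
    return bias, rmse, cov / (sp * sg)

gauge = [1.0, 3.5, 0.0, 12.2, 4.1]   # daily gauge totals (mm)
grid = [1.4, 2.9, 0.3, 10.8, 4.6]    # co-located grid-cell values (mm)
print(verify(grid, gauge))
```

Aggregating the same scores over monthly or annual sums would give the multi-temporal-scale comparison the record describes.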

  4. Toward Instructional Leadership: Principals' Perceptions of Large-Scale Assessment in Schools

    Science.gov (United States)

    Prytula, Michelle; Noonan, Brian; Hellsten, Laurie

    2013-01-01

    This paper describes a study of the perceptions that Saskatchewan school principals have regarding large-scale assessment reform and their perceptions of how assessment reform has affected their roles as principals. The findings revealed that large-scale assessments, especially provincial assessments, have affected the principal in Saskatchewan…

  5. A large scale field experiment in the Amazon basin (LAMBADA/BATERISTA)

    NARCIS (Netherlands)

    Dolman, A.J.; Kabat, P.; Gash, J.H.C.; Noilhan, J.; Jochum, A.M.; Nobre, C.

    1995-01-01

    A description is given of a large-scale field experiment planned in the Amazon basin, aimed at assessing the large-scale balances of energy, water and carbon dioxide. The embedding of this experiment in global change programmes is described, viz. the Biospheric Aspects of the Hydrological Cycle

  6. Large-scale derived flood frequency analysis based on continuous simulation

    Science.gov (United States)

    Dung Nguyen, Viet; Hundecha, Yeshewatesfa; Guse, Björn; Vorogushyn, Sergiy; Merz, Bruno

    2016-04-01

    There is an increasing need for spatially consistent flood risk assessments at the regional scale (several 100,000 km2), in particular in the insurance industry and for national risk reduction strategies. However, most large-scale flood risk assessments are composed of smaller-scale assessments and show spatial inconsistencies. To overcome this deficit, a large-scale flood model composed of a weather generator and catchment models was developed, reflecting the spatially inherent heterogeneity. The weather generator is a multisite and multivariate stochastic model capable of generating synthetic meteorological fields (precipitation, temperature, etc.) at daily resolution for the regional scale. These fields respect the observed autocorrelation, spatial correlation and co-variance between the variables. They are used as input into the catchment models. A long-term simulation of this combined system makes it possible to derive very long discharge series at many catchment locations, serving as a basis for spatially consistent flood risk estimates at the regional scale. This combined model was set up and validated for major river catchments in Germany. The weather generator was trained on 53 years of observation data at 528 stations covering not only the whole of Germany but also parts of France, Switzerland, the Czech Republic and Austria, with an aggregated spatial scale of 443,931 km2. 10,000 years of daily meteorological fields for the study area were generated. Likewise, rainfall-runoff simulations with SWIM were performed for the entire Elbe, Rhine, Weser, Donau and Ems catchments. The validation results illustrate a good performance of the combined system, as the simulated flood magnitudes and frequencies agree well with the observed flood data. Based on continuous simulation, this model chain is then used to estimate flood quantiles for the whole of Germany, including upstream headwater catchments in neighbouring countries. This continuous large-scale approach overcomes the several
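    The derived flood frequency chain described here (stochastic weather generator, catchment model, long synthetic discharge series, empirical quantiles) can be caricatured in a few lines. This is a hedged toy sketch: a one-parameter linear reservoir stands in for SWIM, and every parameter value is invented:

```python
# Minimal sketch of derived flood frequency by continuous simulation:
# a stochastic daily-rainfall generator feeds a single linear-reservoir
# catchment model; annual maxima of simulated runoff give empirical quantiles.
import random

random.seed(42)

def simulate_annual_maxima(years, p_wet=0.4, mean_rain=8.0, k=0.2):
    maxima = []
    storage = 0.0
    for _ in range(years):
        peak = 0.0
        for _ in range(365):
            # wet day with probability p_wet; wet amounts ~ exponential
            rain = random.expovariate(1.0 / mean_rain) if random.random() < p_wet else 0.0
            storage += rain
            runoff = k * storage          # linear reservoir: Q = k * S
            storage -= runoff
            peak = max(peak, runoff)
        maxima.append(peak)
    return maxima

def quantile(maxima, return_period):
    """Empirical flood quantile from ranked annual maxima."""
    s = sorted(maxima)
    # non-exceedance probability 1 - 1/T, Weibull plotting position
    rank = int((1.0 - 1.0 / return_period) * (len(s) + 1)) - 1
    return s[max(0, min(rank, len(s) - 1))]

amax = simulate_annual_maxima(1000)
print("Q100 ~", round(quantile(amax, 100), 1))
```

Running the same chain at many catchment outlets with spatially correlated weather fields is what yields the spatially consistent quantiles the record aims for.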

  7. Scaling up cervical cancer screening in the midst of human papillomavirus vaccination advocacy in Thailand

    Directory of Open Access Journals (Sweden)

    Teerawattananon Yot

    2010-07-01

    Background: Screening tests for cervical cancer are effective in reducing the disease burden. In Thailand, a Pap smear program has been implemented throughout the country for 40 years. In 2008 the Ministry of Public Health (MoPH) unexpectedly decided to scale up the coverage of free cervical cancer screening services, to meet an ambitious target. This study analyzes the processes and factors that drove this policy innovation in the area of cervical cancer control in Thailand. Methods: In-depth interviews with key policy actors and review of relevant documents were conducted in 2009. Data analysis was guided by a framework developed on public policy models and existing literature on scaling up health care interventions. Results: Between 2006 and 2008 international organizations and the vaccine industry advocated the introduction of the Human Papillomavirus (HPV) vaccine for the primary prevention of cervical cancer. Meanwhile, a local study suggested that the vaccine was considerably less cost-effective than cervical cancer screening in the Thai context. Then, from August to December 2008, the MoPH carried out a campaign to expand the coverage of its cervical cancer screening program, targeting one million women. The study reveals that several factors were influential in focusing the attention of policymakers on strengthening the screening services. These included the high burden of cervical cancer in Thailand, the launch of the HPV vaccine onto the global and domestic markets, the country’s political instability, and the dissemination of scientific evidence regarding the appropriateness of different options for cervical cancer prevention. Influenced by the country’s political crisis, the MoPH’s campaign was devised in a very short time. In the view of the responsible health officials, the campaign was not successful and indeed did not achieve its ambitious target. Conclusion: The Thai case study suggests that the political crisis was a

  8. Scaling up cervical cancer screening in the midst of human papillomavirus vaccination advocacy in Thailand.

    Science.gov (United States)

    Yothasamut, Jomkwan; Putchong, Choenkwan; Sirisamutr, Teera; Teerawattananon, Yot; Tantivess, Sripen

    2010-07-02

    Screening tests for cervical cancer are effective in reducing the disease burden. In Thailand, a Pap smear program has been implemented throughout the country for 40 years. In 2008 the Ministry of Public Health (MoPH) unexpectedly decided to scale up the coverage of free cervical cancer screening services, to meet an ambitious target. This study analyzes the processes and factors that drove this policy innovation in the area of cervical cancer control in Thailand. In-depth interviews with key policy actors and review of relevant documents were conducted in 2009. Data analysis was guided by a framework, developed on public policy models and existing literature on scaling-up health care interventions. Between 2006 and 2008 international organizations and the vaccine industry advocated the introduction of Human Papillomavirus (HPV) vaccine for the primary prevention of cervical cancer. Meanwhile, a local study suggested that the vaccine was considerably less cost-effective than cervical cancer screening in the Thai context. Then, from August to December 2008, the MoPH carried out a campaign to expand the coverage of its cervical cancer screening program, targeting one million women. The study reveals that several factors were influential in focusing the attention of policymakers on strengthening the screening services. These included the high burden of cervical cancer in Thailand, the launch of the HPV vaccine onto the global and domestic markets, the country's political instability, and the dissemination of scientific evidence regarding the appropriateness of different options for cervical cancer prevention. Influenced by the country's political crisis, the MoPH's campaign was devised in a very short time. In the view of the responsible health officials, the campaign was not successful and indeed, did not achieve its ambitious target. The Thai case study suggests that the political crisis was a crucial factor that drew the attention of policymakers to the cervical

  9. New information on high risk breast screening

    International Nuclear Information System (INIS)

    Riedl, C.C.; Ponhold, L.; Gruber, R.; Pinker, K.; Helbich, T.H.

    2010-01-01

    Women with an elevated risk for breast cancer require intensified screening beginning at an early age. Such high-risk screening differs considerably from screening in the general population. After an expert has evaluated the exact risk, a breast MRI examination should be offered at least once a year, beginning at the latest at age 30, depending on the patient's risk category. Complementary mammograms should not be performed before the age of 35. An additional ultrasound examination is no longer recommended. To ensure high sensitivity and specificity, high-risk screening should be performed only at a nationally or regionally approved and audited service. Adequate knowledge about the phenotypical characteristics of familial breast cancer is essential. Besides the common malignant phenotypes, benign morphologies (round or oval shape and smooth margins) as well as a low prevalence of calcifications have been described. Using MRI, benign contrast medium kinetics as well as non-solid lesions with focal, regional and segmental enhancement can often be visualized. (orig.)

  10. GAIA: A WINDOW TO LARGE-SCALE MOTIONS

    Energy Technology Data Exchange (ETDEWEB)

    Nusser, Adi [Physics Department and the Asher Space Science Institute-Technion, Haifa 32000 (Israel); Branchini, Enzo [Department of Physics, Universita Roma Tre, Via della Vasca Navale 84, 00146 Rome (Italy); Davis, Marc, E-mail: adi@physics.technion.ac.il, E-mail: branchin@fis.uniroma3.it, E-mail: mdavis@berkeley.edu [Departments of Astronomy and Physics, University of California, Berkeley, CA 94720 (United States)

    2012-08-10

    Using redshifts as a proxy for galaxy distances, estimates of the two-dimensional (2D) transverse peculiar velocities of distant galaxies could be obtained from future measurements of proper motions. We provide the mathematical framework for analyzing 2D transverse motions and show that they offer several advantages over traditional probes of large-scale motions. They are completely independent of any intrinsic relations between galaxy properties; hence, they are essentially free of selection biases. They are free from homogeneous and inhomogeneous Malmquist biases that typically plague distance indicator catalogs. They provide additional information to traditional probes that yield line-of-sight peculiar velocities only. Further, because of their 2D nature, fundamental questions regarding vorticity of large-scale flows can be addressed. Gaia, for example, is expected to provide proper motions of at least bright galaxies with high central surface brightness, making proper motions a likely contender for traditional probes based on current and future distance indicator measurements.
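    The link between a measured proper motion and the 2D transverse velocity is the standard conversion v_t [km/s] = 4.74 × μ [arcsec/yr] × d [pc], where the constant is one astronomical unit per Julian year expressed in km/s. A small illustration (the galaxy distance and proper motion below are made-up numbers, not Gaia specifications):

```python
# Transverse velocity from proper motion and distance.
# K = (1 AU in km) / (1 Julian year in s) ~= 4.74 km/s per (arcsec/yr * pc)
AU_KM = 1.495978707e8
YEAR_S = 365.25 * 86400
K = AU_KM / YEAR_S

def transverse_velocity(mu_arcsec_per_yr, distance_pc):
    """2D transverse velocity in km/s."""
    return K * mu_arcsec_per_yr * distance_pc

# A galaxy at 10 Mpc with a proper motion of 1 micro-arcsec/yr (illustrative):
v = transverse_velocity(1e-6, 10e6)
print(round(v, 1), "km/s")
```

The tiny proper motions involved are why only bright, high-surface-brightness galaxies are realistic Gaia targets.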

  11. Large-scale hydrogen production using nuclear reactors

    Energy Technology Data Exchange (ETDEWEB)

    Ryland, D.; Stolberg, L.; Kettner, A.; Gnanapragasam, N.; Suppiah, S. [Atomic Energy of Canada Limited, Chalk River, ON (Canada)

    2014-07-01

    For many years, Atomic Energy of Canada Limited (AECL) has been studying the feasibility of using nuclear reactors, such as the Supercritical Water-cooled Reactor, as an energy source for large scale hydrogen production processes such as High Temperature Steam Electrolysis and the Copper-Chlorine thermochemical cycle. Recent progress includes the augmentation of AECL's experimental capabilities by the construction of experimental systems to test high temperature steam electrolysis button cells at ambient pressure and temperatures up to 850 °C and CuCl/HCl electrolysis cells at pressures up to 7 bar and temperatures up to 100 °C. In parallel, detailed models of solid oxide electrolysis cells and the CuCl/HCl electrolysis cell are being refined and validated using experimental data. Process models are also under development to assess options for economic integration of these hydrogen production processes with nuclear reactors. Options for large-scale energy storage, including hydrogen storage, are also under study. (author)
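    The scale of such electrolytic hydrogen production can be estimated from Faraday's law, which ties cell current directly to hydrogen output (two electrons per H2 molecule). A back-of-the-envelope sketch; the current, cell count, and efficiency below are illustrative assumptions, not AECL figures:

```python
# Faraday's-law estimate of hydrogen output from an electrolysis stack.
F = 96485.0  # Faraday constant, C/mol

def h2_production_kg_per_day(current_a, n_cells, efficiency=1.0):
    """Hydrogen mass output of a stack, assuming 2 electrons per H2."""
    mol_per_s = efficiency * current_a * n_cells / (2.0 * F)
    return mol_per_s * 2.016e-3 * 86400  # molar mass of H2 = 2.016 g/mol

# Hypothetical 1000-cell stack at 100 A per cell:
print(round(h2_production_kg_per_day(current_a=100.0, n_cells=1000), 1), "kg/day")
```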

  12. RNAi Screening in Spodoptera frugiperda.

    Science.gov (United States)

    Ghosh, Subhanita; Singh, Gatikrushna; Sachdev, Bindiya; Kumar, Ajit; Malhotra, Pawan; Mukherjee, Sunil K; Bhatnagar, Raj K

    2016-01-01

    RNA interference is a potent and precise reverse genetic approach to carry out large-scale functional genomic studies in a given organism. During the past decade, RNAi has also emerged as an important investigative tool to understand the process of viral pathogenesis. Our laboratory has successfully generated a transgenic reporter and RNAi sensor line of Spodoptera frugiperda (Sf21) cells and developed a reversal-of-silencing assay via siRNA- or shRNA-guided screening to investigate RNAi factors or viral pathogenic factors with extraordinary fidelity. Here we describe empirical approaches and conceptual understanding to execute successful RNAi screening in the Spodoptera frugiperda 21 (Sf21) cell line.

  13. Planck intermediate results XLII. Large-scale Galactic magnetic fields

    DEFF Research Database (Denmark)

    Adam, R.; Ade, P. A. R.; Alves, M. I. R.

    2016-01-01

    Recent models for the large-scale Galactic magnetic fields in the literature have been largely constrained by synchrotron emission and Faraday rotation measures. We use three different but representative models to compare their predicted polarized synchrotron and dust emission with that measured ...

  14. A Topology Visualization Early Warning Distribution Algorithm for Large-Scale Network Security Incidents

    Directory of Open Access Journals (Sweden)

    Hui He

    2013-01-01

    It is of great significance to research early warning systems for large-scale network security incidents. Such a system can improve the network system’s emergency response capabilities, alleviate the damage of cyber attacks, and strengthen the system’s counterattack ability. A comprehensive early warning system is presented in this paper, which combines active measurement and anomaly detection. The key visualization algorithm and technology of the system are mainly discussed. Plane visualization of the large-scale network system is realized based on the divide-and-conquer approach. First, the topology of the large-scale network is divided into small-scale networks by the MLkP/CR algorithm. Second, the subgraph plane visualization algorithm is applied to each small-scale network. Finally, the small-scale networks’ topologies are combined into a single topology using an automatic distribution algorithm based on force analysis. As the algorithm transforms the large-scale network topology plane visualization problem into a series of small-scale network topology plane visualization and distribution problems, it has higher parallelism and is able to handle the display of ultra-large-scale network topologies.
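    The divide-and-conquer pipeline described above (partition the graph, lay out each part independently, recombine the parts) can be sketched generically. This toy version substitutes a plain BFS-order partition, circular sub-layouts, and a fixed grid for the paper's MLkP/CR and force-analysis algorithms:

```python
# Divide-and-conquer graph layout sketch: partition, sub-layout, recombine.
import math
from collections import deque

def bfs_order(adj):
    """Visit all nodes of an adjacency dict in BFS order (handles components)."""
    seen, order = set(), []
    for start in adj:
        if start in seen:
            continue
        q = deque([start])
        seen.add(start)
        while q:
            u = q.popleft()
            order.append(u)
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    q.append(v)
    return order

def partition(adj, max_size):
    """Chunk the BFS order so neighbouring nodes tend to share a part."""
    order = bfs_order(adj)
    return [order[i:i + max_size] for i in range(0, len(order), max_size)]

def layout(adj, max_size=4, spacing=10.0):
    """Lay out each part on a circle, then place parts on a grid."""
    pos = {}
    for i, part in enumerate(partition(adj, max_size)):
        cx, cy = (i % 3) * spacing, (i // 3) * spacing
        for j, node in enumerate(part):
            a = 2 * math.pi * j / len(part)
            pos[node] = (cx + math.cos(a), cy + math.sin(a))
    return pos

ring = {i: [(i - 1) % 8, (i + 1) % 8] for i in range(8)}
print(layout(ring))
```

The real system replaces each stage with a stronger algorithm, but the parallelism argument is visible here: each part's layout is independent of the others.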

  15. No Large Scale Curvature Perturbations during Waterfall of Hybrid Inflation

    OpenAIRE

    Abolhasani, Ali Akbar; Firouzjahi, Hassan

    2010-01-01

    In this paper the possibility of generating large scale curvature perturbations induced from the entropic perturbations during the waterfall phase transition of the standard hybrid inflation model is studied. We show that whether or not appreciable amounts of large scale curvature perturbations are produced during the waterfall phase transition depends crucially on the competition between the classical and the quantum mechanical back-reactions to terminate inflation. If one considers only the clas...

  16. Large Scale Emerging Properties from Non Hamiltonian Complex Systems

    Directory of Open Access Journals (Sweden)

    Marco Bianucci

    2017-06-01

    The concept of “large scale” depends, obviously, on the phenomenon we are interested in. For example, in the field of the foundations of thermodynamics from microscopic dynamics, the large spatial and time scales are of the order of fractions of millimetres and of microseconds, respectively, or less, and are defined in relation to the spatial and time scales of the microscopic systems. In large-scale oceanography or global climate dynamics problems, the scales of interest are of the order of thousands of kilometres in space and many years in time, and are compared to the local and daily/monthly scales of atmosphere and ocean dynamics. In all these cases a Zwanzig projection approach is, at least in principle, an effective tool to obtain a class of universal smooth “large scale” dynamics for the few degrees of freedom of interest, starting from the complex dynamics of the whole (usually many-degrees-of-freedom) system. The projection approach leads to a very complex calculus with differential operators, which is drastically simplified when the basic dynamics of the system of interest is Hamiltonian, as happens in foundations-of-thermodynamics problems. However, in geophysical fluid dynamics, biology, and most physical problems, the fundamental building-block equations of motion have a non-Hamiltonian structure. Thus, to continue to apply the useful projection approach in these cases as well, we exploit the generalization of the Hamiltonian formalism given by the Lie algebra of dissipative differential operators. In this way, we are able to deal analytically with the series of differential operators stemming from the projection approach applied to these general cases. We then apply this formalism to obtain some relevant results concerning the statistical properties of the El Niño Southern Oscillation (ENSO).

  17. Open innovation for phenotypic drug discovery: The PD2 assay panel.

    Science.gov (United States)

    Lee, Jonathan A; Chu, Shaoyou; Willard, Francis S; Cox, Karen L; Sells Galvin, Rachelle J; Peery, Robert B; Oliver, Sarah E; Oler, Jennifer; Meredith, Tamika D; Heidler, Steven A; Gough, Wendy H; Husain, Saba; Palkowitz, Alan D; Moxham, Christopher M

    2011-07-01

    Phenotypic lead generation strategies seek to identify compounds that modulate complex, physiologically relevant systems, an approach that is complementary to traditional, target-directed strategies. Unlike gene-specific assays, phenotypic assays interrogate multiple molecular targets and signaling pathways in a target "agnostic" fashion, which may reveal novel functions for well-studied proteins and discover new pathways of therapeutic value. Significantly, existing compound libraries may not have sufficient chemical diversity to fully leverage a phenotypic strategy. To address this issue, Eli Lilly and Company launched the Phenotypic Drug Discovery Initiative (PD2), a model of open innovation whereby external research groups can submit compounds for testing in a panel of Lilly phenotypic assays. This communication describes the statistical validation, operations, and initial screening results from the first PD2 assay panel. Analysis of PD2 submissions indicates that chemical diversity from open source collaborations complements internal sources. Screening results for the first 4691 compounds submitted to PD2 have confirmed hit rates from 1.6% to 10%, with the majority of active compounds exhibiting acceptable potency and selectivity. Phenotypic lead generation strategies, in conjunction with novel chemical diversity obtained via open-source initiatives such as PD2, may provide a means to identify compounds that modulate biology by novel mechanisms and expand the innovation potential of drug discovery.

  18. The micro-environmental impact of volatile organic compound emissions from large-scale assemblies of people in a confined space

    Energy Technology Data Exchange (ETDEWEB)

    Dutta, Tanushree [Department of Civil & Environmental Engineering, Hanyang University, 222 Wangsimni-Ro, Seoul 04763 (Korea, Republic of); Kim, Ki-Hyun, E-mail: kkim61@hanyang.ac.kr [Department of Civil & Environmental Engineering, Hanyang University, 222 Wangsimni-Ro, Seoul 04763 (Korea, Republic of); Uchimiya, Minori [USDA-ARS Southern Regional Research Center, 1100 Robert E. Lee Boulevard, New Orleans, LA 70124 (United States); Kumar, Pawan [Department of Chemical Engineering, Indian Institute of Technology, Hauz Khas, New Delhi 11016 (India); Das, Subhasish; Bhattacharya, Satya Sundar [Soil & Agro-Bioengineering Lab, Department of Environmental Science, Tezpur University, Napaam 784028 (India); Szulejko, Jan [Department of Civil & Environmental Engineering, Hanyang University, 222 Wangsimni-Ro, Seoul 04763 (Korea, Republic of)

    2016-11-15

    Large-scale assemblies of people in a confined space can exert significant impacts on the local air chemistry due to human emissions of volatile organics. Variations in air quality on such a small scale can be studied by quantifying fingerprint volatile organic compounds (VOCs) such as acetone, toluene, and isoprene produced during concerts, movie screenings, and sport events (like the Olympics and the World Cup). This review summarizes the extent of VOC accumulation resulting from a large population in a confined area or in a small open area during sporting and other recreational activities. Apart from VOCs emitted directly from human bodies (e.g., perspiration and exhaled breath), those released indirectly from other related sources (e.g., smoking, waste disposal, discharge of food-waste, and use of personal-care products) are also discussed. Although direct and indirect emissions of VOCs from humans may constitute <1% of the global atmospheric VOC budget, unique spatiotemporal variations in VOC species within a confined space can have unforeseen impacts on the local atmosphere, leading to acute human exposure to harmful pollutants.
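    The accumulation the review describes is, at its simplest, a well-mixed box model: occupant emissions balanced against ventilation dilution. A hedged sketch with an invented per-person emission rate, hall volume, and air-change rate (not figures from the review):

```python
# Well-mixed box model for VOC build-up in a confined venue:
# dC/dt = S/V - ach*C, so C(t) = (S/(ach*V)) * (1 - exp(-ach*t)).
import math

def voc_concentration(t_hours, n_people, emission_ug_per_h=500.0,
                      volume_m3=50000.0, ach=2.0):
    """Concentration (ug/m3) after t hours, starting from clean air.
    ach = air changes per hour; steady state is source/ach."""
    source = n_people * emission_ug_per_h / volume_m3  # ug/m3 per hour
    c_ss = source / ach
    return c_ss * (1.0 - math.exp(-ach * t_hours))

# 10,000 spectators, 3-hour event:
print(round(voc_concentration(3.0, 10000), 1), "ug/m3")
```

The exponential term is why short, densely attended events can approach the steady-state concentration well before they end.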

  19. The micro-environmental impact of volatile organic compound emissions from large-scale assemblies of people in a confined space

    International Nuclear Information System (INIS)

    Dutta, Tanushree; Kim, Ki-Hyun; Uchimiya, Minori; Kumar, Pawan; Das, Subhasish; Bhattacharya, Satya Sundar; Szulejko, Jan

    2016-01-01

    Large-scale assemblies of people in a confined space can exert significant impacts on the local air chemistry due to human emissions of volatile organics. Variations in air quality on such a small scale can be studied by quantifying fingerprint volatile organic compounds (VOCs) such as acetone, toluene, and isoprene produced during concerts, movie screenings, and sport events (like the Olympics and the World Cup). This review summarizes the extent of VOC accumulation resulting from a large population in a confined area or in a small open area during sporting and other recreational activities. Apart from VOCs emitted directly from human bodies (e.g., perspiration and exhaled breath), those released indirectly from other related sources (e.g., smoking, waste disposal, discharge of food-waste, and use of personal-care products) are also discussed. Although direct and indirect emissions of VOCs from humans may constitute <1% of the global atmospheric VOC budget, unique spatiotemporal variations in VOC species within a confined space can have unforeseen impacts on the local atmosphere, leading to acute human exposure to harmful pollutants.

  20. A new system of labour management in African large-scale agriculture?

    DEFF Research Database (Denmark)

    Gibbon, Peter; Riisgaard, Lone

    2014-01-01

    This paper applies a convention theory (CT) approach to the analysis of labour management systems in African large-scale farming. The reconstruction of previous analyses of high-value crop production on large-scale farms in Africa in terms of CT suggests that, since 1980–95, labour management has...

  1. Pseudoscalar-photon mixing and the large scale alignment of QSO ...

    Indian Academy of Sciences (India)

    physics pp. 679-682. Pseudoscalar-photon mixing and the large scale alignment of QSO optical polarizations. Pankaj Jain, Sukanta Panda and S. Sarala, Physics Department, Indian Institute of Technology, Kanpur 208 016, India. Abstract: We review the observation of large scale alignment of QSO optical polarizations.

  2. A bioimage informatics platform for high-throughput embryo phenotyping.

    Science.gov (United States)

    Brown, James M; Horner, Neil R; Lawson, Thomas N; Fiegel, Tanja; Greenaway, Simon; Morgan, Hugh; Ring, Natalie; Santos, Luis; Sneddon, Duncan; Teboul, Lydia; Vibert, Jennifer; Yaikhom, Gagarine; Westerberg, Henrik; Mallon, Ann-Marie

    2018-01-01

    High-throughput phenotyping is a cornerstone of numerous functional genomics projects. In recent years, imaging screens have become increasingly important in understanding gene-phenotype relationships in studies of cells, tissues and whole organisms. Three-dimensional (3D) imaging has risen to prominence in the field of developmental biology for its ability to capture whole embryo morphology and gene expression, as exemplified by the International Mouse Phenotyping Consortium (IMPC). Large volumes of image data are being acquired by multiple institutions around the world that encompass a range of modalities, proprietary software and metadata. To facilitate robust downstream analysis, images and metadata must be standardized to account for these differences. As an open scientific enterprise, making the data readily accessible is essential so that members of biomedical and clinical research communities can study the images for themselves without the need for highly specialized software or technical expertise. In this article, we present a platform of software tools that facilitate the upload, analysis and dissemination of 3D images for the IMPC. Over 750 reconstructions from 80 embryonic lethal and subviable lines have been captured to date, all of which are openly accessible at mousephenotype.org. Although designed for the IMPC, all software is available under an open-source licence for others to use and develop further. Ongoing developments aim to increase throughput and improve the analysis and dissemination of image data. Furthermore, we aim to ensure that images are searchable so that users can locate relevant images associated with genes, phenotypes or human diseases of interest. © The Author 2016. Published by Oxford University Press.

  3. On the universal character of the large scale structure of the universe

    International Nuclear Information System (INIS)

    Demianski, M.; International Center for Relativistic Astrophysics; Rome Univ.; Doroshkevich, A.G.

    1991-01-01

    We review different theories of the formation of the large scale structure of the Universe. Special emphasis is put on the theory of inertial instability. We show that for a large class of initial spectra the resulting two-point correlation functions are similar. We also discuss the adhesion theory, which uses the Burgers equation, the Navier-Stokes equation or a coagulation process. We review the Zeldovich theory of gravitational instability and discuss the internal structure of pancakes. Finally, we discuss the role of the velocity potential in determining the global characteristics of large scale structures (distribution of caustics, scale of voids, etc.). In the last chapter we list the main unsolved problems and the main successes of the theory of the formation of large scale structure. (orig.)

  4. LAVA: Large scale Automated Vulnerability Addition

    Science.gov (United States)

    2016-05-23

    LAVA: Large-scale Automated Vulnerability Addition. Brendan Dolan-Gavitt, Patrick Hulin, Tim Leek, Fredrich Ulrich, Ryan Whelan. (Authors listed...released, and thus rapidly become stale. We can expect tools to have been trained to detect bugs that have been released. Given the commercial price tag...low TCN) and dead (low liveness) program data is a powerful one for vulnerability injection. The DUAs it identifies are internal program quantities

  5. Large-Scale Transit Signal Priority Implementation

    OpenAIRE

    Lee, Kevin S.; Lozner, Bailey

    2018-01-01

    In 2016, the District Department of Transportation (DDOT) deployed Transit Signal Priority (TSP) at 195 intersections in highly urbanized areas of Washington, DC. In collaboration with a broader regional implementation, and in partnership with the Washington Metropolitan Area Transit Authority (WMATA), DDOT set out to apply a systems engineering–driven process to identify, design, test, and accept a large-scale TSP system. This presentation will highlight project successes and lessons learned.

  6. Probing cosmology with the homogeneity scale of the Universe through large scale structure surveys

    International Nuclear Information System (INIS)

    Ntelis, Pierros

    2017-01-01

This thesis presents my contribution to the measurement of the homogeneity scale using galaxies, together with the cosmological interpretation of the results. In physics, any model is characterized by a set of principles. Most models in cosmology are based on the Cosmological Principle, which states that the universe is statistically homogeneous and isotropic on large scales. Today, this principle is considered to be true since it is respected by the cosmological models that accurately describe the observations. However, while the isotropy of the universe has been confirmed by many experiments, the same is not yet true of its homogeneity. To study cosmic homogeneity, we propose not merely to test a model but to test directly one of the postulates of modern cosmology. Since the 1998 measurements of cosmic distances using type Ia supernovae, we have known that the universe is currently in a phase of accelerated expansion. This phenomenon can be explained by the addition of an unknown energy component, called dark energy. Since dark energy is responsible for the accelerated expansion of the universe, we can study this mysterious fluid by measuring the rate of expansion of the universe. The universe has imprinted in its matter distribution a standard ruler, the Baryon Acoustic Oscillation (BAO) scale. By measuring this scale at different times during the evolution of our universe, it is possible to measure the rate of expansion of the universe and thus characterize the dark energy. Alternatively, we can use the homogeneity scale to study this dark energy. Studying the homogeneity and the BAO scale requires the statistical study of the matter distribution of the universe on large scales, greater than tens of megaparsecs. Galaxies and quasars form in the vast overdensities of matter, and they are very luminous: these sources trace the distribution of matter. By measuring the emission spectra of these sources using large spectroscopic surveys, such as BOSS and eBOSS, we can measure their positions.
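Operationally, the homogeneity scale is commonly extracted from the scaled counts-in-spheres N(<r) and the fractal correlation dimension D2(r) = d ln N(<r) / d ln r, with homogeneity conventionally declared where D2 comes within 1% of the homogeneous value of 3. A minimal numpy sketch on a mock uniform catalogue (the estimator, radii, and threshold are illustrative only, not the thesis's actual pipeline, which also uses random catalogues for edge corrections):

```python
import numpy as np

def correlation_dimension(points, radii):
    """Scaled counts-in-spheres N(<r) and fractal correlation dimension
    D2(r) = d ln N(<r) / d ln r for a 3D point catalogue."""
    # Pairwise distances (O(n^2) memory; fine for a small sketch)
    diff = points[:, None, :] - points[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1))
    np.fill_diagonal(dist, np.inf)      # exclude self-pairs
    # Mean number of neighbours within each radius r
    counts = np.array([(dist < r).sum(axis=1).mean() for r in radii])
    d2 = np.gradient(np.log(counts), np.log(radii))
    return counts, d2

rng = np.random.default_rng(0)
pts = rng.uniform(0.0, 100.0, size=(1000, 3))   # homogeneous mock catalogue
radii = np.logspace(np.log10(5), np.log10(30), 10)
counts, d2 = correlation_dimension(pts, radii)
# For a uniform distribution D2(r) stays close to 3 (edge effects bias it
# low at the largest radii); homogeneity is conventionally declared where
# D2(r) exceeds 2.97, i.e. within 1% of 3.
```

For a clustered galaxy catalogue, D2(r) rises from below 3 at small separations toward 3 at large r, and the crossing of the 2.97 threshold defines the homogeneity scale.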

  7. Utilization of genomic signatures to identify phenotype-specific drugs.

    Directory of Open Access Journals (Sweden)

    Seiichi Mori

    2009-08-01

Full Text Available Genetic and genomic studies highlight the substantial complexity and heterogeneity of human cancers and emphasize the general lack of therapeutics that can match this complexity. With the goal of expanding opportunities for drug discovery, we describe an approach that makes use of a phenotype-based screen combined with the use of multiple cancer cell lines. In particular, we have used the NCI-60 cancer cell line panel, which includes drug sensitivity measures for over 40,000 compounds assayed on 59 independent cell lines. The targets are cancer-relevant phenotypes, represented as gene expression signatures, that are used to identify cell lines within the NCI-60 panel reflecting the signature phenotype and then to connect to compounds that are selectively active against those cells. As a proof of concept, we show that this strategy effectively identifies compounds with selectivity to the RAS or PI3K pathways. We have then extended this strategy to identify compounds that have activity towards cells exhibiting the basal phenotype of breast cancer, a clinically important breast cancer characterized as ER-, PR-, and Her2- that lacks viable therapeutic options. One of these compounds, simvastatin, has previously been shown to inhibit breast cancer cell growth in vitro and, importantly, has been associated with a reduction in ER-, PR- breast cancer in a clinical study. We suggest that this approach provides a novel strategy toward the identification of therapeutic agents based on clinically relevant phenotypes that can augment the conventional strategies of target-based screens.
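The connection step described in this record — scoring cell lines by a signature and ranking compounds by differential activity across the panel — can be sketched as follows. All data, dimensions, and the mean-expression scoring rule here are synthetic stand-ins for the NCI-60 expression and drug-sensitivity matrices; the study's actual signature scoring is more elaborate:

```python
import numpy as np

# Synthetic stand-ins for NCI-60-like data: an expression matrix and a
# drug-sensitivity matrix over the same cell lines (gene and drug
# identities are hypothetical).
rng = np.random.default_rng(1)
n_lines, n_genes, n_drugs = 59, 500, 200
expr = rng.normal(size=(n_lines, n_genes))        # expression per cell line
sens = rng.normal(size=(n_lines, n_drugs))        # sensitivity per compound
signature = rng.choice(n_genes, 25, replace=False)  # pathway signature genes

# Score each cell line by mean expression over the signature genes, then
# Pearson-correlate that score with sensitivity to every compound.
score = expr[:, signature].mean(axis=1)
z_score = (score - score.mean()) / score.std()
z_sens = (sens - sens.mean(axis=0)) / sens.std(axis=0)
corr = z_score @ z_sens / n_lines                 # Pearson r per drug
hits = np.argsort(corr)[::-1][:10]                # most signature-associated compounds
```

Compounds at the top of `hits` are those preferentially active in signature-high cell lines, which is the spirit of the phenotype-to-drug connection the abstract describes.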

  8. Large-Scale Optimization for Bayesian Inference in Complex Systems

    Energy Technology Data Exchange (ETDEWEB)

    Willcox, Karen [MIT; Marzouk, Youssef [MIT

    2013-11-12

The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT-Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower-dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to
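The "reduce then sample" idea — running Markov chain Monte Carlo against a cheap reduced-order forward model instead of the expensive full solver — can be illustrated with a toy random-walk Metropolis sampler. The quadratic forward map, noise level, and Gaussian prior below are invented for illustration and are not from the project:

```python
import numpy as np

def metropolis(logpost, x0, n=5000, step=0.4, seed=3):
    """Random-walk Metropolis sampler. `logpost` wraps the forward model,
    so swapping in a reduced-order surrogate there is exactly the
    "reduce then sample" strategy."""
    rng = np.random.default_rng(seed)
    x, lp = x0, logpost(x0)
    chain = np.empty(n)
    for i in range(n):
        prop = x + step * rng.normal()
        lp_prop = logpost(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject
            x, lp = prop, lp_prop
        chain[i] = x
    return chain

# Toy inverse problem: infer a scalar parameter m from a noisy
# observation d of forward(m); forward() stands in for a cheap surrogate.
forward = lambda m: m ** 2
d, sigma = 4.0, 0.5
logpost = lambda m: -0.5 * ((forward(m) - d) / sigma) ** 2 - 0.5 * m ** 2 / 10.0
chain = metropolis(logpost, x0=1.0)
# After burn-in the chain concentrates near the posterior mode m ~ 2
```

The computational point of the project is that each `logpost` evaluation hides a forward solve; replacing that solve with a faithful reduced model makes the thousands of evaluations in the loop affordable.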

  9. Response of deep and shallow tropical maritime cumuli to large-scale processes

    Science.gov (United States)

    Yanai, M.; Chu, J.-H.; Stark, T. E.; Nitta, T.

    1976-01-01

The bulk diagnostic method of Yanai et al. (1973) and a simplified version of the spectral diagnostic method of Nitta (1975) are used for a more quantitative evaluation of the response of various types of cumuliform clouds to large-scale processes, using the same data set in the Marshall Islands area for a 100-day period in 1956. The dependence of the cloud mass flux distribution on radiative cooling, large-scale vertical motion, and evaporation from the sea is examined. It is shown that typical radiative cooling rates in the tropics tend to produce a bimodal distribution of mass spectrum exhibiting deep and shallow clouds. The bimodal distribution is further enhanced when the large-scale vertical motion is upward, and a nearly unimodal distribution of shallow clouds prevails when the radiative cooling is compensated by the heating due to the large-scale subsidence. Both deep and shallow clouds are modulated by large-scale disturbances. The primary role of surface evaporation is to maintain the moisture flux at the cloud base.

  10. Accuracy assessment of planimetric large-scale map data for decision-making

    Directory of Open Access Journals (Sweden)

    Doskocz Adam

    2016-06-01

Full Text Available This paper presents decision-making risk estimation based on planimetric large-scale map data, i.e. data sets or databases useful for creating planimetric maps at scales of 1:5,000 or larger. The studies were conducted on four sets of large-scale map data. Map data errors were used to assess the risk of decisions about the localization of objects, e.g. for land-use planning in the realization of investments. The analysis was performed on a large statistical sample of shift vectors of control points, which were identified with the position errors of these points (errors of map data).
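The error measure the study builds on can be computed directly from the shift vectors: the radial RMS position error over the control points, compared against a planning tolerance. A small sketch with hypothetical values (both the shift vectors and the 0.30 m tolerance are invented, not taken from the paper):

```python
import numpy as np

# Hypothetical shift vectors of control points (metres): measured map
# position minus reference position; the study uses large samples of these.
dx = np.array([0.12, -0.30, 0.05, 0.22, -0.11])
dy = np.array([-0.08, 0.15, -0.25, 0.10, 0.02])

shifts = np.hypot(dx, dy)                  # length of each shift vector
rmse = np.sqrt(np.mean(dx**2 + dy**2))     # radial RMS position error
# Simple decision rule: flag the data set when the error exceeds a
# planning tolerance (0.30 m is an assumed illustrative value).
tolerance = 0.30
risky = bool(rmse > tolerance)
```

With these numbers the radial RMSE is about 0.23 m, below the assumed tolerance, so a decision based on this data set would not be flagged as risky.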

  11. Large scale, highly conductive and patterned transparent films of silver nanowires on arbitrary substrates and their application in touch screens

    International Nuclear Information System (INIS)

    Madaria, Anuj R; Kumar, Akshay; Zhou Chongwu

    2011-01-01

The application of silver nanowire films as transparent conductive electrodes has shown promising results recently. In this paper, we demonstrate the application of a simple spray coating technique to obtain large-scale, highly uniform and conductive silver nanowire films on arbitrary substrates. We also integrated a polydimethylsiloxane (PDMS)-assisted contact transfer technique with spray coating, which allowed us to obtain large-scale, high-quality patterned films of silver nanowires. The transparency and conductivity of the films were controlled by the volume of the dispersion used in spraying and the substrate area. We note that the optoelectrical property, σDC/σOp, for the various films fabricated was in the range 75-350, which is extremely high for a transparent thin film compared with other candidate alternatives to doped metal oxide films. Using this method, we obtain silver nanowire films on a flexible polyethylene terephthalate (PET) substrate with a transparency of 85% and sheet resistance of 33 Ω/sq, which is comparable to that of tin-doped indium oxide (ITO) on flexible substrates. In-depth analysis of the film shows high performance using another commonly used figure of merit, ΦTE. Also, the Ag nanowire film on PET shows good mechanical flexibility, and the application of such a conductive silver nanowire film as an electrode in a touch panel has been demonstrated.
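Both figures of merit named in this record follow directly from transmittance and sheet resistance. A hedged sketch using the widely used thin-film relations (Z0 = 376.73 Ω is the impedance of free space; conventions differ on, e.g., whether substrate absorption is included, so the resulting numbers are indicative rather than the paper's own values):

```python
def sigma_ratio(T, Rs, Z0=376.73):
    """sigma_DC / sigma_Op from optical transmittance T and sheet
    resistance Rs (ohm/sq), via the standard thin-film relation
    T = (1 + (Z0 / (2 Rs)) * sigma_Op / sigma_DC) ** -2."""
    return Z0 / (2.0 * Rs * (T ** -0.5 - 1.0))

def haacke(T, Rs):
    """Haacke figure of merit Phi_TE = T**10 / Rs (ohm**-1)."""
    return T ** 10 / Rs

# The reported flexible film: 85% transparency at 33 ohm/sq
ratio = sigma_ratio(0.85, 33.0)
phi = haacke(0.85, 33.0)
```

For this particular film the formula gives σDC/σOp of roughly 67, near the low end of the 75-350 range the abstract reports for the whole set of films; differences of this size are expected from measurement conventions.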

  13. Reviving large-scale projects

    International Nuclear Information System (INIS)

    Desiront, A.

    2003-01-01

For the past decade, most large-scale hydro development projects in northern Quebec have been put on hold due to land disputes with First Nations. Hydroelectric projects have recently been revived following an agreement signed with Aboriginal communities in the province, who recognized the need to find new sources of revenue for future generations. Many Cree are working on the project to harness the waters of the Eastmain River located in the middle of their territory. The work involves building an 890-foot-long dam, 30 dikes enclosing a 603 km² reservoir, a spillway, and a powerhouse with 3 generating units with a total capacity of 480 MW for start-up in 2007. The project will require 2,400 workers in total. The Cree Construction and Development Company is working on relations between Quebec's 14,000 Crees and the James Bay Energy Corporation, the subsidiary of Hydro-Quebec which is developing the project. Approximately 10 per cent of the $735-million project has been designated for the environmental component. Inspectors ensure that the project complies fully with environmental protection guidelines. Total development costs for Eastmain-1 are in the order of $2 billion, of which $735 million will cover work on site and the remainder will cover generating units, transportation and financial charges. Under the treaty known as the Peace of the Braves, signed in February 2002, the Quebec government and Hydro-Quebec will pay the Cree $70 million annually for 50 years for the right to exploit hydro, mining and forest resources within their territory. The project comes at a time when electricity export volumes to the New England states are down due to growth in Quebec's domestic demand. Hydropower is a renewable and non-polluting source of energy and is one of the most acceptable forms of energy where the Kyoto Protocol is concerned.
It was emphasized that large-scale hydro-electric projects are needed to provide sufficient energy to meet both

  14. Large-scale Flow and Transport of Magnetic Flux in the Solar ...

    Indian Academy of Sciences (India)

    tribpo

Abstract. The horizontal large-scale velocity field describes the horizontal displacement of the photospheric magnetic flux in the zonal and meridional directions. The flow systems of solar plasma, constructed according to the velocity field, create large-scale cellular-like patterns with up-flow in the center and down-flow on the ...

  15. Utilization of Large Scale Surface Models for Detailed Visibility Analyses

    Science.gov (United States)

    Caha, J.; Kačmařík, M.

    2017-11-01

This article demonstrates the utilization of large-scale surface models with fine spatial resolution and high accuracy, acquired from Unmanned Aerial Vehicle scanning, for visibility analyses. The importance of large-scale data for visibility analyses at the local scale, where the detail of the surface model is the most defining factor, is described. The focus is not only on the classic Boolean visibility that is usually determined within GIS, but also on so-called extended viewsheds that aim to provide more information about visibility. A case study with examples of visibility analyses was performed on the river Opava, near the city of Ostrava (Czech Republic). The multiple Boolean viewshed analysis and the global horizon viewshed were calculated to determine the most prominent features and visibility barriers of the surface. Besides that, an extended viewshed showing the angle difference above the local horizon, which describes the angular height of the target area above the barrier, is shown. The case study proved that large-scale models are an appropriate data source for visibility analyses at the local level. The discussion summarizes possible future applications and further development directions of visibility analyses.
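The Boolean visibility test at the core of such viewshed analyses is a line-of-sight check between two cells of a digital surface model. A minimal sketch (the ray-sampling scheme, observer height, and toy terrain are illustrative, not the article's GIS implementation):

```python
import numpy as np

def line_of_sight(dem, observer, target, obs_h=1.7):
    """Boolean visibility between two DEM cells: sample the terrain
    along the ray and test whether any sample rises above the straight
    sight line (the building block of a Boolean viewshed)."""
    (r0, c0), (r1, c1) = observer, target
    z0 = dem[r0, c0] + obs_h                 # eye height above ground
    z1 = dem[r1, c1]
    t = np.linspace(0.0, 1.0, 100)[1:-1]     # samples along the ray
    rows = np.round(r0 + t * (r1 - r0)).astype(int)
    cols = np.round(c0 + t * (c1 - c0)).astype(int)
    terrain = dem[rows, cols]
    sight = z0 + t * (z1 - z0)               # elevation of the sight line
    return bool(np.all(terrain < sight))

dem = np.zeros((50, 50))                     # flat 50x50 toy surface
dem[25, :] = 30.0                            # a 30 m ridge across the middle
across_ridge = line_of_sight(dem, (5, 10), (45, 10))  # ridge blocks the ray
same_side = line_of_sight(dem, (5, 10), (20, 40))     # nothing in the way
```

A full Boolean viewshed repeats this test from one observer to every cell; the extended viewsheds in the article additionally record quantities such as the angle above the local horizon rather than only the yes/no outcome.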

  16. Large-scale modeling of rain fields from a rain cell deterministic model

    Science.gov (United States)

Féral, Laurent; Sauvageot, Henri; Castanet, Laurent; Lemorton, Joël; Cornet, Frédéric; Leconte, Katia

    2006-04-01

    A methodology to simulate two-dimensional rain rate fields at large scale (1000 × 1000 km2, the scale of a satellite telecommunication beam or a terrestrial fixed broadband wireless access network) is proposed. It relies on a rain rate field cellular decomposition. At small scale (˜20 × 20 km2), the rain field is split up into its macroscopic components, the rain cells, described by the Hybrid Cell (HYCELL) cellular model. At midscale (˜150 × 150 km2), the rain field results from the conglomeration of rain cells modeled by HYCELL. To account for the rain cell spatial distribution at midscale, the latter is modeled by a doubly aggregative isotropic random walk, the optimal parameterization of which is derived from radar observations at midscale. The extension of the simulation area from the midscale to the large scale (1000 × 1000 km2) requires the modeling of the weather frontal area. The latter is first modeled by a Gaussian field with anisotropic covariance function. The Gaussian field is then turned into a binary field, giving the large-scale locations over which it is raining. This transformation requires the definition of the rain occupation rate over large-scale areas. Its probability distribution is determined from observations by the French operational radar network ARAMIS. The coupling with the rain field modeling at midscale is immediate whenever the large-scale field is split up into midscale subareas. The rain field thus generated accounts for the local CDF at each point, defining a structure spatially correlated at small scale, midscale, and large scale. It is then suggested that this approach be used by system designers to evaluate diversity gain, terrestrial path attenuation, or slant path attenuation for different azimuth and elevation angle directions.
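The large-scale step described here — a correlated Gaussian field turned into a binary raining/not-raining mask with a prescribed occupation rate — can be sketched with Fourier filtering and a quantile threshold. This isotropic toy stands in for the paper's anisotropic covariance model, and the grid size, correlation length, and occupation rate are illustrative:

```python
import numpy as np

def binary_rain_mask(n, corr_len, occupation_rate, seed=0):
    """Sketch of the large-scale step: build a correlated Gaussian field
    by Fourier-filtering white noise (isotropic here; the paper uses an
    anisotropic covariance), then threshold it so that exactly
    `occupation_rate` of the area is flagged as raining."""
    rng = np.random.default_rng(seed)
    white = rng.normal(size=(n, n))
    kx = np.fft.fftfreq(n)[:, None]
    ky = np.fft.fftfreq(n)[None, :]
    # Gaussian spectral filter: sets the field's spatial correlation length
    filt = np.exp(-(kx ** 2 + ky ** 2) * corr_len ** 2)
    field = np.fft.ifft2(np.fft.fft2(white) * filt).real
    # Threshold at the (1 - occupation_rate) quantile of the field
    thresh = np.quantile(field, 1.0 - occupation_rate)
    return field > thresh

mask = binary_rain_mask(256, corr_len=20, occupation_rate=0.10)
# mask.mean() is ~0.10 by construction; True cells mark raining areas
# into which midscale HYCELL rain cells would then be placed.
```

Thresholding at a quantile guarantees the target occupation rate for the simulated frontal area; in the paper that rate is drawn from the distribution estimated with the ARAMIS radar network.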

  17. Facile Large-scale synthesis of stable CuO nanoparticles

    Science.gov (United States)

    Nazari, P.; Abdollahi-Nejand, B.; Eskandari, M.; Kohnehpoushi, S.

    2018-04-01

In this work, a novel approach to synthesizing CuO nanoparticles was introduced. Sequential corrosion and detaching was proposed for the growth and dispersion of CuO nanoparticles at the optimum pH value of eight. The produced CuO nanoparticles were six nm (±2 nm) in diameter and spherical, with high crystallinity and uniformity in size. In this method, large-scale production of CuO nanoparticles (120 grams in an experimental batch) from Cu micro-particles was achieved, which may meet the market criteria for large-scale production of CuO nanoparticles.

  18. Large-Scale Cooperative Task Distribution on Peer-to-Peer Networks

    Science.gov (United States)

    2012-01-01

... The disadvantages of ML-Chord are its fixed size (two layers) and limited scalability for large-scale systems. RC-Chord extends ML-Chord (D. Karrels et al.) ... configurable before runtime. This can be improved by incorporating a distributed learning algorithm to tune the number and range of the DLoE tracking

  19. Comparative Analysis of Different Protocols to Manage Large Scale Networks

    OpenAIRE

    Anil Rao Pimplapure; Dr Jayant Dubey; Prashant Sen

    2013-01-01

In recent years, the number, complexity and size of large-scale networks have increased. The best example of a large-scale network is the Internet, and more recent ones are data centers in cloud environments. Accordingly, the management tasks involved, such as traffic monitoring, security and performance optimization, are a big task for the network administrator. This research study reports on different protocols, i.e. conventional protocols like the Simple Network Management Protocol and newer Gossip-bas...

  20. A single-question screen for rapid eye movement sleep behavior disorder

    DEFF Research Database (Denmark)

    Postuma, Ronald B; Arnulf, Isabelle; Hogl, Birgit

    2012-01-01

Idiopathic rapid eye movement (REM) sleep behavior disorder (RBD) is a parasomnia that is an important risk factor for Parkinson's disease (PD) and Lewy body dementia. Its prevalence is unknown. One barrier to determining prevalence is that current screening tools are too long for large-scale epidemiologic surveys. Therefore, we designed the REM Sleep Behavior Disorder Single-Question Screen (RBD1Q), a screening question for dream enactment with a simple yes/no response.

  1. Puzzles of large scale structure and gravitation

    International Nuclear Information System (INIS)

    Sidharth, B.G.

    2006-01-01

We consider the puzzle of cosmic voids bounded by two-dimensional structures of galactic clusters, as well as a puzzle pointed out by Weinberg: How can the mass of a typical elementary particle depend on a cosmic parameter like the Hubble constant? An answer to the first puzzle is proposed in terms of 'scaled' quantum-mechanical-like behaviour which appears at large scales. The second puzzle can be answered by showing that the gravitational mass of an elementary particle has a Machian character (see Ahmed N. Cantorian small world, Mach's principle and the universal mass network. Chaos, Solitons and Fractals 2004;21(4))

  2. Personalized Opportunistic Computing for CMS at Large Scale

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    **Douglas Thain** is an Associate Professor of Computer Science and Engineering at the University of Notre Dame, where he designs large scale distributed computing systems to power the needs of advanced science and...

  3. Stability of large scale interconnected dynamical systems

    International Nuclear Information System (INIS)

    Akpan, E.P.

    1993-07-01

    Large scale systems modelled by a system of ordinary differential equations are considered and necessary and sufficient conditions are obtained for the uniform asymptotic connective stability of the systems using the method of cone-valued Lyapunov functions. It is shown that this model significantly improves the existing models. (author). 9 refs

  4. Hepatitis C screening trends in a large integrated health system.

    Science.gov (United States)

    Linas, Benjamin P; Hu, Haihong; Barter, Devra M; Horberg, Michael

    2014-05-01

As new hepatitis C virus (HCV) therapies emerge, only 1%-12% of individuals in the US are screened for HCV infection. Presently, HCV screening trends are unknown. We utilized the Kaiser Permanente Mid-Atlantic States' (KPMAS) data repository to investigate HCV antibody screening between January 1, 2003 and December 31, 2012. We identified the proportion screened for HCV and the 5-year cumulative incidence of screening, the screening positivity rate, the provider types performing HCV screening, patient-level factors associated with being screened, and trends in screening over time. There were 444,594 patients who met the inclusion criteria. Overall, 15.8% of the cohort was ever screened for HCV. Adult primary care and obstetrics and gynecology providers performed 75.9% of all screening. The overall test positivity rate was 3.8%. Screening was more frequent in younger age groups (P <.0001) and in those with a documented history of illicit drug use (P <.0001). Patients with missing drug use history (46.7%) were least likely to be screened (P <.0001). While the rate of HCV screening increased in the later years of the study among those enrolled in KPMAS 2009-2012, only 11.8% were screened by the end of follow-up. Screening for HCV is increasing but remains incomplete. Targeting screening to those with a history of injection drug use will not likely expand screening, as nearly half of patients have no documented drug use history. Routine screening is likely the most effective approach to expanding HCV screening. Copyright © 2014. Published by Elsevier Inc.

  5. Large scale cross hole testing

    International Nuclear Information System (INIS)

    Ball, J.K.; Black, J.H.; Doe, T.

    1991-05-01

As part of the Site Characterisation and Validation programme, the results of the large-scale cross-hole testing have been used to document hydraulic connections across the SCV block, to test conceptual models of fracture zones, and to obtain properties of the major hydrogeological features. The SCV block is highly heterogeneous. This heterogeneity is not smoothed out even over scales of hundreds of meters. Results of the interpretation validate the hypothesis of the major fracture zones, A, B and H; not much evidence of minor fracture zones is found. The uncertainty in the flow path through the fractured rock causes severe problems in interpretation. Derived values of hydraulic conductivity were found to lie in a narrow range of two to three orders of magnitude. The test design did not allow fracture zones to be tested individually. This could be improved by testing the high hydraulic conductivity regions specifically. The Piezomac and single-hole equipment worked well. Few, if any, of the tests ran long enough to approach equilibrium. Many observation boreholes showed no response. This could either be because there is no hydraulic connection, or because there is a connection but a response is not seen within the time scale of the pumping test. The fractional dimension analysis yielded credible results, and the sinusoidal testing procedure provided an effective means of identifying the dominant hydraulic connections. (10 refs.) (au)

  6. Infant outcomes among women with Zika virus infection during pregnancy: results of a large prenatal Zika screening program.

    Science.gov (United States)

    Adhikari, Emily H; Nelson, David B; Johnson, Kathryn A; Jacobs, Sara; Rogers, Vanessa L; Roberts, Scott W; Sexton, Taylor; McIntire, Donald D; Casey, Brian M

    2017-03-01

Zika virus infection during pregnancy is a known cause of congenital microcephaly and other neurologic morbidities. We present the results of a large-scale prenatal screening program in place at a single-center health care system since March 14, 2016. Our aims were to report the baseline prevalence of travel-associated Zika infection in our pregnant population, determine travel characteristics of women with evidence of Zika infection, and evaluate maternal and neonatal outcomes compared to women without evidence of Zika infection. This is a prospective, observational study of prenatal Zika virus screening in our health care system. We screened all pregnant women for recent travel to a Zika-affected area, and serum was tested for those considered at risk for infection. We compared maternal demographic and travel characteristics and perinatal outcomes among women with positive and negative Zika virus tests during pregnancy. Comprehensive neurologic evaluation was performed on all infants delivered of women with evidence of possible Zika virus infection during pregnancy. Head circumference percentiles by gestational age were compared for infants delivered of women with positive and negative Zika virus test results. From March 14 through Oct. 1, 2016, a total of 14,161 pregnant women were screened for travel to a Zika-affected country. A total of 610 (4.3%) women reported travel, and test results were available in 547. Of these, evidence of possible Zika virus infection was found in 29 (5.3%). In our population, the prevalence of asymptomatic or symptomatic Zika virus infection among pregnant women was 2/1000. Women with evidence of Zika virus infection were more likely to have traveled from Central or South America (97% vs 12%) than women without evidence of Zika virus infection. Additionally, there was no difference in mean head circumference of infants born to women with positive vs negative Zika virus testing. No microcephalic infants born to women with Zika infection were identified.

  7. Large transverse momentum processes in a non-scaling parton model

    International Nuclear Information System (INIS)

    Stirling, W.J.

    1977-01-01

    The production of large transverse momentum mesons in hadronic collisions by the quark fusion mechanism is discussed in a parton model which gives logarithmic corrections to Bjorken scaling. It is found that the moments of the large transverse momentum structure function exhibit a simple scale breaking behaviour similar to the behaviour of the Drell-Yan and deep inelastic structure functions of the model. An estimate of corresponding experimental consequences is made and the extent to which analogous results can be expected in an asymptotically free gauge theory is discussed. A simple set of rules is presented for incorporating the logarithmic corrections to scaling into all covariant parton model calculations. (Auth.)

  8. Bivariate pointing movements on large touch screens: investigating the validity of a refined Fitts' Law.

    Science.gov (United States)

    Bützler, Jennifer; Vetter, Sebastian; Jochems, Nicole; Schlick, Christopher M

    2012-01-01

On the basis of three empirical studies, Fitts' Law was refined for bivariate pointing tasks on large touch screens. The first study investigated different target width parameters. The second study considered the effect of the motion angle. Based on the results of the two studies, a refined model for movement time in human-computer interaction was formulated. A third study, which is described here in detail, concerns the validation of the refined model. For the validation study, 20 subjects had to execute a bivariate pointing task on a large touch screen. In the experimental task, 250 rectangular target objects were displayed at randomly chosen positions on the screen, covering a broad range of ID values (ID = 1.01 to 4.88). Compared to existing refinements of Fitts' Law, the new model shows the highest predictive validity. A promising field of application for the model is the ergonomic design and evaluation of project management software. By using the refined model, software designers can calculate a priori the appropriate angular position and size of buttons, menus or icons.
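The abstract does not reproduce the refined model's equation, so the sketch below uses the standard Shannon formulation, ID = log2(D/W + 1), with a least-squares fit of MT = a + b·ID; the refined bivariate model adds target-height and motion-angle terms beyond this. All trial data are hypothetical:

```python
import numpy as np

def fitts_id(distance, width):
    """Index of difficulty (bits), Shannon formulation."""
    return np.log2(distance / width + 1.0)

# Hypothetical pointing trials: movement distance D (px), target width
# W (px), and measured movement time MT (ms).
D = np.array([200, 400, 800, 400, 800, 1600], dtype=float)
W = np.array([100, 100, 100, 25, 25, 25], dtype=float)
MT = np.array([420, 510, 600, 680, 780, 890], dtype=float)

ID = fitts_id(D, W)
b, a = np.polyfit(ID, MT, 1)                # least-squares fit of MT = a + b*ID
predicted = a + b * fitts_id(600.0, 50.0)   # predict an unseen condition
```

This fitted form is how such a model supports a priori design decisions: given a candidate button size and position, the designer evaluates the predicted movement time before building the interface.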

  9. On the Renormalization of the Effective Field Theory of Large Scale Structures

    OpenAIRE

    Pajer, Enrico; Zaldarriaga, Matias

    2013-01-01

    Standard perturbation theory (SPT) for large-scale matter inhomogeneities is unsatisfactory for at least three reasons: there is no clear expansion parameter since the density contrast is not small on all scales; it does not fully account for deviations at large scales from a perfect pressureless fluid induced by short-scale non-linearities; for generic initial conditions, loop corrections are UV-divergent, making predictions cutoff dependent and hence unphysical. The Effective Field Theory o...

  10. Quantitative Missense Variant Effect Prediction Using Large-Scale Mutagenesis Data.

    Science.gov (United States)

    Gray, Vanessa E; Hause, Ronald J; Luebeck, Jens; Shendure, Jay; Fowler, Douglas M

    2018-01-24

    Large datasets describing the quantitative effects of mutations on protein function are becoming increasingly available. Here, we leverage these datasets to develop Envision, which predicts the magnitude of a missense variant's molecular effect. Envision combines 21,026 variant effect measurements from nine large-scale experimental mutagenesis datasets, a hitherto untapped training resource, with a supervised, stochastic gradient boosting learning algorithm. Envision outperforms other missense variant effect predictors both on large-scale mutagenesis data and on an independent test dataset comprising 2,312 TP53 variants whose effects were measured using a low-throughput approach. This dataset was never used for hyperparameter tuning or model training and thus serves as an independent validation set. Envision prediction accuracy is also more consistent across amino acids than other predictors. Finally, we demonstrate that Envision's performance improves as more large-scale mutagenesis data are incorporated. We precompute Envision predictions for every possible single amino acid variant in human, mouse, frog, zebrafish, fruit fly, worm, and yeast proteomes (https://envision.gs.washington.edu/). Copyright © 2017 Elsevier Inc. All rights reserved.
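Envision's learning step is stochastic gradient boosting over a large feature-annotated variant table; the sketch below illustrates the principle with a deliberately minimal boosting loop over depth-1 regression trees (stumps) and squared error. The features, data, and hyperparameters are synthetic stand-ins, not Envision's actual feature set or trained model:

```python
import numpy as np

def boost_stumps(X, y, n_rounds=50, lr=0.1):
    """Minimal gradient-boosting sketch: each round fits a decision stump
    to the current residuals (the negative gradient of squared error)
    and adds a shrunken copy of it to the ensemble prediction."""
    pred = np.full(len(y), y.mean())
    for _ in range(n_rounds):
        resid = y - pred                      # residuals to fit this round
        best = None
        for j in range(X.shape[1]):           # scan features
            for t in np.quantile(X[:, j], [0.25, 0.5, 0.75]):  # candidate splits
                left = X[:, j] <= t
                vl, vr = resid[left].mean(), resid[~left].mean()
                err = ((resid - np.where(left, vl, vr)) ** 2).sum()
                if best is None or err < best[0]:
                    best = (err, j, t, vl, vr)
        _, j, t, vl, vr = best
        pred = pred + lr * np.where(X[:, j] <= t, vl, vr)   # shrunken update
    return pred

# Synthetic stand-in for mutagenesis training data: five features per
# variant (e.g. conservation, physicochemical change) and a measured score.
rng = np.random.default_rng(2)
X = rng.normal(size=(300, 5))
y = 2.0 * X[:, 0] + np.sin(X[:, 1]) + 0.1 * rng.normal(size=300)
fitted = boost_stumps(X, y)
r2 = 1.0 - ((y - fitted) ** 2).sum() / ((y - y.mean()) ** 2).sum()
```

A production predictor like Envision uses deeper trees, subsampling, and careful held-out validation (as with the TP53 set described above), but the additive fit-to-residuals loop is the same core idea.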

  11. Three decades of TBT contamination in sediments around a large scale shipyard.

    Science.gov (United States)

    Kim, Nam Sook; Shim, Won Joon; Yim, Un Hyuk; Ha, Sung Yong; An, Joon Geon; Shin, Kyung Hoon

    2011-08-30

Tributyltin (TBT) contamination in sediments was investigated in the vicinity of a large-scale shipyard in the years after the implementation of a total ban on the use of TBT-based antifouling paints in Korea. An extremely high level of TBT (36,292 ng Sn/g) in surface sediment was found at a station in front of a drydock and near the surface runoff outfall of the shipyard. TBT concentrations in surface sediments of Gohyeon Bay, where the shipyard is located, showed an apparent decreasing gradient from the shipyard towards the outer bay. The vertical distribution of TBT contamination derived from a sediment core analysis demonstrated a significant positive correlation (r² = 0.88). TBT concentrations at six stations surveyed before (2003) and seven years after (2010) the total ban showed no significant differences (p > 0.05). Despite the ban on the use of TBT, including on ocean-going vessels, surface sediments are still being heavily contaminated with TBT, and its levels well exceed the sediment quality guideline or screening values. Copyright © 2011 Elsevier B.V. All rights reserved.

  12. Methods for Large-Scale Nonlinear Optimization.

    Science.gov (United States)

    1980-05-01

STANFORD, CALIFORNIA 94305. METHODS FOR LARGE-SCALE NONLINEAR OPTIMIZATION, by Philip E. Gill, Walter Murray, Michael A. Saunders, and Margaret H. Wright. ... A typical iteration can be partitioned so that ... where B is an m × m basis matrix. This partition effectively divides the variables into three classes. ... attention is given to the standard of the coding or the documentation. A much better way of obtaining mathematical software is from a software library.

  13. Generation of large-scale vorticity in rotating stratified turbulence with inhomogeneous helicity: mean-field theory

    Science.gov (United States)

    Kleeorin, N.

    2018-06-01

We discuss a mean-field theory of the generation of large-scale vorticity in rotating, density-stratified developed turbulence with inhomogeneous kinetic helicity. We show that the large-scale non-uniform flow is produced due to either a combined action of density-stratified rotating turbulence and uniform kinetic helicity or a combined effect of rotating incompressible turbulence and inhomogeneous kinetic helicity. These effects result in the formation of a large-scale shear, and in turn its interaction with the small-scale turbulence causes an excitation of a large-scale instability (known as a vorticity dynamo) due to a combined effect of the large-scale shear and Reynolds stress-induced generation of the mean vorticity. The latter is due to the effect of the large-scale shear on the Reynolds stress. A fast rotation suppresses this large-scale instability.

  14. Recent Advances in Understanding Large Scale Vapour Explosions

    International Nuclear Information System (INIS)

    Board, S.J.; Hall, R.W.

    1976-01-01

    In foundries, violent explosions occur occasionally when molten metal comes into contact with water. If similar explosions can occur with other materials, hazardous situations may arise, for example in LNG marine transportation accidents, or in liquid-cooled reactor incidents when molten UO2 contacts water or sodium coolant. Over the last 10 years a large body of experimental data has been obtained on the behaviour of small quantities of hot material in contact with a vaporisable coolant. Such experiments generally give low energy yields, despite producing fine fragmentation of the molten material. These events have been interpreted in terms of a wide range of phenomena such as violent boiling, liquid entrainment, bubble collapse, superheat, surface cracking and many others. Many of these studies have been aimed at understanding the small-scale behaviour of the particular materials of interest. However, understanding the nature of the energetic events which were the original cause for concern may also be necessary to give confidence that violent events cannot occur for these materials in large-scale situations. More recently, there has been a trend towards larger experiments and some of these have produced explosions of moderately high efficiency. Although the occurrence of such large-scale explosions can depend rather critically on initial conditions in a way which is not fully understood, there are signs that the interpretation of these events may be more straightforward than that of the single-drop experiments. In the last two years several theoretical models for large-scale explosions have appeared which attempt a self-contained explanation of at least some stages of such high-yield events: these have as their common feature a description of how a propagating breakdown of an initially quasi-stable distribution of materials is induced by the pressure and flow field caused by the energy release in adjacent regions. These models have led to the idea that for a full

  15. Robust large-scale parallel nonlinear solvers for simulations.

    Energy Technology Data Exchange (ETDEWEB)

    Bader, Brett William; Pawlowski, Roger Patrick; Kolda, Tamara Gibson (Sandia National Laboratories, Livermore, CA)

    2005-11-01

    This report documents research to develop robust and efficient solution techniques for solving large-scale systems of nonlinear equations. The most widely used method for solving systems of nonlinear equations is Newton's method. While much research has been devoted to augmenting Newton-based solvers (usually with globalization techniques), little has been devoted to exploring the application of different models. Our research has been directed at evaluating techniques using different models than Newton's method: a lower order model, Broyden's method, and a higher order model, the tensor method. We have developed large-scale versions of each of these models and have demonstrated their use in important applications at Sandia. Broyden's method replaces the Jacobian with an approximation, allowing codes that cannot evaluate a Jacobian or have an inaccurate Jacobian to converge to a solution. Limited-memory methods, which have been successful in optimization, allow us to extend this approach to large-scale problems. We compare the robustness and efficiency of Newton's method, modified Newton's method, Jacobian-free Newton-Krylov method, and our limited-memory Broyden method. Comparisons are carried out for large-scale applications of fluid flow simulations and electronic circuit simulations. Results show that, in cases where the Jacobian was inaccurate or could not be computed, Broyden's method converged in some cases where Newton's method failed to converge. We identify conditions where Broyden's method can be more efficient than Newton's method. We also present modifications to a large-scale tensor method, originally proposed by Bouaricha, for greater efficiency, better robustness, and wider applicability. Tensor methods are an alternative to Newton-based methods and are based on computing a step based on a local quadratic model rather than a linear model. The advantage of Bouaricha's method is that it can use any
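    The Jacobian-replacement idea in Broyden's method, described above, can be sketched concisely: build one initial Jacobian approximation, then refresh it with rank-one secant updates instead of re-evaluating derivatives. This is a minimal illustrative version of the classical "good" Broyden update, not the limited-memory variant developed in the report:

    ```python
    import numpy as np

    def broyden(f, x0, tol=1e-10, max_iter=50):
        """Solve f(x) = 0 with Broyden's 'good' method (illustrative sketch)."""
        x = np.asarray(x0, dtype=float)
        n = x.size
        fx = f(x)
        # Initial Jacobian approximation by forward differences.
        J = np.empty((n, n))
        h = 1e-7
        for j in range(n):
            e = np.zeros(n)
            e[j] = h
            J[:, j] = (f(x + e) - fx) / h
        for _ in range(max_iter):
            dx = np.linalg.solve(J, -fx)   # quasi-Newton step
            x_new = x + dx
            fx_new = f(x_new)
            if np.linalg.norm(fx_new) < tol:
                return x_new
            df = fx_new - fx
            # Rank-one update enforcing the secant condition J @ dx = df.
            J += np.outer(df - J @ dx, dx) / (dx @ dx)
            x, fx = x_new, fx_new
        return x

    # Example system: circle x^2 + y^2 = 2 intersected with the line x = y.
    root = broyden(lambda v: np.array([v[0]**2 + v[1]**2 - 2.0, v[0] - v[1]]),
                   x0=[2.0, 0.5])
    # root is close to [1.0, 1.0]
    ```

    Because each iteration costs one function evaluation and no Jacobian evaluation, the method suits codes that, as the report notes, cannot evaluate a Jacobian or can only produce an inaccurate one.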

  16. Large Scale GW Calculations on the Cori System

    Science.gov (United States)

    Deslippe, Jack; Del Ben, Mauro; da Jornada, Felipe; Canning, Andrew; Louie, Steven

    The NERSC Cori system, powered by 9000+ Intel Xeon-Phi processors, represents one of the largest HPC systems for open-science in the United States and the world. We discuss the optimization of the GW methodology for this system, including both node level and system-scale optimizations. We highlight multiple large scale (thousands of atoms) case studies and discuss both absolute application performance and comparison to calculations on more traditional HPC architectures. We find that the GW method is particularly well suited for many-core architectures due to the ability to exploit a large amount of parallelism across many layers of the system. This work was supported by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, Materials Sciences and Engineering Division, as part of the Computational Materials Sciences Program.

  17. Cosmic ray acceleration by large scale galactic shocks

    International Nuclear Information System (INIS)

    Cesarsky, C.J.; Lagage, P.O.

    1987-01-01

    The mechanism of diffusive shock acceleration may account for the existence of galactic cosmic rays; detailed applications to stellar wind shocks and especially to supernova shocks have been developed. Existing models can usually deal with the energetics or the spectral slope, but the observed energy range of cosmic rays is not explained. It therefore seems worthwhile to examine the effect that large-scale, long-lived galactic shocks may have on galactic cosmic rays, in the frame of the diffusive shock acceleration mechanism. Large-scale fast shocks can only be expected to exist in the galactic halo. We consider three situations where they may arise: expansion of a supernova shock into the halo, a galactic wind, and galactic infall; and we discuss the possible existence of these shocks and their role in accelerating cosmic rays
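    The "spectral slope" referred to above is the textbook test-particle result for diffusive shock acceleration (standard result, not derived in this record): the power-law index depends only on the shock compression ratio,

    ```latex
    % Test-particle diffusive shock acceleration (standard result)
    f(p) \propto p^{-q}, \qquad q = \frac{3r}{r-1},
    \qquad r = \frac{u_1}{u_2} ,
    ```

    so a strong adiabatic shock with r = 4 gives q = 4, i.e. an energy spectrum N(E) proportional to E^{-2}, close to the inferred cosmic-ray source spectrum.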

  18. Lagrangian space consistency relation for large scale structure

    International Nuclear Information System (INIS)

    Horn, Bart; Hui, Lam; Xiao, Xiao

    2015-01-01

    Consistency relations, which relate the squeezed limit of an (N+1)-point correlation function to an N-point function, are non-perturbative symmetry statements that hold even if the associated high momentum modes are deep in the nonlinear regime and astrophysically complex. Recently, Kehagias and Riotto and Peloso and Pietroni discovered a consistency relation applicable to large scale structure. We show that this can be recast into a simple physical statement in Lagrangian space: that the squeezed correlation function (suitably normalized) vanishes. This holds regardless of whether the correlation observables are at the same time or not, and regardless of whether multiple-streaming is present. The simplicity of this statement suggests that an analytic understanding of large scale structure in the nonlinear regime may be particularly promising in Lagrangian space
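    For reference, the Eulerian form of the consistency relation discussed above can be written, in standard conventions (normalization and notation assumed, not quoted from this record), as

    ```latex
    % Squeezed-limit consistency relation (standard conventions assumed)
    \lim_{q\to 0}
    \frac{\langle \delta(\mathbf{q},\eta)\,
          \delta(\mathbf{k}_1,\eta_1)\cdots\delta(\mathbf{k}_N,\eta_N)\rangle'}
         {P(q,\eta)}
    = -\sum_{i=1}^{N} \frac{D(\eta_i)}{D(\eta)}\,
      \frac{\mathbf{k}_i\cdot \mathbf{q}}{q^{2}}\,
      \langle \delta(\mathbf{k}_1,\eta_1)\cdots\delta(\mathbf{k}_N,\eta_N)\rangle' ,
    ```

    where D is the linear growth factor. At equal times (all eta_i equal), momentum conservation forces the sum of the k_i to vanish, so the right-hand side is zero; this is the vanishing statement that the record recasts in Lagrangian space.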

  19. Electron drift in a large scale solid xenon

    International Nuclear Information System (INIS)

    Yoo, J.; Jaskierny, W.F.

    2015-01-01

    A study of charge drift in large-scale, optically transparent solid xenon is reported. A pulsed high-power xenon light source is used to liberate electrons from a photocathode. The drift speeds of the electrons are measured using an 8.7 cm long electrode in both the liquid and solid phases of xenon. In the liquid phase (163 K), the drift speed is 0.193 ± 0.003 cm/μs, while in the solid phase (157 K) it is 0.397 ± 0.006 cm/μs, at 900 V/cm over 8.0 cm of uniform electric field. It is therefore demonstrated that the electron drift speed in large-scale solid xenon is a factor of two faster than in the liquid phase.
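    The factor-of-two claim follows directly from the quoted speeds; a quick check of the drift times over the 8.0 cm uniform-field region (numbers taken from the record):

    ```python
    # Drift speeds quoted in the record (cm/us) and drift length (cm).
    v_liquid = 0.193    # liquid xenon, 163 K, at 900 V/cm
    v_solid = 0.397     # solid xenon, 157 K, at 900 V/cm
    drift_length = 8.0  # length of uniform-field region

    t_liquid = drift_length / v_liquid  # drift time in liquid, microseconds
    t_solid = drift_length / v_solid    # drift time in solid, microseconds
    ratio = v_solid / v_liquid          # speed-up of solid over liquid

    print(f"liquid: {t_liquid:.1f} us, solid: {t_solid:.1f} us, ratio: {ratio:.2f}")
    # -> liquid: 41.5 us, solid: 20.2 us, ratio: 2.06
    ```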

  20. Wind and Photovoltaic Large-Scale Regional Models for hourly production evaluation

    DEFF Research Database (Denmark)

    Marinelli, Mattia; Maule, Petr; Hahmann, Andrea N.

    2015-01-01

    This work presents two large-scale regional models used for the evaluation of normalized power output from wind turbines and photovoltaic power plants on a European regional scale. The models give an estimate of renewable production on a regional scale with 1 h resolution, starting from a mesoscale ... of the transmission system, especially regarding the cross-border power flows. The tuning of these regional models is done using historical meteorological data acquired on a per-country basis and using publicly available data of installed capacity.
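    The normalization idea can be sketched in a few lines: hourly regional production is divided by the per-country installed capacity, giving a dimensionless output in [0, 1]. All figures below are hypothetical placeholders, not data from the paper, whose models are driven by mesoscale meteorology:

    ```python
    # Hypothetical hourly production (MW) and installed capacity (MW);
    # values are illustrative only, not taken from the paper.
    installed_capacity = {"DK": 5_000.0, "DE": 55_000.0}
    production_mw = {
        "DK": [2_100.0, 1_800.0, 2_400.0],      # three consecutive hours
        "DE": [20_000.0, 26_000.0, 31_000.0],
    }

    def normalize(production, capacity):
        """Normalized power output per region: production / installed capacity."""
        return {region: [p / capacity[region] for p in series]
                for region, series in production.items()}

    normalized = normalize(production_mw, installed_capacity)
    # e.g. normalized["DK"][0] is 2100 / 5000 = 0.42
    ```

    Working with normalized output lets regions with very different installed capacities be compared and tuned against the publicly available capacity data mentioned in the abstract.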