WorldWideScience

Sample records for computational biology research

  1. Computational biology

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Computation via biological devices has been the subject of close scrutiny since von Neumann’s early work some 60 years ago. In spite of the many relevant works in this field, the notion of programming biological devices seems to be, at best, ill-defined. While many devices are claimed or proved t...

  2. Computer Literacy for Life Sciences: Helping the Digital-Era Biology Undergraduates Face Today's Research

    Science.gov (United States)

    Smolinski, Tomasz G.

    2010-01-01

    Computer literacy plays a critical role in today's life sciences research. Without the ability to use computers to efficiently manipulate and analyze large amounts of data resulting from biological experiments and simulations, many of the pressing questions in the life sciences could not be answered. Today's undergraduates, despite the ubiquity of…

  3. ISCB Ebola Award for Important Future Research on the Computational Biology of Ebola Virus

    OpenAIRE

    Karp, P.D.; Berger, B.; Kovats, D.; Lengauer, T.; Linial, M.; Sabeti, P.; Hide, W.; Rost, B.

    2015-01-01

    Speed is of the essence in combating Ebola; thus, computational approaches should form a significant component of Ebola research. As for the development of any modern drug, computational biology is uniquely positioned to contribute through comparative analysis of the genome sequences of Ebola strains as well as 3-D protein modeling. Other computational approaches to Ebola may include large-scale docking studies of Ebola proteins with human proteins and with small-molecule libraries, computati...

  4. ISCB Ebola Award for Important Future Research on the Computational Biology of Ebola Virus.

    Directory of Open Access Journals (Sweden)

    Peter D. Karp

    2015-01-01

    Speed is of the essence in combating Ebola; thus, computational approaches should form a significant component of Ebola research. As for the development of any modern drug, computational biology is uniquely positioned to contribute through comparative analysis of the genome sequences of Ebola strains as well as 3-D protein modeling. Other computational approaches to Ebola may include large-scale docking studies of Ebola proteins with human proteins and with small-molecule libraries, computational modeling of the spread of the virus, computational mining of the Ebola literature, and creation of a curated Ebola database. Taken together, such computational efforts could significantly accelerate traditional scientific approaches. In recognition of the need for important and immediate solutions from the field of computational biology against Ebola, the International Society for Computational Biology (ISCB) announces a prize for an important computational advance in fighting the Ebola virus. ISCB will confer the ISCB Fight against Ebola Award, along with a prize of US$2,000, at its July 2016 annual meeting (ISCB Intelligent Systems for Molecular Biology [ISMB] 2016, Orlando, Florida).

  5. Large Scale Computing and Storage Requirements for Biological and Environmental Research

    Energy Technology Data Exchange (ETDEWEB)

    DOE Office of Science, Biological and Environmental Research Program Office (BER),

    2009-09-30

    In May 2009, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR), and DOE's Office of Biological and Environmental Research (BER) held a workshop to characterize HPC requirements for BER-funded research over the subsequent three to five years. The workshop revealed several key points, in addition to achieving its goal of collecting and characterizing computing requirements. Chief among them: scientific progress in BER-funded research is limited by current allocations of computational resources. Additionally, growth in mission-critical computing -- combined with new requirements for collaborative data manipulation and analysis -- will demand ever increasing computing, storage, network, visualization, reliability and service richness from NERSC. This report expands upon these key points and adds others. It also presents a number of "case studies" as significant representative samples of the needs of science teams within BER. Workshop participants were asked to codify their requirements in this "case study" format, summarizing their science goals, methods of solution, current and 3-5 year computing requirements, and special software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, "multi-core" environment that is expected to dominate HPC architectures over the next few years.

  6. Interdisciplinary research and education at the biology-engineering-computer science interface: a perspective.

    Science.gov (United States)

    Tadmor, Brigitta; Tidor, Bruce

    2005-09-01

    Progress in the life sciences, including genome sequencing and high-throughput experimentation, offers an opportunity for understanding biology and medicine from a systems perspective. This 'new view', which complements the more traditional component-based approach, involves the integration of biological research with approaches from engineering disciplines and computer science. The result is more than a new set of technologies. Rather, it promises a fundamental reconceptualization of the life sciences based on the development of quantitative and predictive models to describe crucial processes. To achieve this change, learning communities are being formed at the interface of the life sciences, engineering and computer science. Through these communities, research and education will be integrated across disciplines and the challenges associated with multidisciplinary team-based science will be addressed.

  7. Computational biology for ageing

    Science.gov (United States)

    Wieser, Daniela; Papatheodorou, Irene; Ziehm, Matthias; Thornton, Janet M.

    2011-01-01

    High-throughput genomic and proteomic technologies have generated a wealth of publicly available data on ageing. Easy access to these data, and their computational analysis, is of great importance in order to pinpoint the causes and effects of ageing. Here, we provide a description of the existing databases and computational tools on ageing that are available for researchers. We also describe the computational approaches to data interpretation in the field of ageing including gene expression, comparative and pathway analyses, and highlight the challenges for future developments. We review recent biological insights gained from applying bioinformatics methods to analyse and interpret ageing data in different organisms, tissues and conditions. PMID:21115530

  8. High Performance Computing and Storage Requirements for Biological and Environmental Research Target 2017

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, Richard [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC); Wasserman, Harvey [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC)

    2013-05-01

    The National Energy Research Scientific Computing Center (NERSC) is the primary computing center for the DOE Office of Science, serving approximately 4,500 users working on some 650 projects that involve nearly 600 codes in a wide variety of scientific disciplines. In addition to large-scale computing and storage resources, NERSC provides support and expertise that help scientists make efficient use of its systems. The latest review revealed several key requirements, in addition to achieving its goal of characterizing BER computing and storage needs.

  9. Application of computational intelligence to biology

    CERN Document Server

    Sekhar, Akula

    2016-01-01

    This book presents translational and allied research from the proceedings of the International Conference on Computational Intelligence and Soft Computing. It explains how various computational intelligence techniques can be applied to investigate a range of biological problems, and is a useful read for research scholars, engineers, medical doctors, and bioinformatics researchers.

  10. Biological and Environmental Research Exascale Requirements Review. An Office of Science review sponsored jointly by Advanced Scientific Computing Research and Biological and Environmental Research, March 28-31, 2016, Rockville, Maryland

    Energy Technology Data Exchange (ETDEWEB)

    Arkin, Adam [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bader, David C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Coffey, Richard [Argonne National Lab. (ANL), Argonne, IL (United States); Antypas, Katie [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bard, Deborah [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC); Dart, Eli [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Esnet; Dosanjh, Sudip [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Gerber, Richard [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hack, James [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Monga, Inder [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Esnet; Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States); Riley, Katherine [Argonne National Lab. (ANL), Argonne, IL (United States); Rotman, Lauren [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Esnet; Straatsma, Tjerk [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wells, Jack [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Aluru, Srinivas [Georgia Inst. of Technology, Atlanta, GA (United States); Andersen, Amity [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Aprá, Edoardo [Pacific Northwest National Lab. (PNNL), Richland, WA (United States). EMSL; Azad, Ariful [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bates, Susan [National Center for Atmospheric Research, Boulder, CO (United States); Blaby, Ian [Brookhaven National Lab. (BNL), Upton, NY (United States); Blaby-Haas, Crysten [Brookhaven National Lab. (BNL), Upton, NY (United States); Bonneau, Rich [New York Univ. (NYU), NY (United States); Bowen, Ben [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bradford, Mark A. [Yale Univ., New Haven, CT (United States); Brodie, Eoin [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Brown, James (Ben) [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Buluc, Aydin [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bernholdt, David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bylaska, Eric [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Calvin, Kate [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Cannon, Bill [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Chen, Xingyuan [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Cheng, Xiaolin [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Cheung, Margaret [Univ. of Houston, Houston, TX (United States); Chowdhary, Kenny [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Colella, Phillip [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Collins, Bill [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Compo, Gil [National Oceanic and Atmospheric Administration (NOAA), Boulder, CO (United States); Crowley, Mike [National Renewable Energy Lab. (NREL), Golden, CO (United States); Debusschere, Bert [Sandia National Lab. (SNL-CA), Livermore, CA (United States); D’Imperio, Nicholas [Brookhaven National Lab. 
(BNL), Upton, NY (United States); Dror, Ron [Stanford Univ., Stanford, CA (United States); Egan, Rob [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Evans, Katherine [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Friedberg, Iddo [Iowa State Univ., Ames, IA (United States); Fyke, Jeremy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Gao, Zheng [Stony Brook Univ., Stony Brook, NY (United States); Georganas, Evangelos [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Giraldo, Frank [Naval Postgraduate School, Monterey, CA (United States); Gnanakaran, Gnana [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Govind, Niri [Pacific Northwest National Lab. (PNNL), Richland, WA (United States). EMSL; Grandy, Stuart [Univ. of New Hampshire, Durham, NH (United States); Gustafson, Bill [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hammond, Glenn [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hargrove, William [USDA Forest Service, Washington, D.C. (United States); Heroux, Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hoffman, Forrest [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hofmeyr, Steven [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hunke, Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Jackson, Charles [Univ. of Texas-Austin, Austin, TX (United States); Jacob, Rob [Argonne National Lab. (ANL), Argonne, IL (United States); Jacobson, Dan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Jacobson, Matt [Univ. of California, San Francisco, CA (United States); Jain, Chirag [Georgia Inst. of Technology, Atlanta, GA (United States); Johansen, Hans [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Johnson, Jeff [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Jones, Andy [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Jones, Phil [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Kalyanaraman, Ananth [Washington State Univ., Pullman, WA (United States); Kang, Senghwa [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); King, Eric [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Koanantakool, Penporn [Univ. of California, Berkeley, CA (United States); Kollias, Pavlos [Stony Brook Univ., Stony Brook, NY (United States); Kopera, Michal [Univ. of California, Santa Cruz, CA (United States); Kotamarthi, Rao [Argonne National Lab. (ANL), Argonne, IL (United States); Kowalski, Karol [Pacific Northwest National Lab. (PNNL), Richland, WA (United States). EMSL; Kumar, Jitendra [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kyrpides, Nikos [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Leung, Ruby [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Li, Xiaolin [Stony Brook Univ., Stony Brook, NY (United States); Lin, Wuyin [Brookhaven National Lab. (BNL), Upton, NY (United States); Link, Robert [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Liu, Yangang [Brookhaven National Lab. (BNL), Upton, NY (United States); Loew, Leslie [Univ. of Connecticut, Storrs, CT (United States); Luke, Edward [Brookhaven National Lab. (BNL), Upton, NY (United States); Ma, Hsi -Yen [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mahadevan, Radhakrishnan [Univ. 
of Toronto, Toronto, ON (Canada); Maranas, Costas [Pennsylvania State Univ., University Park, PA (United States); Martin, Daniel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Maslowski, Wieslaw [Naval Postgraduate School, Monterey, CA (United States); McCue, Lee Ann [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McInnes, Lois Curfman [Argonne National Lab. (ANL), Argonne, IL (United States); Mills, Richard [Intel Corp., Santa Clara, CA (United States); Molins Rafa, Sergi [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Morozov, Dmitriy [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Mostafavi, Sara [Center for Molecular Medicine and Therapeutics, Vancouver, BC (Canada); Moulton, David J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Mourao, Zenaida [Univ. of Cambridge (United Kingdom); Najm, Habib [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Ng, Bernard [Center for Molecular Medicine and Therapeutics, Vancouver, BC (Canada); Ng, Esmond [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Norman, Matt [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Oh, Sang -Yun [Univ. of California, Santa Barbara, CA (United States); Oliker, Leonid [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Pan, Chongle [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Pass, Rebecca [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Pau, George S. H. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Petridis, Loukas [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Prakash, Giri [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Price, Stephen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Randall, David [Colorado State Univ., Fort Collins, CO (United States); Renslow, Ryan [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Riihimaki, Laura [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ringler, Todd [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Roberts, Andrew [Naval Postgraduate School, Monterey, CA (United States); Rokhsar, Dan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ruebel, Oliver [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Salinger, Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Scheibe, Tim [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Schulz, Roland [Intel, Mountain View, CA (United States); Sivaraman, Chitra [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Smith, Jeremy [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sreepathi, Sarat [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Steefel, Carl [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Talbot, Jenifer [Boston Univ., Boston, MA (United States); Tantillo, D. J. [Univ. of California, Davis, CA (United States); Tartakovsky, Alex [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Taylor, Mark [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Taylor, Ronald [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Trebotich, David [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Urban, Nathan [Los Alamos National Lab. 
(LANL), Los Alamos, NM (United States); Valiev, Marat [Pacific Northwest National Lab. (PNNL), Richland, WA (United States). EMSL; Wagner, Allon [Univ. of California, Berkeley, CA (United States); Wainwright, Haruko [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wieder, Will [NCAR/Univ. of Colorado, Boulder, CO (United States); Wiley, Steven [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Williams, Dean [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Worley, Pat [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Xie, Shaocheng [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Yelick, Kathy [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Yoo, Shinjae [Brookhaven National Lab. (BNL), Upton, NY (United States); Yosef, Niri [Univ. of California, Berkeley, CA (United States); Zhang, Minghua [Stony Brook Univ., Stony Brook, NY (United States)

    2016-03-31

    Understanding the fundamentals of genomic systems and modeling the processes governing impactful weather patterns are examples of the types of simulation and modeling performed on the most advanced computing resources in America. High-performance computing and computational science together provide a necessary platform for the mission science conducted by the Biological and Environmental Research (BER) office at the U.S. Department of Energy (DOE). This report reviews BER's computing needs and their importance for solving some of the toughest problems in BER's portfolio. BER's impact on science has been transformative. Mapping the human genome, including the U.S.-supported international Human Genome Project that DOE began in 1987, initiated the era of modern biotechnology and genomics-based systems biology. And since the 1950s, BER has been a core contributor to atmospheric, environmental, and climate science research, beginning with atmospheric circulation studies that were the forerunners of modern Earth system models (ESMs) and by pioneering the implementation of climate codes onto high-performance computers. See http://exascaleage.org/ber/ for more information.

  11. Michael Levitt and Computational Biology

    Science.gov (United States)

    Synopsis: resources on Michael Levitt, PhD, professor of structural biology at the Stanford University School of Medicine ... Levitt's early work pioneered computational structural biology, which helped to predict ...

  12. Computational Systems Chemical Biology

    OpenAIRE

    Oprea, Tudor I.; May, Elebeoba E.; Leitão, Andrei; Tropsha, Alexander

    2011-01-01

    There is a critical need for improving the level of chemistry awareness in systems biology. The data and information related to modulation of genes and proteins by small molecules continue to accumulate at the same time as simulation tools in systems biology and whole-body physiologically based pharmacokinetics (PBPK) continue to evolve. We called this emerging area at the interface between chemical biology and systems biology "systems chemical biology" (SCB) (Oprea et al., 2007).

  13. Space biology research development

    Science.gov (United States)

    Bonting, Sjoerd L.

    1993-01-01

    The purpose of the Search for Extraterrestrial Intelligence (SETI) Institute is to conduct and promote research-related activities regarding the search for extraterrestrial life, particularly intelligent life. Such research encompasses the broad discipline of 'Life in the Universe', including all scientific and technological aspects of astronomy and the planetary sciences, chemical evolution, the origin of life, biological evolution, and cultural evolution. The primary purpose was to provide funding for the Principal Investigator to collaborate with the personnel of the SETI Institute and the NASA-Ames Research Center in order to plan and develop space biology research on and in connection with Space Station Freedom; to promote cooperation with the international partners in the space station; to conduct a study on the use of biosensors in space biology research and life support system operation; and to promote space biology research through the initiation of an annual publication, 'Advances in Space Biology and Medicine'.

  14. Computational aspects of systematic biology.

    Science.gov (United States)

    Lilburn, Timothy G; Harrison, Scott H; Cole, James R; Garrity, George M

    2006-06-01

    We review the resources available to systematic biologists who wish to use computers to build classifications. Algorithm development is in an early stage, and only a few examples of integrated applications for systematic biology are available. The availability of data is crucial if systematic biology is to enter the computer age.

  15. MODELING HOST-PATHOGEN INTERACTIONS: COMPUTATIONAL BIOLOGY AND BIOINFORMATICS FOR INFECTIOUS DISEASE RESEARCH (Session introduction)

    Energy Technology Data Exchange (ETDEWEB)

    McDermott, Jason E.; Braun, Pascal; Bonneau, Richard A.; Hyduke, Daniel R.

    2011-12-01

    Pathogenic infections are a major cause of both human disease and loss of crop yields and animal stocks, and thus cause immense damage to the worldwide economy. The significance of infectious diseases is expected to increase in an ever more connected, warming world, in which new viral, bacterial and fungal pathogens can find novel hosts and ecological niches. At the same time, the complex and sophisticated mechanisms by which diverse pathogenic agents evade defense mechanisms and subvert their hosts' networks to suit their lifestyle needs are still very incompletely understood, especially from a systems perspective [1]. Thus, understanding host-pathogen interactions is both an important and a scientifically fascinating topic. Recently, technology has offered the opportunity to investigate host-pathogen interactions at a level of detail and scope that offers immense computational and analytical possibilities. Genome sequencing was pioneered on some of these pathogens, and the number of strains and variants of pathogens sequenced to date vastly outnumbers the number of host genomes available. At the same time, for both plant and human hosts, more and more data on population-level genomic variation are becoming available and offer a rich field for analysis of the genetic interactions between host and pathogen.

  16. Computational Biology and High Performance Computing 2000

    Energy Technology Data Exchange (ETDEWEB)

    Simon, Horst D.; Zorn, Manfred D.; Spengler, Sylvia J.; Shoichet, Brian K.; Stewart, Craig; Dubchak, Inna L.; Arkin, Adam P.

    2000-10-19

    The pace of extraordinary advances in molecular biology has accelerated in the past decade due in large part to discoveries coming from genome projects on human and model organisms. The advances in the genome project so far, happening well ahead of schedule and under budget, have exceeded any dreams by its protagonists, let alone formal expectations. Biologists expect the next phase of the genome project to be even more startling in terms of dramatic breakthroughs in our understanding of human biology, the biology of health and of disease. Only today can biologists begin to envision the necessary experimental, computational and theoretical steps necessary to exploit genome sequence information for its medical impact, its contribution to biotechnology and economic competitiveness, and its ultimate contribution to environmental quality. High performance computing has become one of the critical enabling technologies, which will help to translate this vision of future advances in biology into reality. Biologists are increasingly becoming aware of the potential of high performance computing. The goal of this tutorial is to introduce the exciting new developments in computational biology and genomics to the high performance computing community.

  17. Women are underrepresented in computational biology: An analysis of the scholarly literature in biology, computer science and computational biology.

    Science.gov (United States)

    Bonham, Kevin S; Stefan, Melanie I

    2017-10-01

    While women are generally underrepresented in STEM fields, there are noticeable differences between fields. For instance, the gender ratio in biology is more balanced than in computer science. We were interested in how this difference is reflected in the interdisciplinary field of computational/quantitative biology. To this end, we examined the proportion of female authors in publications from the PubMed and arXiv databases. There are fewer female authors on research papers in computational biology, as compared to biology in general. This is true across authorship position, year, and journal impact factor. A comparison with arXiv shows that quantitative biology papers have a higher ratio of female authors than computer science papers, placing computational biology in between its two parent fields in terms of gender representation. Both in biology and in computational biology, a female last author increases the probability of other authors on the paper being female, pointing to a potential role of female PIs in influencing the gender balance.
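
    The statistics behind this analysis are straightforward to reproduce: from author lists annotated with an inferred gender, one computes the share of female authors overall and conditioned on the gender of the last author. The sketch below illustrates the idea on a toy data set; the data layout and the (already inferred) gender labels are assumptions for illustration, not the authors' actual pipeline, which additionally requires a name-to-gender inference step.

```python
# Minimal sketch (assumed data layout, not the authors' pipeline):
# each paper is a list of (name, inferred_gender) tuples, ordered by authorship position.
from collections import Counter

papers = [
    [("A. Smith", "F"), ("B. Jones", "M"), ("C. Lee", "F")],   # last author female
    [("D. Kim", "M"), ("E. Brown", "M"), ("F. Garcia", "M")],  # last author male
    [("G. Chen", "F"), ("H. Patel", "F")],                     # last author female
]

def female_share(authors):
    """Fraction of the given authors inferred to be female."""
    genders = [g for _, g in authors]
    return Counter(genders)["F"] / len(genders)

# Overall share of female authors across all author positions.
all_authors = [a for paper in papers for a in paper]
overall = female_share(all_authors)

# Share of female co-authors, conditioned on the gender of the last author.
def coauthor_female_share(paper):
    coauthors = paper[:-1]  # everyone except the last author
    return female_share(coauthors) if coauthors else 0.0

with_female_last = [coauthor_female_share(p) for p in papers if p[-1][1] == "F"]
with_male_last   = [coauthor_female_share(p) for p in papers if p[-1][1] == "M"]

print(f"overall female share: {overall:.2f}")
print(f"mean female co-author share, female last author: {sum(with_female_last)/len(with_female_last):.2f}")
print(f"mean female co-author share, male last author:   {sum(with_male_last)/len(with_male_last):.2f}")
```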

  18. Women are underrepresented in computational biology: An analysis of the scholarly literature in biology, computer science and computational biology.

    Directory of Open Access Journals (Sweden)

    Kevin S Bonham

    2017-10-01

    While women are generally underrepresented in STEM fields, there are noticeable differences between fields. For instance, the gender ratio in biology is more balanced than in computer science. We were interested in how this difference is reflected in the interdisciplinary field of computational/quantitative biology. To this end, we examined the proportion of female authors in publications from the PubMed and arXiv databases. There are fewer female authors on research papers in computational biology, as compared to biology in general. This is true across authorship position, year, and journal impact factor. A comparison with arXiv shows that quantitative biology papers have a higher ratio of female authors than computer science papers, placing computational biology in between its two parent fields in terms of gender representation. Both in biology and in computational biology, a female last author increases the probability of other authors on the paper being female, pointing to a potential role of female PIs in influencing the gender balance.

  19. Synthetic biology: engineering molecular computers

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    Complicated systems cannot survive the rigors of a chaotic environment without balancing mechanisms that sense, decide upon, and counteract the disturbances exerted on them. This is especially so with living organisms, which competition has forced to incredible complexity, escalating their need for self-control. Therefore, they compute. Can we harness biological mechanisms to create artificial computing systems? Biology offers several levels of design abstraction: molecular machines, cells, organisms... ranging from the more easily defined to the more inherently complex. At the bottom of this stack we find the nucleic acids, RNA and DNA, with their digital structure and relatively precise interactions. They are central enablers of designing artificial biological systems, in the confluence of engineering and biology, that we call synthetic biology. In the first part, let us follow their trail towards an overview of building computing machines with molecules -- and in the second part, take the case study of iGEM Greece 201...

  20. PAC research in biology

    Energy Technology Data Exchange (ETDEWEB)

    Chain, C. Y., E-mail: yamil@fisica.unlp.edu.ar [Universidad Nacional de La Plata, IFLP (Argentina); Ceolin, M. [Instituto de Investigaciones Fisicoquimicas Teoricas y Aplicadas, Dto de Quimica, Fac. Cs. Exactas, UNLP (Argentina); Pasquevich, A. F. [Universidad Nacional de La Plata, IFLP (Argentina)

    2008-01-15

    In this paper, possible applications of the Perturbed Angular Correlations (PAC) technique in biology are considered. Previous PAC experiments in biology are analyzed globally. All the work that appears in the literature has been grouped into a few research lines to make the analysis and discussion easier. The commonly used radioactive probes are listed and the experimental difficulties are analyzed. We also report applications of the ¹⁸¹Hf and ¹¹¹In isotopes in life sciences other than their use in PAC. The possibility of extending these studies using the PAC technique is discussed.

  1. Progress in Computational Physics (PiCP) Vol 2 Coupled Fluid Flow in Energy, Biology and Environmental Research

    CERN Document Server

    Ehrhardt, Matthias

    2012-01-01

    This second volume contains both the mathematical analysis of the coupling between fluid flow and porous media flow and state-of-the-art numerical techniques, such as tailor-made finite element and finite volume methods. Readers will come across articles devoted to concrete applications of these models in the fields of energy, biology, and environmental research.

  2. How Computers are Arming biology!

    Indian Academy of Sciences (India)

    In-vitro to In-silico - How Computers are Arming Biology! Geetha Sugumaran and Sushila Rajagopal. Face to Face, Resonance – Journal of Science Education, Volume 23, Issue 1, January 2018, pp. 83-102.

  3. 2K09 and thereafter : the coming era of integrative bioinformatics, systems biology and intelligent computing for functional genomics and personalized medicine research

    Science.gov (United States)

    2010-01-01

    Significant interest exists in establishing synergistic research in bioinformatics, systems biology and intelligent computing. Supported by the United States National Science Foundation (NSF), International Society of Intelligent Biological Medicine (http://www.ISIBM.org), International Journal of Computational Biology and Drug Design (IJCBDD) and International Journal of Functional Informatics and Personalized Medicine, the ISIBM International Joint Conferences on Bioinformatics, Systems Biology and Intelligent Computing (ISIBM IJCBS 2009) attracted more than 300 papers and 400 researchers and medical doctors world-wide. It was the only inter/multidisciplinary conference aimed to promote synergistic research and education in bioinformatics, systems biology and intelligent computing. The conference committee was very grateful for the valuable advice and suggestions from honorary chairs, steering committee members and scientific leaders including Dr. Michael S. Waterman (USC, Member of United States National Academy of Sciences), Dr. Chih-Ming Ho (UCLA, Member of United States National Academy of Engineering and Academician of Academia Sinica), Dr. Wing H. Wong (Stanford, Member of United States National Academy of Sciences), Dr. Ruzena Bajcsy (UC Berkeley, Member of United States National Academy of Engineering and Member of United States Institute of Medicine of the National Academies), Dr. Mary Qu Yang (United States National Institutes of Health and Oak Ridge, DOE), Dr. Andrzej Niemierko (Harvard), Dr. A. Keith Dunker (Indiana), Dr. Brian D. Athey (Michigan), Dr. Weida Tong (FDA, United States Department of Health and Human Services), Dr. Cathy H. Wu (Georgetown), Dr. Dong Xu (Missouri), Drs. Arif Ghafoor and Okan K Ersoy (Purdue), Dr. Mark Borodovsky (Georgia Tech, President of ISIBM), Dr. Hamid R. Arabnia (UGA, Vice-President of ISIBM), and other scientific leaders. The committee presented the 2009 ISIBM Outstanding Achievement Awards to Dr. Joydeep Ghosh (UT

  4. Applicability of Computational Systems Biology in Toxicology

    DEFF Research Database (Denmark)

    Kongsbak, Kristine Grønning; Hadrup, Niels; Audouze, Karine Marie Laure

    2014-01-01

    Systems biology as a research field has emerged within the last few decades. Systems biology, often defined as the antithesis of the reductionist approach, integrates information about individual components of a biological system. In integrative systems biology, large data sets from various sources and databases are used to model and predict effects of chemicals on, for instance, human health. In toxicology, computational systems biology enables identification of important pathways and molecules from large data sets; tasks that can be extremely laborious when performed by a classical literature search. Such data can be used to establish hypotheses on links between the chemical and human diseases, and this information can also be applied for designing more intelligent animal/cell experiments that can test the established hypotheses. Here, we describe how and why to apply an integrative systems biology method...

  5. Graphics processing units in bioinformatics, computational biology and systems biology.

    Science.gov (United States)

    Nobile, Marco S; Cazzaniga, Paolo; Tangherloni, Andrea; Besozzi, Daniela

    2017-09-01

    Several studies in Bioinformatics, Computational Biology and Systems Biology rely on the definition of physico-chemical or mathematical models of biological systems at different scales and levels of complexity, ranging from the interaction of atoms in single molecules up to genome-wide interaction networks. Traditional computational methods and software tools developed in these research fields share a common trait: they can be computationally demanding on Central Processing Units (CPUs), therefore limiting their applicability in many circumstances. To overcome this issue, general-purpose Graphics Processing Units (GPUs) are gaining increasing attention from the scientific community, as they can considerably reduce the running time required by standard CPU-based software, and allow more intensive investigations of biological systems. In this review, we present a collection of GPU tools recently developed to perform computational analyses in life science disciplines, emphasizing the advantages and the drawbacks in the use of these parallel architectures. The complete list of GPU-powered tools here reviewed is available at http://bit.ly/gputools. © The Author 2016. Published by Oxford University Press.
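
    As a rough illustration of why GPU offloading helps in these settings, the sketch below moves a large element-wise computation from NumPy (CPU) to CuPy (GPU); CuPy mirrors much of the NumPy API, so the change is largely an import swap. This is a generic example assuming a CUDA-capable GPU and an installed cupy package, not code taken from any of the reviewed tools.

```python
# Minimal CPU-vs-GPU sketch (assumes a CUDA GPU and the 'cupy' package are installed).
import time
import numpy as np

try:
    import cupy as cp
except ImportError:
    cp = None  # fall back to CPU-only if CuPy is unavailable

n = 10_000_000

# CPU version with NumPy.
x_cpu = np.random.random(n)
t0 = time.perf_counter()
y_cpu = np.sqrt(x_cpu) * np.sin(x_cpu)
print(f"CPU (NumPy): {time.perf_counter() - t0:.3f} s")

# GPU version with CuPy: same expression, but the arrays live in GPU memory.
if cp is not None:
    x_gpu = cp.asarray(x_cpu)            # transfer data to the GPU
    cp.cuda.Stream.null.synchronize()
    t0 = time.perf_counter()
    y_gpu = cp.sqrt(x_gpu) * cp.sin(x_gpu)
    cp.cuda.Stream.null.synchronize()    # wait for the kernel to finish before timing
    print(f"GPU (CuPy):  {time.perf_counter() - t0:.3f} s")
    # Copy the result back and check it against the CPU reference.
    assert np.allclose(y_cpu, cp.asnumpy(y_gpu))
```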

  6. Computational Modeling of Biological Systems From Molecules to Pathways

    CERN Document Server

    2012-01-01

    Computational modeling is emerging as a powerful new approach for studying and manipulating biological systems. Many diverse methods have been developed to model, visualize, and rationally alter these systems at various length scales, from atomic resolution to the level of cellular pathways. Processes taking place at larger time and length scales, such as molecular evolution, have also greatly benefited from new breeds of computational approaches. Computational Modeling of Biological Systems: From Molecules to Pathways provides an overview of established computational methods for the modeling of biologically and medically relevant systems. It is suitable for researchers and professionals working in the fields of biophysics, computational biology, systems biology, and molecular medicine.

  7. Computational biology and bioinformatics in Nigeria.

    Science.gov (United States)

    Fatumo, Segun A; Adoga, Moses P; Ojo, Opeolu O; Oluwagbemi, Olugbenga; Adeoye, Tolulope; Ewejobi, Itunuoluwa; Adebiyi, Marion; Adebiyi, Ezekiel; Bewaji, Clement; Nashiru, Oyekanmi

    2014-04-01

    Over the past few decades, major advances in the field of molecular biology, coupled with advances in genomic technologies, have led to an explosive growth in the biological data generated by the scientific community. The critical need to process and analyze such a deluge of data and turn it into useful knowledge has caused bioinformatics to gain prominence and importance. Bioinformatics is an interdisciplinary research area that applies techniques, methodologies, and tools in computer and information science to solve biological problems. In Nigeria, bioinformatics has recently played a vital role in the advancement of biological sciences. As a developing country, the importance of bioinformatics is rapidly gaining acceptance, and bioinformatics groups comprised of biologists, computer scientists, and computer engineers are being constituted at Nigerian universities and research institutes. In this article, we present an overview of bioinformatics education and research in Nigeria. We also discuss professional societies and academic and research institutions that play central roles in advancing the discipline in Nigeria. Finally, we propose strategies that can bolster bioinformatics education and support from policy makers in Nigeria, with potential positive implications for other developing countries.

  8. Computational biology and bioinformatics in Nigeria.

    Directory of Open Access Journals (Sweden)

    Segun A Fatumo

    2014-04-01

    Over the past few decades, major advances in the field of molecular biology, coupled with advances in genomic technologies, have led to an explosive growth in the biological data generated by the scientific community. The critical need to process and analyze such a deluge of data and turn it into useful knowledge has caused bioinformatics to gain prominence and importance. Bioinformatics is an interdisciplinary research area that applies techniques, methodologies, and tools in computer and information science to solve biological problems. In Nigeria, bioinformatics has recently played a vital role in the advancement of biological sciences. As a developing country, the importance of bioinformatics is rapidly gaining acceptance, and bioinformatics groups comprised of biologists, computer scientists, and computer engineers are being constituted at Nigerian universities and research institutes. In this article, we present an overview of bioinformatics education and research in Nigeria. We also discuss professional societies and academic and research institutions that play central roles in advancing the discipline in Nigeria. Finally, we propose strategies that can bolster bioinformatics education and support from policy makers in Nigeria, with potential positive implications for other developing countries.

  9. Computing in Research.

    Science.gov (United States)

    Ashenhurst, Robert L.

    The introduction and diffusion of automatic computing facilities during the 1960s are reviewed; the period is described as a time when research strategies in a broad variety of disciplines changed to take advantage of the newfound power provided by the computer. Several types of typical problems encountered by researchers who adopted the new technologies,…

  10. Computer Models and Automata Theory in Biology and Medicine

    CERN Document Server

    Baianu, I C

    2004-01-01

    The application of computers to biological and biomedical problem solving goes back to the very beginnings of computer science, automata theory [1], and mathematical biology [2]. With the advent of more versatile and powerful computers, biological and biomedical applications of computers have proliferated so rapidly that it would be virtually impossible to compile a comprehensive review of all developments in this field. Limitations of computer simulations in biology have also come under close scrutiny, and claims have been made that biological systems have limited information processing power [3]. Such general conjectures do not, however, deter biologists and biomedical researchers from developing new computer applications in biology and medicine. Microprocessors are being widely employed in biological laboratories both for automatic data acquisition/processing and modeling; one particular area, which is of great biomedical interest, involves fast digital image processing and is already established for rout...

  11. Data integration in biological research: an overview.

    Science.gov (United States)

    Lapatas, Vasileios; Stefanidakis, Michalis; Jimenez, Rafael C; Via, Allegra; Schneider, Maria Victoria

    2015-12-01

    Data sharing, integration and annotation are essential to ensure the reproducibility of the analysis and interpretation of experimental findings. Often these activities are perceived as a role that bioinformaticians and computer scientists have to take on, with little or no input from the experimental biologist. On the contrary, biological researchers, being the producers and often the end users of such data, have a big role in enabling biological data integration. The quality and usefulness of data integration depend on the existence and adoption of standards, shared formats, and mechanisms that are suitable for biological researchers to submit and annotate the data, so it can be easily searchable, conveniently linked and consequently used for further biological analysis and discovery. Here, we provide background on what data integration is from a computational science point of view, how it has been applied to biological research, which key aspects contributed to its success, and future directions.

  12. Research in computer science

    Science.gov (United States)

    Ortega, J. M.

    1986-01-01

    Various graduate research activities in the field of computer science are reported. Among the topics discussed are: (1) failure probabilities in multi-version software; (2) Gaussian Elimination on parallel computers; (3) three dimensional Poisson solvers on parallel/vector computers; (4) automated task decomposition for multiple robot arms; (5) multi-color incomplete cholesky conjugate gradient methods on the Cyber 205; and (6) parallel implementation of iterative methods for solving linear equations.

  13. Tumor Biology and Microenvironment Research

    Science.gov (United States)

    Part of NCI's Division of Cancer Biology's research portfolio, research in this area seeks to understand the role of tumor cells and the tumor microenvironment (TME) in driving cancer initiation, progression, maintenance and recurrence.

  14. The case for biological quantum computer elements

    Science.gov (United States)

    Baer, Wolfgang; Pizzi, Rita

    2009-05-01

    An extension to von Neumann's analysis of quantum theory suggests self-measurement is a fundamental process of Nature. By mapping the quantum computer to the brain architecture, we will argue that the cognitive experience results from a measurement of a quantum memory maintained by biological entities. The insight provided by this mapping suggests quantum effects are not restricted to small atomic and nuclear phenomena but are an integral part of our own cognitive experience, and further that the architecture of a quantum computer system parallels that of a conscious brain. We will then review the suggestions for biological quantum elements in basic neural structures and address the de-coherence objection by arguing for a self-measurement event model of Nature. We will argue that to a first-order approximation the universe is composed of isolated self-measurement events, which guarantees coherence. Controlled de-coherence is treated as the input/output interactions between quantum elements of a quantum computer and the quantum memory maintained by biological entities cognizant of the quantum calculation results. Lastly, we will present stem-cell-based neuron experiments conducted by one of us with the aim of demonstrating the occurrence of quantum effects in living neural networks, and discuss future research projects intended to reach this objective.

  15. A first attempt to bring computational biology into advanced high school biology classrooms.

    Science.gov (United States)

    Gallagher, Suzanne Renick; Coon, William; Donley, Kristin; Scott, Abby; Goldberg, Debra S

    2011-10-01

    Computer science has become ubiquitous in many areas of biological research, yet most high school and even college students are unaware of this. As a result, many college biology majors graduate without adequate computational skills for contemporary fields of biology. The absence of a computational element in secondary school biology classrooms is of growing concern to the computational biology community and to biology teachers who would like to acquaint their students with updated approaches in the discipline. We present a first attempt to correct this absence by introducing a computational biology element for teaching genetic evolution into advanced biology classes at two local high schools. Our primary goal was to show students how computation is used in biology and why a basic understanding of computation is necessary for research in many fields of biology. This curriculum is intended to be taught by a computational biologist who has worked with a high school advanced biology teacher to adapt the unit for his/her classroom, but a motivated high school teacher comfortable with mathematics and computing may be able to teach it alone. In this paper, we present our curriculum, which takes into consideration the constraints of the required curriculum, and discuss our experiences teaching it. We describe the successes and challenges we encountered while bringing this unit to high school students, discuss how we addressed these challenges, and make suggestions for future versions of this curriculum. We believe that our curriculum can be a valuable seed for further development of computational activities aimed at high school biology students. Further, our experiences may be of value to others teaching computational biology at this level. Our curriculum can be obtained at http://ecsite.cs.colorado.edu/?page_id=149#biology or by contacting the authors.

  16. SIMS applications in biological research

    International Nuclear Information System (INIS)

    Prince, K.E.; Burke, P.T.; Kelly, I.J.

    2000-01-01

    SIMS has been utilised as a tool for biological research since the early 1970s. SIMS' abilities in isotopic detection with high sensitivity, imaging capabilities at a subcellular level, and the possibility of molecular imaging have been the main areas of interest for biological development. However, whilst hundreds of instruments are available in industrial and university laboratories for semiconductor and materials analysis, only a handful successfully perform biological research. For this reason there is generally a lack of awareness of SIMS by the biological community. Biological SIMS analysis requires a working knowledge of both biology and SIMS. Sample preparation is a critical and time-consuming prerequisite for any successful biological SIMS study. In addition, for quantification to be possible, a homogeneous, matrix-matched standard must be available. Once these difficulties are more widely understood and overcome, there will be a greater motivation for the biological community to embrace SIMS as a unique tool in their research. This paper provides an overview of some of the more successful biological SIMS application areas internationally, and summarises the types of biological SIMS requests received by ANSTO.

  17. Biological Defense Research Program

    Science.gov (United States)

    1989-04-01

    ...difference between life and death. Some recent examples are: the BDRP-developed VEE vaccine used in Central America, Mexico, and Texas (1969-1971) and Rift...

  18. Notions of similarity for computational biology models

    KAUST Repository

    Waltemath, Dagmar

    2016-03-21

    Computational models used in biology are rapidly increasing in complexity, size, and numbers. To build such large models, researchers need to rely on software tools for model retrieval, model combination, and version control. These tools need to be able to quantify the differences and similarities between computational models. However, depending on the specific application, the notion of similarity may greatly vary. A general notion of model similarity, applicable to various types of models, is still missing. Here, we introduce a general notion of quantitative model similarities, survey the use of existing model comparison methods in model building and management, and discuss potential applications of model comparison. To frame model comparison as a general problem, we describe a theoretical approach to defining and computing similarities based on different model aspects. Potentially relevant aspects of a model comprise its references to biological entities, network structure, mathematical equations and parameters, and dynamic behaviour. Future similarity measures could combine these model aspects in flexible, problem-specific ways in order to mimic users' intuition about model similarity, and to support complex model searches in databases.
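
    One way to read the proposed framework is as a weighted combination of per-aspect similarity scores. The sketch below is an assumed, simplified illustration of that idea: each aspect (entity annotations, parameters, network structure) is scored here with a plain Jaccard overlap, which the paper does not prescribe, and the weights represent the flexible, problem-specific part of the measure.

```python
# Assumed illustration of aspect-wise model similarity, not the paper's definition.
from dataclasses import dataclass, field

@dataclass
class Model:
    entities: set = field(default_factory=set)    # referenced biological entities (e.g., ontology terms)
    parameters: set = field(default_factory=set)  # named parameters
    reactions: set = field(default_factory=set)   # reaction-network edges as (source, target) pairs

def jaccard(a: set, b: set) -> float:
    """Set overlap in [0, 1]; 1.0 if both sets are empty."""
    return len(a & b) / len(a | b) if (a or b) else 1.0

def similarity(m1: Model, m2: Model, weights: dict) -> float:
    """Weighted combination of per-aspect similarities."""
    aspects = {
        "entities": jaccard(m1.entities, m2.entities),
        "parameters": jaccard(m1.parameters, m2.parameters),
        "network": jaccard(m1.reactions, m2.reactions),
    }
    total = sum(weights.values())
    return sum(weights[k] * aspects[k] for k in aspects) / total

m_a = Model({"glucose", "ATP"}, {"k1", "k2"}, {("glucose", "ATP")})
m_b = Model({"glucose", "ADP"}, {"k1", "k3"}, {("glucose", "ADP")})

# Problem-specific weighting: here, entity annotations count twice as much as structure.
print(similarity(m_a, m_b, {"entities": 2.0, "parameters": 1.0, "network": 1.0}))
```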

  19. Notions of similarity for computational biology models

    KAUST Repository

    Waltemath, Dagmar; Henkel, Ron; Hoehndorf, Robert; Kacprowski, Tim; Knuepfer, Christian; Liebermeister, Wolfram

    2016-01-01

    Computational models used in biology are rapidly increasing in complexity, size, and numbers. To build such large models, researchers need to rely on software tools for model retrieval, model combination, and version control. These tools need to be able to quantify the differences and similarities between computational models. However, depending on the specific application, the notion of similarity may greatly vary. A general notion of model similarity, applicable to various types of models, is still missing. Here, we introduce a general notion of quantitative model similarities, survey the use of existing model comparison methods in model building and management, and discuss potential applications of model comparison. To frame model comparison as a general problem, we describe a theoretical approach to defining and computing similarities based on different model aspects. Potentially relevant aspects of a model comprise its references to biological entities, network structure, mathematical equations and parameters, and dynamic behaviour. Future similarity measures could combine these model aspects in flexible, problem-specific ways in order to mimic users' intuition about model similarity, and to support complex model searches in databases.

  20. Optical Computing - Research Trends

    Indian Academy of Sciences (India)

    Optical Computing - Research Trends. Debabrata Goswami. General Article, Resonance – Journal of Science Education, Volume 8, Issue 7, July 2003, pp. 8-21. Permanent link: https://www.ias.ac.in/article/fulltext/reso/008/07/0008-0021

  1. Bringing Advanced Computational Techniques to Energy Research

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Julie C

    2012-11-17

    Please find attached our final technical report for the BACTER Institute award. BACTER was created as a graduate and postdoctoral training program for the advancement of computational biology applied to questions of relevance to bioenergy research.

  2. UC Merced Center for Computational Biology Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Colvin, Michael; Watanabe, Masakatsu

    2010-11-30

    Final report for the UC Merced Center for Computational Biology. The Center for Computational Biology (CCB) was established to support multidisciplinary scientific research and academic programs in computational biology at the new University of California campus in Merced. In 2003, the growing gap between biology research and education was documented in a report from the National Academy of Sciences, Bio2010: Transforming Undergraduate Education for Future Research Biologists. We believed that a new type of biological sciences undergraduate and graduate program that emphasized biological concepts and considered biology as an information science would have a dramatic impact in enabling the transformation of biology. UC Merced, as the newest UC campus and the first new U.S. research university of the 21st century, was ideally suited to adopt an alternate strategy: to create new Biological Sciences majors and a graduate group that incorporated the strong computational and mathematical vision articulated in the Bio2010 report. CCB aimed to leverage this strong commitment at UC Merced to develop a new educational program based on the principle of biology as a quantitative, model-driven science. We also expected that the center would enable the dissemination of computational biology course materials to other universities and feeder institutions, and foster research projects that exemplify a mathematical and computation-based approach to the life sciences. As this report describes, the CCB has been successful in achieving these goals, and multidisciplinary computational biology is now an integral part of UC Merced undergraduate, graduate and research programs in the life sciences. The CCB began in fall 2004 with the aid of an award from the U.S. Department of Energy (DOE), under its Genomes to Life program of support for the development of research and educational infrastructure in the modern biological sciences. This report to DOE describes the research and academic programs.

  3. Computer science, biology and biomedical informatics academy: outcomes from 5 years of immersing high-school students into informatics research

    Directory of Open Access Journals (Sweden)

    Andrew J King

    2017-01-01

    Full Text Available The University of Pittsburgh's Department of Biomedical Informatics and Division of Pathology Informatics created a Science, Technology, Engineering, and Mathematics (STEM pipeline in 2011 dedicated to providing cutting-edge informatics research and career preparatory experiences to a diverse group of highly motivated high-school students. In this third editorial installment describing the program, we provide a brief overview of the pipeline, report on achievements of the past scholars, and present results from self-reported assessments by the 2015 cohort of scholars. The pipeline continues to expand with the 2015 addition of the innovation internship, and the introduction of a program in 2016 aimed at offering first-time research experiences to undergraduates who are underrepresented in pathology and biomedical informatics. Achievements of program scholars include authorship of journal articles, symposium and summit presentations, and attendance at top 25 universities. All of our alumni matriculated into higher education and 90% remain in STEM majors. The 2015 high-school program had ten participating scholars who self-reported gains in confidence in their research abilities and understanding of what it means to be a scientist.

  4. Computer Science, Biology and Biomedical Informatics academy: Outcomes from 5 years of Immersing High-school Students into Informatics Research.

    Science.gov (United States)

    King, Andrew J; Fisher, Arielle M; Becich, Michael J; Boone, David N

    2017-01-01

    The University of Pittsburgh's Department of Biomedical Informatics and Division of Pathology Informatics created a Science, Technology, Engineering, and Mathematics (STEM) pipeline in 2011 dedicated to providing cutting-edge informatics research and career preparatory experiences to a diverse group of highly motivated high-school students. In this third editorial installment describing the program, we provide a brief overview of the pipeline, report on achievements of the past scholars, and present results from self-reported assessments by the 2015 cohort of scholars. The pipeline continues to expand with the 2015 addition of the innovation internship, and the introduction of a program in 2016 aimed at offering first-time research experiences to undergraduates who are underrepresented in pathology and biomedical informatics. Achievements of program scholars include authorship of journal articles, symposium and summit presentations, and attendance at top 25 universities. All of our alumni matriculated into higher education and 90% remain in STEM majors. The 2015 high-school program had ten participating scholars who self-reported gains in confidence in their research abilities and understanding of what it means to be a scientist.

  5. CSBB: synthetic biology research at Newcastle University.

    Science.gov (United States)

    Goñi-Moreno, Angel; Wipat, Anil; Krasnogor, Natalio

    2017-06-15

    The Centre for Synthetic Biology and the Bioeconomy (CSBB) brings together a far-reaching multidisciplinary community across all Newcastle University's faculties - Medical Sciences, Science, Agriculture and Engineering, and Humanities, Arts and Social Sciences. The CSBB focuses on many different areas of Synthetic Biology, including bioprocessing, computational design and in vivo computation, as well as improving understanding of basic molecular machinery. Such breadth is supported by major national and international research funding, a range of industrial partners in the North East of England and beyond, as well as a large number of doctoral and post-doctoral researchers. The CSBB trains the next generation of scientists through a 1-year MSc in Synthetic Biology. © 2017 The Author(s).

  6. Computational structural biology: methods and applications

    National Research Council Canada - National Science Library

    Schwede, Torsten; Peitsch, Manuel Claude

    2008-01-01

    ... sequencing reinforced the observation that structural information is needed to understand the detailed function and mechanism of biological molecules such as enzyme reactions and molecular recognition events. Furthermore, structures are obviously key to the design of molecules with new or improved functions. In this context, computational structural biology...

  7. Toward computational cumulative biology by combining models of biological datasets.

    Science.gov (United States)

    Faisal, Ali; Peltonen, Jaakko; Georgii, Elisabeth; Rung, Johan; Kaski, Samuel

    2014-01-01

    A main challenge of data-driven sciences is how to make maximal use of the progressively expanding databases of experimental datasets in order to keep research cumulative. We introduce the idea of a modeling-based dataset retrieval engine designed for relating a researcher's experimental dataset to earlier work in the field. The search is (i) data-driven to enable new findings, going beyond the state of the art of keyword searches in annotations, (ii) modeling-driven, to include both biological knowledge and insights learned from data, and (iii) scalable, as it is accomplished without building one unified grand model of all data. Assuming each dataset has been modeled beforehand, by the researchers or automatically by database managers, we apply a rapidly computable and optimizable combination model to decompose a new dataset into contributions from earlier relevant models. By using the data-driven decomposition, we identify a network of interrelated datasets from a large annotated human gene expression atlas. While tissue type and disease were major driving forces for determining relevant datasets, the found relationships were richer, and the model-based search was more accurate than the keyword search; moreover, it recovered biologically meaningful relationships that are not straightforwardly visible from annotations; for instance, between cells in different developmental stages such as thymocytes and T-cells. Data-driven links and citations matched to a large extent; the data-driven links even uncovered corrections to the publication data, as two of the most linked datasets were not highly cited and turned out to have wrong publication entries in the database.
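
    Under strong simplifying assumptions, the decomposition step described above can be illustrated by expressing a new dataset's summary profile as a non-negative mixture of previously modelled datasets. The toy profiles and the use of non-negative least squares below are our own illustration in Python, not the authors' combination model.

    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(0)

    # Columns = summary profiles of previously modelled datasets (genes x datasets).
    earlier_profiles = rng.normal(size=(500, 6))

    # Invented "new" dataset: mostly a mixture of datasets 1 and 4, plus noise.
    new_profile = (0.7 * earlier_profiles[:, 1]
                   + 0.3 * earlier_profiles[:, 4]
                   + 0.05 * rng.normal(size=500))

    # Non-negative weights rank the earlier datasets by their contribution.
    weights, residual = nnls(earlier_profiles, new_profile)
    for idx in np.argsort(weights)[::-1]:
        print(f"dataset {idx}: weight {weights[idx]:.3f}")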

  8. Optical Computing Research.

    Science.gov (United States)

    1987-10-30

    Report documentation page fragment (OCR-garbled in the source record): Optical Computing Research, Stanford University Electronics Laboratories, Stanford, CA, 30 October 1987, prepared under AFOSR sponsorship, unclassified. The legible portion also cites W.T. Welford and R. Winston, The Optics of Nonimaging Concentrators, Academic Press, New York, N.Y., 1978 (see Appendix A).

  9. Evolutionary Biology Research in India

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 5; Issue 10. Evolutionary Biology Research in India. Information and Announcements Volume 5 Issue 10 October 2000 pp 102-104. Fulltext. Click here to view fulltext PDF. Permanent link: https://www.ias.ac.in/article/fulltext/reso/005/10/0102-0104 ...

  10. Integrating interactive computational modeling in biology curricula.

    Directory of Open Access Journals (Sweden)

    Tomáš Helikar

    2015-03-01

    Full Text Available While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.

  11. Integrating interactive computational modeling in biology curricula.

    Science.gov (United States)

    Helikar, Tomáš; Cutucache, Christine E; Dahlquist, Lauren M; Herek, Tyler A; Larson, Joshua J; Rogers, Jim A

    2015-03-01

    While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.
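
    For readers unfamiliar with the kind of logical models built in such a platform, the sketch below simulates a made-up three-node Boolean network with synchronous updates. The node names and update rules are invented for illustration and do not come from Cell Collective itself.

    # Toy regulatory motif (invented): Signal activates GeneA, GeneA activates GeneB,
    # and GeneB represses GeneA, giving a simple negative-feedback oscillation.
    rules = {
        "Signal": lambda s: s["Signal"],                    # external input, held fixed
        "GeneA":  lambda s: s["Signal"] and not s["GeneB"],
        "GeneB":  lambda s: s["GeneA"],
    }

    def step(state):
        """Synchronous update: every node is recomputed from the previous state."""
        return {node: bool(rule(state)) for node, rule in rules.items()}

    state = {"Signal": True, "GeneA": False, "GeneB": False}
    for t in range(8):
        print(t, state)
        state = step(state)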

  12. [Biological research and security institutes].

    Science.gov (United States)

    Darsie, G; Falczuk, A J; Bergmann, I E

    2006-04-01

    The threat of using biological material for agro-bioterrorist ends has risen in recent years, which means that research and diagnostic laboratories, biological agent banks and other institutions authorised to carry out scientific activities have had to implement biosafety and biosecurity measures to counter the threat, while carrying out activities to help prevent and monitor the accidental or intentional introduction of exotic animal diseases. This article briefly sets out the basic components of biosafety and biosecurity, as well as recommendations on organisational strategies to consider in laboratories that support agro-bioterrorist surveillance and prevention programs.

  13. Development trend of radiation biology research-systems radiation biology

    International Nuclear Information System (INIS)

    Min Rui

    2010-01-01

    Radiation biology research has a history of more than 80 years. Much is now known about the fundamentals, processes, and outcomes of the biological effects induced by radiation, and about the various factors that influence these effects; however, many old and new scientific problems arising in radiation biology research remain to be elucidated. Exploring and resolving these problems requires systemic concepts, methods, and a multi-dimensional view that take into account the complexity of biological systems, the diversity of biological responses, the temporal and spatial course of biological effects as they occur, and the complex feedback networks of biological regulation. (authors)

  14. Computational Tools for Stem Cell Biology.

    Science.gov (United States)

    Bian, Qin; Cahan, Patrick

    2016-12-01

    For over half a century, the field of developmental biology has leveraged computation to explore mechanisms of developmental processes. More recently, computational approaches have been critical in the translation of high throughput data into knowledge of both developmental and stem cell biology. In the past several years, a new subdiscipline of computational stem cell biology has emerged that synthesizes the modeling of systems-level aspects of stem cells with high-throughput molecular data. In this review, we provide an overview of this new field and pay particular attention to the impact that single cell transcriptomics is expected to have on our understanding of development and our ability to engineer cell fate. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Filling the gap between biology and computer science.

    Science.gov (United States)

    Aguilar-Ruiz, Jesús S; Moore, Jason H; Ritchie, Marylyn D

    2008-07-17

    This editorial introduces BioData Mining, a new journal which publishes research articles related to advances in computational methods and techniques for the extraction of useful knowledge from heterogeneous biological data. We outline the aims and scope of the journal, introduce the publishing model and describe the open peer review policy, which fosters interaction within the research community.

  16. Bioconductor: open software development for computational biology and bioinformatics

    DEFF Research Database (Denmark)

    Gentleman, R.C.; Carey, V.J.; Bates, D.M.

    2004-01-01

    The Bioconductor project is an initiative for the collaborative creation of extensible software for computational biology and bioinformatics. The goals of the project include: fostering collaborative development and widespread use of innovative software, reducing barriers to entry into interdisciplinary scientific research, and promoting the achievement of remote reproducibility of research results. We describe details of our aims and methods, identify current challenges, compare Bioconductor to other open bioinformatics projects, and provide working examples.

  17. Micro-Computers in Biology Inquiry.

    Science.gov (United States)

    Barnato, Carolyn; Barrett, Kathy

    1981-01-01

    Describes the modification of computer programs (BISON and POLLUT) to accommodate species and areas indigenous to the Pacific Coast area. Suggests that these programs, suitable for PET microcomputers, may foster a long-term, ongoing, inquiry-directed approach in biology. (DS)

  18. Quantum computing for physics research

    International Nuclear Information System (INIS)

    Georgeot, B.

    2006-01-01

    Quantum computers hold great promise for the future of computation. In this paper, this new kind of computing device is presented, together with a short survey of the status of research in this field. The principal algorithms are introduced, with an emphasis on the applications of quantum computing to physics. Experimental implementations are also briefly discussed.

  19. Research in computer forensics

    OpenAIRE

    Wai, Hor Cheong

    2002-01-01

    Approved for public release; distribution is unlimited. Computer Forensics involves the preservation, identification, extraction and documentation of computer evidence stored in the form of magnetically encoded information. With the proliferation of E-commerce initiatives and the increasing criminal activities on the web, this area of study is catching on in the IT industry and among the law enforcement agencies. The objective of the study is to explore the techniques of computer forensics ...

  20. Multidisciplinary Computational Research

    National Research Council Canada - National Science Library

    Visbal, Miguel R

    2006-01-01

    The purpose of this work is to develop advanced multidisciplinary numerical simulation capabilities for aerospace vehicles with emphasis on highly accurate, massively parallel computational methods...

  1. XIV Mediterranean Conference on Medical and Biological Engineering and Computing

    CERN Document Server

    Christofides, Stelios; Pattichis, Constantinos

    2016-01-01

    This volume presents the proceedings of Medicon 2016, held in Paphos, Cyprus. Medicon 2016 is the XIV in the series of regional meetings of the International Federation of Medical and Biological Engineering (IFMBE) in the Mediterranean. The goal of Medicon 2016 is to provide updated information on the state of the art in Medical and Biological Engineering and Computing under the main theme “Systems Medicine for the Delivery of Better Healthcare Services”. Medical and Biological Engineering and Computing cover complementary disciplines that hold great promise for the advancement of research and development in complex medical and biological systems. Research and development in these areas are impacting science and technology by advancing fundamental concepts in translational medicine, by helping us understand human physiology and function at multiple levels, and by improving tools and techniques for the detection, prevention and treatment of disease. Medicon 2016 provides a common platform for the cross fer...

  2. Review of domestic radiation biology research

    International Nuclear Information System (INIS)

    Zheng Chun; Song Lingli; Ai Zihui

    2011-01-01

    Radiation biology research in China during the past ten years is reviewed. It is noted that radiation biology research should focus on microdosimetry, microbeam applications, and radiation biological mechanisms. (authors)

  3. Ranked retrieval of Computational Biology models.

    Science.gov (United States)

    Henkel, Ron; Endler, Lukas; Peters, Andre; Le Novère, Nicolas; Waltemath, Dagmar

    2010-08-11

    The study of biological systems demands computational support. When targeting a biological problem, reusing existing computational models can save time and effort. Deciding among potentially suitable models, however, becomes more challenging as the number of available computational models increases, and even more so when considering the models' growing complexity. Firstly, among a set of potential model candidates it is difficult to decide on the model that best suits one's needs. Secondly, it is hard to grasp the nature of an unknown model listed in a search result set, and to judge how well it fits the particular problem one has in mind. Here we present an improved search approach for computational models of biological processes. It is based on existing retrieval and ranking methods from Information Retrieval. The approach incorporates annotations suggested by MIRIAM, and additional meta-information. It is now part of the search engine of BioModels Database, a standard repository for computational models. The introduced concept and implementation are, to our knowledge, the first application of Information Retrieval techniques to model search in Computational Systems Biology. Using the example of BioModels Database, we show that the approach is feasible and extends the current possibilities to search for relevant models. The advantages of our system over existing solutions are that we incorporate a rich set of meta-information, and that we provide the user with a relevance ranking of the models found for a query. Better search capabilities in model databases are expected to have a positive effect on the reuse of existing models.
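
    A minimal sketch of the general idea, under our own assumptions: ranking model entries against a free-text query by TF-IDF cosine similarity over their annotation and metadata text. The identifiers and descriptions below are placeholders, and the actual BioModels Database index is considerably richer than this.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    # Placeholder entries: identifier -> concatenated annotation/metadata text.
    models = {
        "model_cell_cycle":    "cell cycle cdc2 cyclin oscillator mitosis",
        "model_mapk_cascade":  "MAPK cascade ultrasensitivity kinase signalling",
        "model_repressilator": "synthetic gene regulatory network oscillating repressor",
    }

    vectorizer = TfidfVectorizer()
    doc_matrix = vectorizer.fit_transform(models.values())

    query = "cell cycle oscillations cyclin"
    scores = cosine_similarity(vectorizer.transform([query]), doc_matrix).ravel()

    # Print models in decreasing order of relevance to the query.
    for model_id, score in sorted(zip(models, scores), key=lambda pair: -pair[1]):
        print(f"{model_id}\t{score:.3f}")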

  4. Novel opportunities for computational biology and sociology in drug discovery☆

    Science.gov (United States)

    Yao, Lixia; Evans, James A.; Rzhetsky, Andrey

    2013-01-01

    Current drug discovery is impossible without sophisticated modeling and computation. In this review we outline previous advances in computational biology and, by tracing the steps involved in pharmaceutical development, explore a range of novel, high-value opportunities for computational innovation in modeling the biological process of disease and the social process of drug discovery. These opportunities include text mining for new drug leads, modeling molecular pathways and predicting the efficacy of drug cocktails, analyzing genetic overlap between diseases and predicting alternative drug use. Computation can also be used to model research teams and innovative regions and to estimate the value of academy–industry links for scientific and human benefit. Attention to these opportunities could promise punctuated advance and will complement the well-established computational work on which drug discovery currently relies. PMID:20349528

  5. Novel opportunities for computational biology and sociology in drug discovery

    Science.gov (United States)

    Yao, Lixia

    2009-01-01

    Drug discovery today is impossible without sophisticated modeling and computation. In this review we touch on previous advances in computational biology and by tracing the steps involved in pharmaceutical development, we explore a range of novel, high value opportunities for computational innovation in modeling the biological process of disease and the social process of drug discovery. These opportunities include text mining for new drug leads, modeling molecular pathways and predicting the efficacy of drug cocktails, analyzing genetic overlap between diseases and predicting alternative drug use. Computation can also be used to model research teams and innovative regions and to estimate the value of academy-industry ties for scientific and human benefit. Attention to these opportunities could promise punctuated advance, and will complement the well-established computational work on which drug discovery currently relies. PMID:19674801

  6. The fusion of biology, computer science, and engineering: towards efficient and successful synthetic biology.

    Science.gov (United States)

    Linshiz, Gregory; Goldberg, Alex; Konry, Tania; Hillson, Nathan J

    2012-01-01

    Synthetic biology is a nascent field that emerged in earnest only around the turn of the millennium. It aims to engineer new biological systems and impart new biological functionality, often through genetic modifications. The design and construction of new biological systems is a complex, multistep process, requiring multidisciplinary collaborative efforts from "fusion" scientists who have formal training in computer science or engineering, as well as hands-on biological expertise. The public has high expectations for synthetic biology and eagerly anticipates the development of solutions to the major challenges facing humanity. This article discusses laboratory practices and the conduct of research in synthetic biology. It argues that the fusion science approach, which integrates biology with computer science and engineering best practices, including standardization, process optimization, computer-aided design and laboratory automation, miniaturization, and systematic management, will increase the predictability and reproducibility of experiments and lead to breakthroughs in the construction of new biological systems. The article also discusses several successful fusion projects, including the development of software tools for DNA construction design automation, recursive DNA construction, and the development of integrated microfluidics systems.

  7. NASA's computer science research program

    Science.gov (United States)

    Larsen, R. L.

    1983-01-01

    Following a major assessment of NASA's computing technology needs, a new program of computer science research has been initiated by the Agency. The program includes work in concurrent processing, management of large scale scientific databases, software engineering, reliable computing, and artificial intelligence. The program is driven by applications requirements in computational fluid dynamics, image processing, sensor data management, real-time mission control and autonomous systems. It consists of university research, in-house NASA research, and NASA's Research Institute for Advanced Computer Science (RIACS) and Institute for Computer Applications in Science and Engineering (ICASE). The overall goal is to provide the technical foundation within NASA to exploit advancing computing technology in aerospace applications.

  8. The Virtual Cell: a software environment for computational cell biology.

    Science.gov (United States)

    Loew, L M; Schaff, J C

    2001-10-01

    The newly emerging field of computational cell biology requires software tools that address the needs of a broad community of scientists. Cell biological processes are controlled by an interacting set of biochemical and electrophysiological events that are distributed within complex cellular structures. Computational modeling is familiar to researchers in fields such as molecular structure, neurobiology and metabolic pathway engineering, and is rapidly emerging in the area of gene expression. Although some of these established modeling approaches can be adapted to address problems of interest to cell biologists, relatively few software development efforts have been directed at the field as a whole. The Virtual Cell is a computational environment designed for cell biologists as well as for mathematical biologists and bioengineers. It serves to aid the construction of cell biological models and the generation of simulations from them. The system enables the formulation of both compartmental and spatial models, the latter with either idealized or experimentally derived geometries of one, two or three dimensions.
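
    As a simple, hand-rolled example of the compartmental models such an environment is designed to handle (this is not Virtual Cell code), the sketch below integrates a two-compartment nuclear import model with made-up rate constants.

    import numpy as np
    from scipy.integrate import solve_ivp

    # Illustrative rate constants (1/s) for import, export, and cytosolic degradation.
    k_import, k_export, k_deg = 0.5, 0.2, 0.05

    def rhs(t, y):
        cytosol, nucleus = y
        d_cyt = -k_import * cytosol + k_export * nucleus - k_deg * cytosol
        d_nuc = k_import * cytosol - k_export * nucleus
        return [d_cyt, d_nuc]

    sol = solve_ivp(rhs, (0.0, 60.0), y0=[10.0, 0.0], dense_output=True)
    for ti in np.linspace(0.0, 60.0, 7):
        cyt, nuc = sol.sol(ti)
        print(f"t={ti:5.1f} s   cytosol={cyt:6.3f}   nucleus={nuc:6.3f}")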

  9. Computation: A New Open Access Journal of Computational Chemistry, Computational Biology and Computational Engineering

    OpenAIRE

    Karlheinz Schwarz; Rainer Breitling; Christian Allen

    2013-01-01

    Computation (ISSN 2079-3197; http://www.mdpi.com/journal/computation) is an international scientific open access journal focusing on fundamental work in the field of computational science and engineering. Computational science has become essential in many research areas by contributing to solving complex problems in fundamental science all the way to engineering. The very broad range of application domains suggests structuring this journal into three sections, which are briefly characterized ...

  10. Computational chemistry research

    Science.gov (United States)

    Levin, Eugene

    1987-01-01

    Task 41 is composed of two parts: (1) analysis and design studies related to the Numerical Aerodynamic Simulation (NAS) Extended Operating Configuration (EOC) and (2) computational chemistry. During the first half of 1987, Dr. Levin served as a member of an advanced system planning team to establish the requirements, goals, and principal technical characteristics of the NAS EOC. A paper entitled 'Scaling of Data Communications for an Advanced Supercomputer Network' is included. The high temperature transport properties (such as viscosity, thermal conductivity, etc.) of the major constituents of air (oxygen and nitrogen) were correctly determined. The results of prior ab initio computer solutions of the Schroedinger equation were combined with the best available experimental data to obtain complete interaction potentials for both neutral and ion-atom collision partners. These potentials were then used in a computer program to evaluate the collision cross-sections from which the transport properties could be determined. A paper entitled 'High Temperature Transport Properties of Air' is included.

  11. Computational intelligence, medicine and biology selected links

    CERN Document Server

    Zaitseva, Elena

    2015-01-01

    This book contains an interesting and state-of-the-art collection of chapters presenting several examples of attempts to develop modern tools utilizing computational intelligence in different real-life problems encountered by humans. Reasoning, prediction, modeling, optimization, decision making, etc. need modern, soft and intelligent algorithms, methods and methodologies to solve, in efficient ways, problems appearing in human activity. The contents of the book are divided into two parts. Part I, consisting of four chapters, is devoted to selected links of computational intelligence, medicine, health care and biomechanics. Several problems are considered: estimation of healthcare system reliability, classification of ultrasound thyroid images, application of fuzzy logic to measure weight status and central fatness, and deriving kinematics directly from video records. Part II, also consisting of four chapters, is devoted to selected links of computational intelligence and biology. The common denominato...

  12. Catalyzing Inquiry at the Interface of Computing and Biology

    Energy Technology Data Exchange (ETDEWEB)

    John Wooley; Herbert S. Lin

    2005-10-30

    This study is the first comprehensive NRC study that suggests a high-level intellectual structure for Federal agencies for supporting work at the biology/computing interface. The report seeks to establish the intellectual legitimacy of a fundamentally cross-disciplinary collaboration between biologists and computer scientists. That is, while some universities are increasingly favorable to research at the intersection, life science researchers at other universities are strongly impeded in their efforts to collaborate. This report addresses these impediments and describes proven strategies for overcoming them. An important feature of the report is the use of well-documented examples that describe clearly to individuals not trained in computer science the value and usage of computing across the biological sciences, from genes and proteins to networks and pathways, from organelles to cells, and from individual organisms to populations and ecosystems. It is hoped that these examples will be useful to students in the life sciences to motivate (continued) study in computer science that will enable them to be more facile users of computing in their future biological studies.

  13. Biological Research for Radiation Protection

    International Nuclear Information System (INIS)

    Kim, In Gyu; Kim, Kug Chan; Jung, Il Lae; Choi, Yong Ho; Kim, Jin Sik; Moon, Myung Sook; Byun, Hee Sun; Phyo, Ki Heon; Kim, Sung Keun

    2005-04-01

    The work scope of 'Biological Research for the Radiation Protection' covered research on ornithine decarboxylase and its controlling proteins, thioredoxin, peroxiredoxin, S-adenosylmethionine decarboxylase, and glutamate decarboxylase 67 kD, and their effects on the cell death triggered by ionizing radiation and H2O2 (toxic agents). In this study, to elucidate the role of these proteins in ionizing radiation (or H2O2)-induced apoptotic cell death, we utilized sensed (or antisensed) cells, which overexpress (or down-regulate) the RNAs associated with the biosynthesis of these proteins, and investigated the effects of these genes on the cytotoxicity caused by ionizing radiation and H2O2 (or paraquat). We also investigated whether genistein (or thiamine) may enhance the cytotoxic efficacy of ionizing radiation against tumor cells (or may enhance protection against radiation- or paraquat-induced damage), because such compounds are able to potentiate cell-killing or cell-protecting effects. Based on the above results, we suggest that regulating the expression of these genes is potentially important for improving the efficiency of radiation therapy of cancer or for protecting normal cells from radiation-induced damage.

  14. Biological research for radiation protection

    Energy Technology Data Exchange (ETDEWEB)

    Kim, In Gyu; Kim, Kug Chan; Shim, Hae Won; Oh, Tae Jeong; Park, Seon Young; Lee, Kang Suk

    2000-04-01

    The work scope of the Biological Research for Radiation Protection project covered the search for biological microanalytic methods for assessing the health effects of γ-radiation and toxic agents, the standardization of human T-lymphocyte cell culture and the polymerase chain reaction, the T-cell clonal assay, and the quantification of mutation frequency at the hypoxanthine (guanine) phosphoribosyl transferase (HPRT) gene locus after single or combined exposure. In particular, polymerase chain reaction methods using reverse transcriptase have been developed to analyze mutant genes induced by γ-radiation and chemical (pentachlorophenol) agent exposure, and to investigate point mutations at the HPRT gene locus of T-lymphocytes. The HPRT T-cell clonal assay revealed that it could not differentiate γ-irradiation from pentachlorophenol, because the frequency of somatic mutations induced by both damaging agents increased in a dose-dependent manner. The analysis of DNA sequence alterations of HPRT mutant clones clearly showed that the two damaging agents induced different mutational spectra at the HPRT locus of T-cells. Large deletions, which account for 75 percent of the analyzed mutants, are the characteristic mutations induced by γ-irradiation. By contrast, point mutations such as base substitutions and insertions account for 97 percent in the case of pentachlorophenol-treated cells. The point mutation frequencies at the 190 base pair and 444 base pair positions are 3-6 fold higher than those at other mutation positions. It may be that these mutation sites are hot spots induced by pentachlorophenol. These results suggest that the HPRT mutation spectrum can be used as a potential biomarker for assessing a specific environmental risk. (author)

  15. Biological research for radiation protection

    International Nuclear Information System (INIS)

    Kim, In Gyu; Kim, Kug Chan; Shim, Hae Won; Oh, Tae Jeong; Park, Seon Young; Lee, Kang Suk

    2000-04-01

    The work scope of the Biological Research for Radiation Protection project covered the search for biological microanalytic methods for assessing the health effects of γ-radiation and toxic agents, the standardization of human T-lymphocyte cell culture and the polymerase chain reaction, the T-cell clonal assay, and the quantification of mutation frequency at the hypoxanthine (guanine) phosphoribosyl transferase (HPRT) gene locus after single or combined exposure. In particular, polymerase chain reaction methods using reverse transcriptase have been developed to analyze mutant genes induced by γ-radiation and chemical (pentachlorophenol) agent exposure, and to investigate point mutations at the HPRT gene locus of T-lymphocytes. The HPRT T-cell clonal assay revealed that it could not differentiate γ-irradiation from pentachlorophenol, because the frequency of somatic mutations induced by both damaging agents increased in a dose-dependent manner. The analysis of DNA sequence alterations of HPRT mutant clones clearly showed that the two damaging agents induced different mutational spectra at the HPRT locus of T-cells. Large deletions, which account for 75 percent of the analyzed mutants, are the characteristic mutations induced by γ-irradiation. By contrast, point mutations such as base substitutions and insertions account for 97 percent in the case of pentachlorophenol-treated cells. The point mutation frequencies at the 190 base pair and 444 base pair positions are 3-6 fold higher than those at other mutation positions. It may be that these mutation sites are hot spots induced by pentachlorophenol. These results suggest that the HPRT mutation spectrum can be used as a potential biomarker for assessing a specific environmental risk. (author)

  16. Computer supported qualitative research

    CERN Document Server

    Reis, Luís; Sousa, Francislê; Moreira, António; Lamas, David

    2017-01-01

    This book contains an edited selection of the papers accepted for presentation and discussion at the first International Symposium on Qualitative Research (ISQR2016), held in Porto, Portugal, July 12th-14th, 2016. The book and the symposium feature four main application fields (Education, Health, Social Sciences, and Engineering and Technology) and seven main subjects: Rationale and Paradigms of Qualitative Research (theoretical studies, critical reflection about epistemological, ontological and axiological dimensions); Systematization of approaches with Qualitative Studies (literature review, integrating results, aggregation studies, meta-analysis, meta-analysis of qualitative meta-synthesis, meta-ethnography); Qualitative and Mixed Methods Research (emphasis on research processes that build on mixed methodologies but with priority to qualitative approaches); Data Analysis Types (content analysis, discourse analysis, thematic analysis, narrative analysis, etc.); Innovative processes of Qualitative ...

  17. 7th World Congress on Nature and Biologically Inspired Computing

    CERN Document Server

    Engelbrecht, Andries; Abraham, Ajith; Plessis, Mathys; Snášel, Václav; Muda, Azah

    2016-01-01

    World Congress on Nature and Biologically Inspired Computing (NaBIC) is organized to discuss the state-of-the-art as well as to address various issues with respect to Nurturing Intelligent Computing Towards Advancement of Machine Intelligence. This Volume contains the papers presented in the Seventh World Congress (NaBIC’15) held in Pietermaritzburg, South Africa during December 01-03, 2015. The 39 papers presented in this Volume were carefully reviewed and selected. The Volume would be a valuable reference to researchers, students and practitioners in the computational intelligence field.

  18. Computational Biology Support: RECOMB Conference Series (Conference Support)

    Energy Technology Data Exchange (ETDEWEB)

    Michael Waterman

    2006-06-15

    This funding provided support for student and postdoctoral attendance at the annual RECOMB Conference from 2001 to 2005. The RECOMB Conference series was founded in 1997 to provide a scientific forum for theoretical advances in computational biology and their applications in molecular biology and medicine. The conference series aims at attracting research contributions in all areas of computational molecular biology. Typical, but not exclusive, topics of interest are: genomics, molecular sequence analysis, recognition of genes and regulatory elements, molecular evolution, protein structure, structural genomics, gene expression, gene networks, drug design, combinatorial libraries, computational proteomics, and structural and functional genomics. The origins of the conference lie on the mathematical and computational side of the field, and there remains a certain focus on computational advances. However, the effective application of computational techniques to biological innovation is also an important aspect of the conference. The conference has had a growing number of attendees, topping 300 in recent years and often exceeding 500. The conference program includes between 30 and 40 contributed papers that are selected by an international program committee of around 30 experts during a rigorous review process rivaling the editorial procedure of top-rate scientific journals. In previous years, paper selection has been made from up to 130-200 submissions from well over a dozen countries. 10-page extended abstracts of the contributed papers are collected in a volume published by ACM Press and Springer, and are available at the conference. Full versions of a selection of the papers are published annually in a special issue of the Journal of Computational Biology devoted to the RECOMB Conference. A further point in the program is a lively poster session; from 120 to 300 posters have been presented each year since RECOMB 2000. One of the highlights of each RECOMB conference is a

  19. Multiobjective optimization in bioinformatics and computational biology.

    Science.gov (United States)

    Handl, Julia; Kell, Douglas B; Knowles, Joshua

    2007-01-01

    This paper reviews the application of multiobjective optimization in the fields of bioinformatics and computational biology. A survey of existing work, organized by application area, forms the main body of the review, following an introduction to the key concepts in multiobjective optimization. An original contribution of the review is the identification of five distinct "contexts," giving rise to multiple objectives: These are used to explain the reasons behind the use of multiobjective optimization in each application area and also to point the way to potential future uses of the technique.
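
    The basic building block behind the surveyed methods is Pareto dominance. The short sketch below, a minimal example of our own with invented objective values, extracts the non-dominated (Pareto-optimal) set from a list of candidate solutions under minimization.

    def dominates(a, b):
        """a dominates b if it is no worse in every objective and better in at least one."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def pareto_front(points):
        """Keep only the points that no other point dominates."""
        return [p for p in points
                if not any(dominates(q, p) for q in points if q is not p)]

    # e.g. (model error, model complexity) pairs for hypothetical candidate models
    candidates = [(0.10, 12), (0.08, 20), (0.15, 8), (0.09, 12), (0.20, 30)]
    print(pareto_front(candidates))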

  20. Computer Science Research at Langley

    Science.gov (United States)

    Voigt, S. J. (Editor)

    1982-01-01

    A workshop was held at Langley Research Center, November 2-5, 1981, to highlight ongoing computer science research at Langley and to identify additional areas of research based upon the computer user requirements. A panel discussion was held in each of nine application areas, and these are summarized in the proceedings. Slides presented by the invited speakers are also included. A survey of scientific, business, data reduction, and microprocessor computer users helped identify areas of focus for the workshop. Several areas of computer science which are of most concern to the Langley computer users were identified during the workshop discussions. These include graphics, distributed processing, programmer support systems and tools, database management, and numerical methods.

  1. Structure, function, and behaviour of computational models in systems biology.

    Science.gov (United States)

    Knüpfer, Christian; Beckstein, Clemens; Dittrich, Peter; Le Novère, Nicolas

    2013-05-31

    Systems Biology develops computational models in order to understand biological phenomena. The increasing number and complexity of such "bio-models" necessitate computer support for the overall modelling task. Computer-aided modelling has to be based on a formal semantic description of bio-models. But even if computational bio-models themselves are represented precisely in terms of mathematical expressions, their full meaning is not yet formally specified and is only described in natural language. We present a conceptual framework - the meaning facets - which can be used to rigorously specify the semantics of bio-models. A bio-model has a dual interpretation: on the one hand it is a mathematical expression which can be used in computational simulations (intrinsic meaning); on the other hand the model is related to the biological reality (extrinsic meaning). We show that in both cases this interpretation should be performed from three perspectives: the meaning of the model's components (structure), the meaning of the model's intended use (function), and the meaning of the model's dynamics (behaviour). In order to demonstrate the strengths of the meaning facets framework we apply it to two semantically related models of the cell cycle. Thereby, we make use of existing approaches for computer representation of bio-models as much as possible and sketch the missing pieces. The meaning facets framework provides a systematic in-depth approach to the semantics of bio-models. It can serve two important purposes: first, it specifies and structures the information which biologists have to take into account when they build, use and exchange models; second, because it can be formalised, the framework is a solid foundation for any sort of computer support in bio-modelling. The proposed conceptual framework establishes a new methodology for modelling in Systems Biology and constitutes a basis for computer-aided collaborative research.

  2. Computing chemical organizations in biological networks.

    Science.gov (United States)

    Centler, Florian; Kaleta, Christoph; di Fenizio, Pietro Speroni; Dittrich, Peter

    2008-07-15

    Novel techniques are required to analyze computational models of intracellular processes as they increase steadily in size and complexity. The theory of chemical organizations has recently been introduced as such a technique that links the topology of biochemical reaction network models to their dynamical repertoire. The network is decomposed into algebraically closed and self-maintaining subnetworks called organizations. They form a hierarchy representing all feasible system states, including all steady states. We present three algorithms to compute the hierarchy of organizations for network models provided in SBML format. Two of them compute the complete organization hierarchy, while the third one uses heuristics to obtain a subset of all organizations for large models. While the constructive approach computes the hierarchy starting from the smallest organization in a bottom-up fashion, the flux-based approach employs self-maintaining flux distributions to determine organizations. A runtime comparison on 16 different network models of natural systems showed that neither of the two exhaustive algorithms is superior in all cases. Studying a 'genome-scale' network model with 762 species and 1193 reactions, we demonstrate how the organization hierarchy helps to uncover the model structure and allows one to evaluate the model's quality, for example by detecting components and subsystems of the model whose maintenance is not explained by the model. All data and a Java implementation that plugs into the Systems Biology Workbench are available from http://www.minet.uni-jena.de/csb/prj/ot/tools.
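
    To make the two defining tests concrete, the sketch below checks closure and self-maintenance for subsets of a tiny invented reaction network; it is a didactic reimplementation under our own assumptions, not the authors' published algorithms or tool.

    import numpy as np
    from scipy.optimize import linprog

    species = ["A", "B", "C"]
    # Invented network: inflow of A, reversible A <-> B, B -> C, and decay of C.
    reactions = [
        ({}, {"A": 1}),          # r1: -> A (inflow)
        ({"A": 1}, {"B": 1}),    # r2: A -> B
        ({"B": 1}, {"A": 1}),    # r3: B -> A
        ({"B": 1}, {"C": 1}),    # r4: B -> C
        ({"C": 1}, {}),          # r5: C -> (decay)
    ]

    def closed(subset):
        """Every reaction firable within the subset must keep its products inside it."""
        for reactants, products in reactions:
            if set(reactants) <= subset and not set(products) <= subset:
                return False
        return True

    def self_maintaining(subset):
        """Look for strictly positive fluxes on all applicable reactions such that
        no species of the subset has negative net production (N v >= 0 row-wise)."""
        active = [i for i, (r, _) in enumerate(reactions) if set(r) <= subset]
        if not active:
            return True
        idx = {s: k for k, s in enumerate(species)}
        N = np.zeros((len(species), len(active)))
        for col, i in enumerate(active):
            reactants, products = reactions[i]
            for s, n in reactants.items():
                N[idx[s], col] -= n
            for s, n in products.items():
                N[idx[s], col] += n
        rows = [idx[s] for s in subset]
        # Feasibility LP: zero objective, -N v <= 0 on the subset's rows, 1 <= v <= 1000.
        res = linprog(c=np.zeros(len(active)),
                      A_ub=-N[rows, :], b_ub=np.zeros(len(rows)),
                      bounds=[(1.0, 1000.0)] * len(active), method="highs")
        return res.success

    for candidate in [set(), {"A"}, {"A", "B"}, {"A", "B", "C"}]:
        is_org = closed(candidate) and self_maintaining(candidate)
        print(sorted(candidate), "organization" if is_org else "not an organization")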

  3. Transportation Research & Analysis Computing Center

    Data.gov (United States)

    Federal Laboratory Consortium — The technical objectives of the TRACC project included the establishment of a high performance computing center for use by USDOT research teams, including those from...

  4. Structural Biology and Molecular Applications Research

    Science.gov (United States)

    Part of NCI's Division of Cancer Biology's research portfolio, research and development in this area focuses on enabling technologies, models, and methodologies to support basic and applied cancer research.

  5. Computation: A New Open Access Journal of Computational Chemistry, Computational Biology and Computational Engineering

    Directory of Open Access Journals (Sweden)

    Karlheinz Schwarz

    2013-09-01

    Full Text Available Computation (ISSN 2079-3197; http://www.mdpi.com/journal/computation) is an international scientific open access journal focusing on fundamental work in the field of computational science and engineering. Computational science has become essential in many research areas by contributing to solving complex problems in fundamental science all the way to engineering. The very broad range of application domains suggests structuring this journal into three sections, which are briefly characterized below. In each section a further focus will be provided by occasionally organizing special issues on topics of high interest, collecting papers on fundamental work in the field. More applied papers should be submitted to their corresponding specialist journals. To help us achieve our goal with this journal, we have an excellent editorial board to advise us on the exciting current and future trends in computation from methodology to application. We very much look forward to hearing all about the research going on across the world. [...

  6. Computer science and operations research

    CERN Document Server

    Balci, Osman

    1992-01-01

    The interface of Operations Research and Computer Science - although elusive to a precise definition - has been a fertile area of both methodological and applied research. The papers in this book, written by experts in their respective fields, convey the current state of the art in this interface across a broad spectrum of research domains which include optimization techniques, linear programming, interior point algorithms, networks, computer graphics in operations research, parallel algorithms and implementations, planning and scheduling, genetic algorithms, heuristic search techniques and dat

  7. Uncertainty in biology a computational modeling approach

    CERN Document Server

    Gomez-Cabrero, David

    2016-01-01

    Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling makes it possible to reduce, refine and replace animal experimentation as well as to translate findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building and validation process. This book addresses four main issues related to the building and validation of computational models of biomedical processes: modeling establishment under uncertainty; model selection and parameter fitting; sensitivity analysis and model adaptation; and model predictions under uncertainty. In each of the abovementioned areas, the book discusses a number of key techniques by means of a general theoretical description followed by one or more practical examples. This book is intended for graduate stude...

  8. Biological and Environmental Research Network Requirements

    Energy Technology Data Exchange (ETDEWEB)

    Balaji, V. [Princeton Univ., NJ (United States). Earth Science Grid Federation (ESGF); Boden, Tom [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Cowley, Dave [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dart, Eli [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). ESNet; Dattoria, Vince [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). ESNet; Desai, Narayan [Argonne National Lab. (ANL), Argonne, IL (United States); Egan, Rob [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Foster, Ian [Argonne National Lab. (ANL), Argonne, IL (United States); Goldstone, Robin [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gregurick, Susan [U.S. Dept. of Energy, Washington, DC (United States). Biological Systems Science Division; Houghton, John [U.S. Dept. of Energy, Washington, DC (United States). Biological and Environmental Research (BER) Program; Izaurralde, Cesar [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Johnston, Bill [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). ESNet; Joseph, Renu [U.S. Dept. of Energy, Washington, DC (United States). Climate and Environmental Sciences Division; Kleese-van Dam, Kerstin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lipton, Mary [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Monga, Inder [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). ESNet; Pritchard, Matt [British Atmospheric Data Centre (BADC), Oxon (United Kingdom); Rotman, Lauren [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). ESNet; Strand, Gary [National Center for Atmospheric Research (NCAR), Boulder, CO (United States); Stuart, Cory [Argonne National Lab. (ANL), Argonne, IL (United States); Tatusova, Tatiana [National Inst. of Health (NIH), Bethesda, MD (United States); Tierney, Brian [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). ESNet; Thomas, Brian [Univ. of California, Berkeley, CA (United States); Williams, Dean N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Zurawski, Jason [Internet2, Washington, DC (United States)

    2013-09-01

    The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the U.S. Department of Energy (DOE) Office of Science (SC), the single largest supporter of basic research in the physical sciences in the United States. In support of SC programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet be a highly successful enabler of scientific discovery for over 25 years. In November 2012, ESnet and the Office of Biological and Environmental Research (BER) of the DOE SC organized a review to characterize the networking requirements of the programs funded by the BER program office. Several key findings resulted from the review. Among them: 1) The scale of data sets available to science collaborations continues to increase exponentially. This has broad impact, both on the network and on the computational and storage systems connected to the network. 2) Many science collaborations require assistance to cope with the systems and network engineering challenges inherent in managing the rapid growth in data scale. 3) Several science domains operate distributed facilities that rely on high-performance networking for success. Key examples illustrated in this report include the Earth System Grid Federation (ESGF) and the Systems Biology Knowledgebase (KBase). This report expands on these points, and addresses others as well. The report contains a findings section as well as the text of the case studies discussed at the review.

  9. Computational mechanics research at ONR

    International Nuclear Information System (INIS)

    Kushner, A.S.

    1986-01-01

    Computational mechanics is not an identified program at the Office of Naval Research (ONR), but rather plays a key role in the Solid Mechanics, Fluid Mechanics, Energy Conversion, and Materials Science programs. The basic philosophy of the Mechanics Division at ONR is to support fundamental research which expands the basis for understanding, predicting, and controlling the behavior of solid and fluid materials and systems at the physical and geometric scales appropriate to the phenomena of interest. It is shown in this paper that a strong commonality of computational mechanics drivers exists for the forefront research areas in both solid and fluid mechanics.

  10. The Systems Biology Research Tool: evolvable open-source software

    Directory of Open Access Journals (Sweden)

    Wright Jeremiah

    2008-06-01

    Full Text Available Abstract Background: Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions on the system level. Results: We introduce a free, easy-to-use, open-source, integrated software platform called the Systems Biology Research Tool (SBRT) to facilitate the computational aspects of systems biology. The SBRT currently performs 35 methods for analyzing stoichiometric networks and 16 methods from fields such as graph theory, geometry, algebra, and combinatorics. New computational techniques can be added to the SBRT via process plug-ins, providing a high degree of evolvability and a unifying framework for software development in systems biology. Conclusion: The Systems Biology Research Tool represents a technological advance for systems biology. This software can be used to make sophisticated computational techniques accessible to everyone (including those with no programming ability), to facilitate cooperation among researchers, and to expedite progress in the field of systems biology.
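
    As one small example of the stoichiometric network analyses such a toolkit performs, the sketch below computes conservation relations (the left null space of the stoichiometric matrix) for an invented Michaelis-Menten scheme; the network and the use of SciPy here are our own assumptions, not SBRT code.

    import numpy as np
    from scipy.linalg import null_space

    # Rows = species (E, S, ES, P); columns = reactions of a Michaelis-Menten scheme:
    #   r1: E + S -> ES,   r2: ES -> E + S,   r3: ES -> E + P
    N = np.array([
        [-1,  1,  1],   # E
        [-1,  1,  0],   # S
        [ 1, -1, -1],   # ES
        [ 0,  0,  1],   # P
    ])

    # Vectors c with c^T N = 0 define conserved sums; here they span E + ES = const
    # and S + ES + P = const (the computed basis may mix these two relations).
    conservation = null_space(N.T).T
    for c in conservation:
        print(np.round(c / np.abs(c).max(), 3))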

  11. Haldane's Contributions to Biological Research in India

    Indian Academy of Sciences (India)

    and Industrial Research, New Delhi, he moved to Bhubaneswar to start his own ... Brown, Foreign Secretary, US National Academy of Sciences, in 1964, upon ... lectures contained new ideas for biological research that could be conducted in ...

  12. Computational Biology and the Limits of Shared Vision

    DEFF Research Database (Denmark)

    Carusi, Annamaria

    2011-01-01

    of cases is necessary in order to gain a better perspective on social sharing of practices, and on what other factors this sharing is dependent upon. The article presents the case of currently emerging inter-disciplinary visual practices in the domain of computational biology, where the sharing of visual ... practices would be beneficial to the collaborations necessary for the research. Computational biology includes sub-domains where visual practices are coming to be shared across disciplines, and those where this is not occurring, and where the practices of others are resisted. A significant point ..., its domain of study. Social practices alone are not sufficient to account for the shaping of evidence. The philosophy of Merleau-Ponty is introduced as providing an alternative framework for thinking of the complex inter-relations between all of these factors. This philosophy enables us

  13. 9th International Conference on Practical Applications of Computational Biology and Bioinformatics

    CERN Document Server

    Rocha, Miguel; Fdez-Riverola, Florentino; Paz, Juan

    2015-01-01

    These proceedings present recent practical applications of Computational Biology and Bioinformatics. The volume contains the proceedings of the 9th International Conference on Practical Applications of Computational Biology & Bioinformatics, held at the University of Salamanca, Spain, on June 3rd-5th, 2015. The International Conference on Practical Applications of Computational Biology & Bioinformatics (PACBB) is an annual international meeting dedicated to emerging and challenging applied research in Bioinformatics and Computational Biology. Biological and biomedical research are increasingly driven by experimental techniques that challenge our ability to analyse, process and extract meaningful knowledge from the underlying data. The impressive capabilities of next generation sequencing technologies, together with novel and ever evolving distinct types of omics data technologies, have posed an increasingly complex set of challenges for the growing fields of Bioinformatics and Computational Biology. The analysis o...

  14. Crosscut report: Exascale Requirements Reviews, March 9–10, 2017 – Tysons Corner, Virginia. An Office of Science review sponsored by: Advanced Scientific Computing Research, Basic Energy Sciences, Biological and Environmental Research, Fusion Energy Sciences, High Energy Physics, Nuclear Physics

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, Richard [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC); Hack, James [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility (OLCF); Riley, Katherine [Argonne National Lab., IL (United States). Argonne Leadership Computing Facility (ALCF); Antypas, Katie [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC); Coffey, Richard [Argonne National Lab. (ANL), Argonne, IL (United States). Argonne Leadership Computing Facility (ALCF); Dart, Eli [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). ESnet; Straatsma, Tjerk [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility (OLCF); Wells, Jack [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility (OLCF); Bard, Deborah [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC); Dosanjh, Sudip [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC); Monga, Inder [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). ESnet; Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States). Argonne Leadership Computing Facility; Rotman, Lauren [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). ESnet

    2018-01-22

    The mission of the U.S. Department of Energy Office of Science (DOE SC) is the delivery of scientific discoveries and major scientific tools to transform our understanding of nature and to advance the energy, economic, and national security missions of the United States. To achieve these goals in today's world requires investments in not only the traditional scientific endeavors of theory and experiment, but also in computational science and the facilities that support large-scale simulation and data analysis. The Advanced Scientific Computing Research (ASCR) program addresses these challenges in the Office of Science. ASCR's mission is to discover, develop, and deploy computational and networking capabilities to analyze, model, simulate, and predict complex phenomena important to DOE. ASCR supports research in computational science, three high-performance computing (HPC) facilities — the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory and Leadership Computing Facilities at Argonne (ALCF) and Oak Ridge (OLCF) National Laboratories — and the Energy Sciences Network (ESnet) at Berkeley Lab. ASCR is guided by science needs as it develops research programs, computers, and networks at the leading edge of technologies. As we approach the era of exascale computing, technology changes are creating challenges for SC science programs that need to use high-performance computing and data systems effectively. Numerous significant modifications to today's tools and techniques will be needed to realize the full potential of emerging computing systems and other novel computing architectures. To assess these needs and challenges, ASCR held a series of Exascale Requirements Reviews in 2015–2017, one with each of the six SC program offices, and a subsequent Crosscut Review that sought to integrate the findings from each. Participants at the reviews were drawn from the communities of leading domain

  15. From biological neural networks to thinking machines: Transitioning biological organizational principles to computer technology

    Science.gov (United States)

    Ross, Muriel D.

    1991-01-01

    The three-dimensional organization of the vestibular macula is under study by computer-assisted reconstruction and simulation methods as a model for more complex neural systems. One goal of this research is to transition knowledge of biological neural network architecture and functioning to computer technology, to contribute to the development of thinking computers. Maculas are organized as weighted neural networks for parallel distributed processing of information. The network is characterized by non-linearity of its terminal/receptive fields. Wiring appears to develop through constrained randomness. A further property is the presence of two main circuits, highly channeled and distributed modifying, that are connected through feedforward-feedback collaterals and a biasing subcircuit. Computer simulations demonstrate that differences in the geometry of the feedback (afferent) collaterals affect the timing and magnitude of voltage changes delivered to the spike initiation zone. Feedforward (efferent) collaterals act as voltage followers and likely inhibit neurons of the distributed modifying circuit. These results illustrate the importance of feedforward-feedback loops, of timing, and of inhibition in refining neural network output. They also suggest that it is the distributed modifying network that is most involved in adaptation, memory, and learning. Tests of macular adaptation, through hyper- and microgravitational studies, support this hypothesis since synapses in the distributed modifying circuit, but not the channeled circuit, are altered. Transitioning knowledge of biological systems to computer technology, however, remains problematic.

  16. Chinese Herbal Medicine Meets Biological Networks of Complex Diseases: A Computational Perspective

    OpenAIRE

    Shuo Gu; Jianfeng Pei

    2017-01-01

    With the rapid development of cheminformatics, computational biology, and systems biology, great progress has been made recently in the computational research of Chinese herbal medicine with in-depth understanding towards pharmacognosy. This paper summarized these studies in the aspects of computational methods, traditional Chinese medicine (TCM) compound databases, and TCM network pharmacology. Furthermore, we chose arachidonic acid metabolic network as a case study to demonstrate the regula...

  17. South African antarctic biological research programme

    CSIR Research Space (South Africa)

    SASCAR

    1981-07-01

    This document provides a description of the past, current and planned South African biological research activities in the sub-Antarctic and Antarctic regions. Future activities will fall under one of the five components of the research programme...

  18. Computing Platforms for Big Biological Data Analytics: Perspectives and Challenges.

    Science.gov (United States)

    Yin, Zekun; Lan, Haidong; Tan, Guangming; Lu, Mian; Vasilakos, Athanasios V; Liu, Weiguo

    2017-01-01

    The last decade has witnessed an explosion in the amount of available biological sequence data, due to the rapid progress of high-throughput sequencing projects. However, the amount of biological data has become so great that traditional data analysis platforms and methods can no longer meet the need to rapidly perform data analysis tasks in the life sciences. As a result, both biologists and computer scientists are facing the challenge of gaining a profound insight into the deepest biological functions from big biological data. This in turn requires massive computational resources. Therefore, high performance computing (HPC) platforms are needed, along with efficient and scalable algorithms that can take advantage of them. In this paper, we survey the state-of-the-art HPC platforms for big biological data analytics. We first list the characteristics of big biological data and popular computing platforms. Then we provide a taxonomy of different biological data analysis applications and a survey of the way they have been mapped onto various computing platforms. After that, we present a case study to compare the efficiency of different computing platforms for handling the classical biological sequence alignment problem. Finally, we discuss the open issues in big biological data analytics.
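
    As a concrete reference point for the "classical biological sequence alignment problem" used in the case study above, here is a minimal Smith-Waterman local-alignment scorer in Python. The scoring parameters are illustrative defaults, not those used in the paper, and the sketch computes only the optimal score rather than any platform-level benchmark.

        # Smith-Waterman local alignment, score only (no traceback), as a small worked example.
        def smith_waterman_score(a: str, b: str, match=2, mismatch=-1, gap=-2) -> int:
            rows, cols = len(a) + 1, len(b) + 1
            H = [[0] * cols for _ in range(rows)]   # dynamic-programming matrix
            best = 0
            for i in range(1, rows):
                for j in range(1, cols):
                    diag = H[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
                    H[i][j] = max(0, diag, H[i - 1][j] + gap, H[i][j - 1] + gap)
                    best = max(best, H[i][j])
            return best

        if __name__ == "__main__":
            # Toy sequences chosen only for illustration.
            print(smith_waterman_score("ACACACTA", "AGCACACA"))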

  19. Biocellion: accelerating computer simulation of multicellular biological system models.

    Science.gov (United States)

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-11-01

    Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling in the function bodies of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Biocellion runs on x86-compatible systems with the 64-bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information.
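
    The "fill in the function body of pre-defined model routines" pattern can be illustrated with a toy framework in Python. Biocellion itself exposes C++ routines and runs on parallel hardware; the class, hook names, and random-walk rule below are hypothetical stand-ins for that interface.

        import random

        class ToyFramework:
            """Miniature agent-based simulator: the framework owns the loop, the user supplies the rule."""
            def __init__(self, cells, update_cell):
                self.cells = cells                  # list of (x, y, cell_type) agents
                self.update_cell = update_cell      # user-supplied "model routine"

            def step(self):
                self.cells = [self.update_cell(c, self.cells) for c in self.cells]

        # The routine a modeler would "fill in": here a random walk stands in for
        # mechanics such as adhesion-driven cell sorting.
        def my_update(cell, all_cells):
            x, y, kind = cell
            return (x + random.choice((-1, 0, 1)), y + random.choice((-1, 0, 1)), kind)

        if __name__ == "__main__":
            sim = ToyFramework([(i, 0, "A") for i in range(5)], my_update)
            for _ in range(10):
                sim.step()
            print(sim.cells)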

  20. Research on cloud computing solutions

    OpenAIRE

    Liudvikas Kaklauskas; Vaida Zdanytė

    2015-01-01

    Cloud computing can be defined as a new style of computing in which dynamically scalable and often virtualized resources are provided as services over the Internet. Advantages of the cloud computing technology include cost savings, high availability, and easy scalability. Voas and Zhang adapted six phases of computing paradigms, from dummy terminals/mainframes, to PCs, to networking computing, to grid and cloud computing. There are four types of cloud computing: public cloud, private cloud, ...

  1. Radioactive 63Ni in biological research

    International Nuclear Information System (INIS)

    Kasprzak, K.S.; Sunderman, F.W. Jr.

    1979-01-01

    Applications of 63Ni in biological research are reviewed, with emphasis upon recent investigations of nickel metabolism and toxicology in experimental animals. The radiochemistry of 63Ni is summarized, including consideration of the preparation of certain 63Ni compounds (e.g. 63Ni(CO)4 and 63Ni3S2) that are of current interest in toxicology, teratology and cancer research. Practical guidance is given regarding the detection and determination of 63Ni in biological materials by autoradiography and liquid scintillation spectrometry. (author)

  2. 75 FR 6651 - Biological and Environmental Research Advisory Committee

    Science.gov (United States)

    2010-02-10

    ... DEPARTMENT OF ENERGY Biological and Environmental Research Advisory Committee AGENCY: Department... meeting of the Biological and Environmental Research Advisory Committee (BERAC). Federal Advisory.... Department of Energy, Office of Science, Office of Biological and Environmental Research, SC-23/Germantown...

  3. 77 FR 4028 - Biological and Environmental Research Advisory Committee

    Science.gov (United States)

    2012-01-26

    ... DEPARTMENT OF ENERGY Biological and Environmental Research Advisory Committee AGENCY: Department... meeting of the Biological and Environmental Research Advisory Committee (BERAC). The Federal Advisory.... Department of Energy, Office of Science, Office of Biological and Environmental Research, SC-23/Germantown...

  4. Center for Computing Research Summer Research Proceedings 2015.

    Energy Technology Data Exchange (ETDEWEB)

    Bradley, Andrew Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Parks, Michael L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-18

    The Center for Computing Research (CCR) at Sandia National Laboratories organizes a summer student program, in coordination with the Computer Science Research Institute (CSRI) and the Cyber Engineering Research Institute (CERI).

  5. Applications of membrane computing in systems and synthetic biology

    CERN Document Server

    Gheorghe, Marian; Pérez-Jiménez, Mario

    2014-01-01

    Membrane Computing was introduced as a computational paradigm in Natural Computing. The models introduced, called Membrane (or P) Systems, provide a coherent platform to describe and study living cells as computational systems. Membrane Systems have been investigated for their computational aspects and employed to model problems in other fields, like: Computer Science, Linguistics, Biology, Economy, Computer Graphics, Robotics, etc. Their inherent parallelism, heterogeneity and intrinsic versatility allow them to model a broad range of processes and phenomena, and they are also an efficient means of solving and analyzing problems in a novel way. Membrane Computing has been used to model biological systems, over time becoming a mature modeling paradigm comparable, in its modeling and predictive capabilities, to more established models in this area. This book is the result of the need to collect, in an organic way, different facets of this paradigm. The chapters of this book, together with the web pages accompanying th...
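
    A hedged sketch of the core mechanism behind Membrane (P) Systems, multiset rewriting inside a membrane, may help make the paradigm concrete. The rules below and the simplified treatment of parallel rule application are invented for illustration and omit membrane hierarchy, communication, and dissolution.

        from collections import Counter

        # Rewriting rules: left-hand multiset -> right-hand multiset (hypothetical examples).
        RULES = [
            (Counter("ab"), Counter("c")),   # a + b -> c
            (Counter("c"),  Counter("aa")),  # c -> a + a
        ]

        def step(contents: Counter) -> Counter:
            """Apply each rule as many times as its left-hand side fits (a simplified stand-in
            for the maximally parallel, nondeterministic application of real P systems)."""
            out = Counter(contents)
            for lhs, rhs in RULES:
                times = min(out[sym] // n for sym, n in lhs.items())
                if times:
                    for sym, n in lhs.items():
                        out[sym] -= n * times
                    for sym, n in rhs.items():
                        out[sym] += n * times
            return out

        if __name__ == "__main__":
            membrane = Counter("aaabb")      # initial objects inside a single membrane
            for t in range(4):
                print(t, dict(membrane))
                membrane = step(membrane)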

  6. Senior Computational Scientist | Center for Cancer Research

    Science.gov (United States)

    The Basic Science Program (BSP) pursues independent, multidisciplinary research in basic and applied molecular biology, immunology, retrovirology, cancer biology, and human genetics. Research efforts and support are an integral part of the Center for Cancer Research (CCR) at the Frederick National Laboratory for Cancer Research (FNLCR). The Cancer & Inflammation Program (CIP),

  7. Deep Learning and Applications in Computational Biology

    KAUST Repository

    Zeng, Jianyang

    2016-01-01

    -transcriptional gene regulation. Though numerous computational methods have been developed for modeling RBP binding preferences, discovering a complete structural representation of the RBP targets by integrating their available structural features in all three

  8. Community-driven computational biology with Debian Linux.

    Science.gov (United States)

    Möller, Steffen; Krabbenhöft, Hajo Nils; Tille, Andreas; Paleino, David; Williams, Alan; Wolstencroft, Katy; Goble, Carole; Holland, Richard; Belhachemi, Dominique; Plessy, Charles

    2010-12-21

    The Open Source movement and its technologies are popular in the bioinformatics community because they provide freely available tools and resources for research. In order to feed the steady demand for updates on software and associated data, a service infrastructure is required for sharing and providing these tools to heterogeneous computing environments. The Debian Med initiative provides ready and coherent software packages for medical informatics and bioinformatics. These packages can be used together in Taverna workflows via the UseCase plugin to manage execution on local or remote machines. If such packages are available in cloud computing environments, the underlying hardware and the analysis pipelines can be shared along with the software. Debian Med closes the gap between developers and users. It provides a simple method for offering new releases of software and data resources, thus provisioning a local infrastructure for computational biology. For geographically distributed teams it can ensure they are working on the same versions of tools, in the same conditions. This contributes to the world-wide networking of researchers.

  9. Research Computing and Data for Geoscience

    OpenAIRE

    Smith, Preston

    2015-01-01

    This presentation will discuss the data storage and computational resources available for GIS researchers at Purdue.

  10. 6th International Conference on Practical Applications of Computational Biology & Bioinformatics

    CERN Document Server

    Luscombe, Nicholas; Fdez-Riverola, Florentino; Rodríguez, Juan; Practical Applications of Computational Biology & Bioinformatics

    2012-01-01

    The growth in the Bioinformatics and Computational Biology fields over the last few years has been remarkable. The analysis of the datasets of Next Generation Sequencing needs new algorithms and approaches from fields such as Databases, Statistics, Data Mining, Machine Learning, Optimization, Computer Science and Artificial Intelligence. Systems Biology has also been emerging as an alternative to the reductionist view that dominated biological research in the last decades. This book presents the results of the 6th International Conference on Practical Applications of Computational Biology & Bioinformatics, held at the University of Salamanca, Spain, on 28-30th March 2012, which brought together interdisciplinary scientists who have a strong background in the biological and computational sciences.

  11. 7th International Conference on Practical Applications of Computational Biology & Bioinformatics

    CERN Document Server

    Nanni, Loris; Rocha, Miguel; Fdez-Riverola, Florentino

    2013-01-01

    The growth in the Bioinformatics and Computational Biology fields over the last few years has been remarkable and the trend is for this pace to increase. In fact, the need for computational techniques that can efficiently handle the huge amounts of data produced by the new experimental techniques in Biology is still increasing, driven by new advances in Next Generation Sequencing, several types of so-called omics data and image acquisition, just to name a few. The analysis of the datasets produced and their integration call for new algorithms and approaches from fields such as Databases, Statistics, Data Mining, Machine Learning, Optimization, Computer Science and Artificial Intelligence. Within this scenario of increasing data availability, Systems Biology has also been emerging as an alternative to the reductionist view that dominated biological research in the last decades. Indeed, Biology is more and more a science of information requiring tools from the computational sciences. In the last few years, we ...

  12. Modelling, abstraction, and computation in systems biology: A view from computer science.

    Science.gov (United States)

    Melham, Tom

    2013-04-01

    Systems biology is centrally engaged with computational modelling across multiple scales and at many levels of abstraction. Formal modelling, precise and formalised abstraction relationships, and computation also lie at the heart of computer science--and over the past decade a growing number of computer scientists have been bringing their discipline's core intellectual and computational tools to bear on biology in fascinating new ways. This paper explores some of the apparent points of contact between the two fields, in the context of a multi-disciplinary discussion on conceptual foundations of systems biology.

  13. 8th International Conference on Practical Applications of Computational Biology & Bioinformatics

    CERN Document Server

    Rocha, Miguel; Fdez-Riverola, Florentino; Santana, Juan

    2014-01-01

    Biological and biomedical research are increasingly driven by experimental techniques that challenge our ability to analyse, process and extract meaningful knowledge from the underlying data. The impressive capabilities of next generation sequencing technologies, together with novel and ever evolving distinct types of omics data technologies, have posed an increasingly complex set of challenges for the growing fields of Bioinformatics and Computational Biology. The analysis of the datasets produced and their integration call for new algorithms and approaches from fields such as Databases, Statistics, Data Mining, Machine Learning, Optimization, Computer Science and Artificial Intelligence. Clearly, Biology is more and more a science of information requiring tools from the computational sciences. In the last few years, we have seen the surge of a new generation of interdisciplinary scientists who have a strong background in the biological and computational sciences. In this context, the interaction of researche...

  14. 10th International Conference on Practical Applications of Computational Biology & Bioinformatics

    CERN Document Server

    Rocha, Miguel; Fdez-Riverola, Florentino; Mayo, Francisco; Paz, Juan

    2016-01-01

    Biological and biomedical research are increasingly driven by experimental techniques that challenge our ability to analyse, process and extract meaningful knowledge from the underlying data. The impressive capabilities of next generation sequencing technologies, together with novel and ever evolving distinct types of omics data technologies, have posed an increasingly complex set of challenges for the growing fields of Bioinformatics and Computational Biology. The analysis of the datasets produced and their integration call for new algorithms and approaches from fields such as Databases, Statistics, Data Mining, Machine Learning, Optimization, Computer Science and Artificial Intelligence. Clearly, Biology is more and more a science of information requiring tools from the computational sciences. In the last few years, we have seen the surge of a new generation of interdisciplinary scientists who have a strong background in the biological and computational sciences. In this context, the interaction of researche...

  15. Biologically Inspired Micro-Flight Research

    Science.gov (United States)

    Raney, David L.; Waszak, Martin R.

    2003-01-01

    Natural fliers demonstrate a diverse array of flight capabilities, many of which are poorly understood. NASA has established a research project to explore and exploit flight technologies inspired by biological systems. One part of this project focuses on dynamic modeling and control of micro aerial vehicles that incorporate flexible wing structures inspired by natural fliers such as insects, hummingbirds and bats. With a vast number of potential civil and military applications, micro aerial vehicles represent an emerging sector of the aerospace market. This paper describes an ongoing research activity in which mechanization and control concepts for biologically inspired micro aerial vehicles are being explored. Research activities focusing on a flexible fixed-wing micro aerial vehicle design and a flapping-based micro aerial vehicle concept are presented.

  16. The computational linguistics of biological sequences

    Energy Technology Data Exchange (ETDEWEB)

    Searls, D. [Univ. of Pennsylvania, Philadelphia, PA (United States)

    1995-12-31

    This tutorial was one of eight tutorials selected to be presented at the Third International Conference on Intelligent Systems for Molecular Biology which was held in the United Kingdom from July 16 to 19, 1995. Protein sequences are analogous in many respects, particularly in their folding behavior. Proteins have a much richer variety of interactions, but in theory the same linguistic principles could come to bear in describing dependencies between distant residues that arise by virtue of three-dimensional structure. This tutorial will concentrate on nucleic acid sequences.

  17. Computational brain models: Advances from system biology and future challenges

    Directory of Open Access Journals (Sweden)

    George E. Barreto

    2015-02-01

    Computational brain models focused on the interactions between neurons and astrocytes, modeled via metabolic reconstructions, are reviewed. The large body of experimental data provided by the -omics techniques and the advance and application of computational and data-management tools have been fundamental, for instance, in understanding the crosstalk between these cells, the key neuroprotective mechanisms mediated by astrocytes in specific metabolic scenarios (1), and the identification of biomarkers for neurodegenerative diseases (2,3). However, the modeling of these interactions demands a clear view of the metabolic and signaling pathways implicated, most of which are controversial and still under evaluation (4). Hence, to gain insight into the complexity of these interactions, a current view of the main pathways implicated in neuron-astrocyte communication has been assembled from recent experimental reports and reviews. Furthermore, target problems, limitations and main conclusions have been identified from metabolic models of the brain reported since 2010. Finally, key aspects to take into account in the development of a computational model of the brain, and topics that could be approached from a systems biology perspective in future research, are highlighted.
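
    To make "metabolic reconstruction" concrete, the following toy flux balance analysis solves for steady-state fluxes that maximise a biomass-like objective under stoichiometric and capacity constraints. The two-metabolite network, bounds and objective are invented for illustration and assume SciPy is available; real neuron-astrocyte reconstructions involve hundreds of reactions.

        import numpy as np
        from scipy.optimize import linprog

        # Rows: metabolites A, B.  Columns: uptake (-> A), conversion (A -> B), biomass drain (B ->).
        S = np.array([[ 1, -1,  0],
                      [ 0,  1, -1]])

        bounds = [(0, 10), (0, None), (0, None)]   # uptake capped at 10 flux units
        c = [0, 0, -1]                             # linprog minimises, so maximise biomass via -1

        # Steady state: S v = 0; optimise within the flux bounds.
        res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
        print("optimal fluxes:", res.x)            # expected: all three fluxes at the uptake limit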

  18. 78 FR 6087 - Biological and Environmental Research Advisory Committee

    Science.gov (United States)

    2013-01-29

    ... DEPARTMENT OF ENERGY Biological and Environmental Research Advisory Committee AGENCY: Office of... the Biological and Environmental Research Advisory Committee (BERAC). The Federal Advisory Committee... Federal Officer, BERAC, U.S. Department of Energy, Office of Science, Office of Biological and...

  19. Chinese Herbal Medicine Meets Biological Networks of Complex Diseases: A Computational Perspective

    Directory of Open Access Journals (Sweden)

    Shuo Gu

    2017-01-01

    With the rapid development of cheminformatics, computational biology, and systems biology, great progress has been made recently in the computational research of Chinese herbal medicine with in-depth understanding towards pharmacognosy. This paper summarized these studies in the aspects of computational methods, traditional Chinese medicine (TCM) compound databases, and TCM network pharmacology. Furthermore, we chose arachidonic acid metabolic network as a case study to demonstrate the regulatory function of herbal medicine in the treatment of inflammation at network level. Finally, a computational workflow for the network-based TCM study, derived from our previous successful applications, was proposed.

  20. Chinese Herbal Medicine Meets Biological Networks of Complex Diseases: A Computational Perspective.

    Science.gov (United States)

    Gu, Shuo; Pei, Jianfeng

    2017-01-01

    With the rapid development of cheminformatics, computational biology, and systems biology, great progress has been made recently in the computational research of Chinese herbal medicine with in-depth understanding towards pharmacognosy. This paper summarized these studies in the aspects of computational methods, traditional Chinese medicine (TCM) compound databases, and TCM network pharmacology. Furthermore, we chose arachidonic acid metabolic network as a case study to demonstrate the regulatory function of herbal medicine in the treatment of inflammation at network level. Finally, a computational workflow for the network-based TCM study, derived from our previous successful applications, was proposed.

  1. Using a Computer Animation to Teach High School Molecular Biology

    Science.gov (United States)

    Rotbain, Yosi; Marbach-Ad, Gili; Stavy, Ruth

    2008-01-01

    We present an active way to use a computer animation in a secondary-school molecular genetics class. For this purpose we developed an activity booklet that helps students to work interactively with a computer animation which deals with abstract concepts and processes in molecular biology. The achievements of the experimental group were compared with those…

  2. Research on cloud computing solutions

    Directory of Open Access Journals (Sweden)

    Liudvikas Kaklauskas

    2015-07-01

    Cloud computing can be defined as a new style of computing in which dynamically scalable and often virtualized resources are provided as services over the Internet. Advantages of the cloud computing technology include cost savings, high availability, and easy scalability. Voas and Zhang adapted six phases of computing paradigms, from dummy terminals/mainframes, to PCs, to networking computing, to grid and cloud computing. There are four types of cloud computing: public cloud, private cloud, hybrid cloud and community cloud. The most common and well-known deployment model is the public cloud. A private cloud is suited for sensitive data, where the customer depends on a certain degree of security. According to the different types of services offered, cloud computing can be considered to consist of three layers (service models): IaaS (infrastructure as a service), PaaS (platform as a service), and SaaS (software as a service). The main cloud computing solutions are web applications, data hosting, virtualization, database clusters and terminal services. The advantage of cloud computing is the ability to virtualize and share resources among different applications for better server utilization; without a clustering solution, a service may fail the moment the server crashes. DOI: 10.15181/csat.v2i2.914

  3. Computer graphics and research projects

    International Nuclear Information System (INIS)

    Ingtrakul, P.

    1994-01-01

    This report was prepared as an account of scientific visualization tools and application tools for scientists and engineers. It provides a set of tools to create pictures and to interact with them in natural ways. It applies many techniques of computer graphics and computer animation through a number of full-color presentations, such as computer-animated commercials, 3D computer graphics, dynamic and environmental simulations, scientific modeling and visualization, physically based modelling, and behavioral, skeletal, dynamics, and particle animation. It also takes an in-depth look at the original hardware and the limitations of existing PC graphics adapters that constrain system performance, especially with graphics-intensive application programs and user interfaces.

  4. Onchocerciasis control: biological research is still needed

    Directory of Open Access Journals (Sweden)

    Boussinesq M.

    2008-09-01

    Achievements obtained by the onchocerciasis control programmes should not lead to a relaxation in biological research on Onchocerca volvulus. Issues such as the Loa loa-related post-ivermectin serious adverse events, the uncertainties as to whether onchocerciasis can be eliminated by ivermectin treatments, and the possible emergence of ivermectin-resistant O. volvulus populations should be addressed proactively. Doxycycline, moxidectin and emodepside appear to be promising as alternative drugs against onchocerciasis, but support for research in immunology and genomics should also be increased to develop new control tools, including both vaccines and macrofilaricidal drugs.

  5. Biological effectiveness of neutrons: Research needs

    Energy Technology Data Exchange (ETDEWEB)

    Casarett, G.W.; Braby, L.A.; Broerse, J.J.; Elkind, M.M.; Goodhead, D.T.; Oleinick, N.L.

    1994-02-01

    The goal of this report was to provide a conceptual plan for a research program that would provide a basis for determining more precisely the biological effectiveness of neutron radiation with emphasis on endpoints relevant to the protection of human health. This report presents the findings of the experts for seven particular categories of scientific information on neutron biological effectiveness. Chapter 2 examines the radiobiological mechanisms underlying the assumptions used to estimate human risk from neutrons and other radiations. Chapter 3 discusses the qualitative and quantitative models used to organize and evaluate experimental observations and to provide extrapolations where direct observations cannot be made. Chapter 4 discusses the physical principles governing the interaction of radiation with biological systems and the importance of accurate dosimetry in evaluating radiation risk and reducing the uncertainty in the biological data. Chapter 5 deals with the chemical and molecular changes underlying cellular responses and the LET dependence of these changes. Chapter 6, in turn, discusses those cellular and genetic changes which lead to mutation or neoplastic transformation. Chapters 7 and 8 examine deterministic and stochastic effects, respectively, and the data required for the prediction of such effects at different organizational levels and for the extrapolation from experimental results in animals to risks for man. Gaps and uncertainties in this data are examined relative to data required for establishing radiation protection standards for neutrons and procedures for the effective and safe use of neutron and other high-LET radiation therapy.

  6. Biological effectiveness of neutrons: Research needs

    International Nuclear Information System (INIS)

    Casarett, G.W.; Braby, L.A.; Broerse, J.J.; Elkind, M.M.; Goodhead, D.T.; Oleinick, N.L.

    1994-02-01

    The goal of this report was to provide a conceptual plan for a research program that would provide a basis for determining more precisely the biological effectiveness of neutron radiation with emphasis on endpoints relevant to the protection of human health. This report presents the findings of the experts for seven particular categories of scientific information on neutron biological effectiveness. Chapter 2 examines the radiobiological mechanisms underlying the assumptions used to estimate human risk from neutrons and other radiations. Chapter 3 discusses the qualitative and quantitative models used to organize and evaluate experimental observations and to provide extrapolations where direct observations cannot be made. Chapter 4 discusses the physical principles governing the interaction of radiation with biological systems and the importance of accurate dosimetry in evaluating radiation risk and reducing the uncertainty in the biological data. Chapter 5 deals with the chemical and molecular changes underlying cellular responses and the LET dependence of these changes. Chapter 6, in turn, discusses those cellular and genetic changes which lead to mutation or neoplastic transformation. Chapters 7 and 8 examine deterministic and stochastic effects, respectively, and the data required for the prediction of such effects at different organizational levels and for the extrapolation from experimental results in animals to risks for man. Gaps and uncertainties in this data are examined relative to data required for establishing radiation protection standards for neutrons and procedures for the effective and safe use of neutron and other high-LET radiation therapy

  7. 11th International Conference on Practical Applications of Computational Biology & Bioinformatics

    CERN Document Server

    Mohamad, Mohd; Rocha, Miguel; Paz, Juan; Pinto, Tiago

    2017-01-01

    Biological and biomedical research are increasingly driven by experimental techniques that challenge our ability to analyse, process and extract meaningful knowledge from the underlying data. The impressive capabilities of next-generation sequencing technologies, together with novel and constantly evolving, distinct types of omics data technologies, have created an increasingly complex set of challenges for the growing fields of Bioinformatics and Computational Biology. The analysis of the datasets produced and their integration call for new algorithms and approaches from fields such as Databases, Statistics, Data Mining, Machine Learning, Optimization, Computer Science and Artificial Intelligence. Clearly, Biology is more and more a science of information and requires tools from the computational sciences. In the last few years, we have seen the rise of a new generation of interdisciplinary scientists with a strong background in the biological and computational sciences. In this context, the interaction of r...

  8. Fundamentals of bioinformatics and computational biology methods and exercises in matlab

    CERN Document Server

    Singh, Gautam B

    2015-01-01

    This book offers comprehensive coverage of all the core topics of bioinformatics, and includes practical examples completed using the MATLAB bioinformatics toolbox™. It is primarily intended as a textbook for engineering and computer science students attending advanced undergraduate and graduate courses in bioinformatics and computational biology. The book develops bioinformatics concepts from the ground up, starting with an introductory chapter on molecular biology and genetics. This chapter will enable physical science students to fully understand and appreciate the ultimate goals of applying the principles of information technology to challenges in biological data management, sequence analysis, and systems biology. The first part of the book also includes a survey of existing biological databases, tools that have become essential in today’s biotechnology research. The second part of the book covers methodologies for retrieving biological information, including fundamental algorithms for sequence compar...

  9. Gordon Research Conference on Mammary Gland Biology

    International Nuclear Information System (INIS)

    1989-01-01

    The 1989 conference was the tenth in the series of biennial Gordon Research Conferences on Mammary Gland Biology. Traditionally this conference brings together scientists from diverse backgrounds and experience but with a common interest in the biology of the mammary gland. Investigators from agricultural and medical schools, biochemists, cell and molecular biologists, endocrinologists, immunologists, and representatives from the emerging biotechnology industries met to discuss current concepts and results on the function and regulation of the normal and neoplastic mammary gland in a variety of species. Of the participants, approximately three-fourths were engaged in studying normal mammary gland function, whereas the other quarter were engaged in studying the neoplastic gland. The interactions between scientists, clinicians, and veterinarians examining both normal and neoplastic cell function serve to foster the multi-disciplinary goals of the conference and have stimulated many cooperative projects among participants in previous years.

  10. National Biological Service Research Supports Watershed Planning

    Science.gov (United States)

    Snyder, Craig D.

    1996-01-01

    The National Biological Service's Leetown Science Center is investigating how human impacts on watershed, riparian, and in-stream habitats affect fish communities. The research will provide the basis for a Ridge and Valley model that will allow resource managers to accurately predict and effectively mitigate human impacts on water quality. The study takes place in the Opequon Creek drainage basin of West Virginia. A fourth-order tributary of the Potomac, the basin falls within the Ridge and Valley. The study will identify biological components sensitive to land use patterns and the condition of the riparian zone; the effect of stream size, location, and other characteristics on fish communities; the extent to which remote sensing can reliably measure the riparian zone; and the relationship between the rate of landscape change and the structure of fish communities.

  11. Deep Learning and Applications in Computational Biology

    KAUST Repository

    Zeng, Jianyang

    2016-01-26

    RNA-binding proteins (RBPs) play important roles in the post-transcriptional control of RNAs. Identifying RBP binding sites and characterizing RBP binding preferences are key steps toward understanding the basic mechanisms of post-transcriptional gene regulation. Though numerous computational methods have been developed for modeling RBP binding preferences, discovering a complete structural representation of the RBP targets by integrating their available structural features in all three dimensions is still a challenging task. In this work, we develop a general and flexible deep learning framework for modeling structural binding preferences and predicting binding sites of RBPs, which takes (predicted) RNA tertiary structural information into account for the first time. Our framework constructs a unified representation that characterizes the structural specificities of RBP targets in all three dimensions, which can be further used to predict novel candidate binding sites and discover potential binding motifs. Through testing on real CLIP-seq datasets, we have demonstrated that our deep learning framework can automatically extract effective hidden structural features from the encoded raw sequence and structural profiles, and predict accurate RBP binding sites. In addition, we have conducted the first study to show that integrating the additional RNA tertiary structural features can improve the model performance in predicting RBP binding sites, especially for the polypyrimidine tract-binding protein (PTB), which also provides new evidence to support the view that RBPs may have specific tertiary structural binding preferences. In particular, the tests on the internal ribosome entry site (IRES) segments yield satisfactory results with experimental support from the literature and further demonstrate the necessity of incorporating RNA tertiary structural information into the prediction model. The source code of our approach can be found at https://github.com/thucombio/deepnet-rbp.
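
    A minimal sketch of the kind of first layer such sequence models use: one-hot encoding of an RNA fragment scanned by a 1-D convolutional filter. The filter weights below are random placeholders rather than learned parameters, and the sketch ignores the structural profiles that the framework described here integrates.

        import numpy as np

        ALPHABET = "ACGU"

        def one_hot(seq: str) -> np.ndarray:
            """Encode an RNA sequence as an (L, 4) one-hot matrix."""
            x = np.zeros((len(seq), len(ALPHABET)))
            for i, base in enumerate(seq):
                x[i, ALPHABET.index(base)] = 1.0
            return x

        def conv1d(x: np.ndarray, w: np.ndarray) -> np.ndarray:
            """Valid 1-D convolution of an (L, 4) sequence with a (k, 4) filter, ReLU activated."""
            k = w.shape[0]
            scores = np.array([np.sum(x[i:i + k] * w) for i in range(x.shape[0] - k + 1)])
            return np.maximum(scores, 0.0)

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            seq = "ACGUACGUUUCGA"              # toy fragment
            filt = rng.normal(size=(6, 4))     # one length-6 motif detector with placeholder weights
            print(conv1d(one_hot(seq), filt))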

  12. Parallel computing in genomic research: advances and applications

    Directory of Open Access Journals (Sweden)

    Ocaña K

    2015-11-01

    Kary Ocaña (National Laboratory of Scientific Computing, Petrópolis, Rio de Janeiro) and Daniel de Oliveira (Institute of Computing, Fluminense Federal University, Niterói, Brazil). Abstract: Today's genomic experiments have to process the so-called "biological big data" that is now reaching the size of Terabytes and Petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analysis of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires expertise from scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article presents a systematic review of the literature surveying the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that can be considered by scientists when running their genomic experiments to benefit from parallelism techniques and HPC capabilities. Keywords: high-performance computing, genomic research, cloud computing, grid computing, cluster computing, parallel computing
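
    As a small illustration of the data parallelism discussed in this review, the sketch below distributes a per-read GC-content computation across worker processes with Python's multiprocessing module. The synthetic reads and worker count are illustrative; real genomic pipelines would stream sequences from FASTA/FASTQ files and scale out on HPC schedulers.

        from multiprocessing import Pool
        import random

        def gc_content(seq: str) -> float:
            """Fraction of G and C bases in one read."""
            return (seq.count("G") + seq.count("C")) / len(seq)

        if __name__ == "__main__":
            random.seed(1)
            # Made-up reads standing in for sequencing output.
            reads = ["".join(random.choices("ACGT", k=100)) for _ in range(10_000)]
            with Pool(processes=4) as pool:                       # e.g. one worker per core
                gc = pool.map(gc_content, reads, chunksize=500)
            print(f"mean GC content over {len(reads)} reads: {sum(gc) / len(gc):.3f}")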

  13. Computer science research and technology volume 3

    CERN Document Server

    Bauer, Janice P

    2011-01-01

    This book presents leading-edge research from across the globe in the field of computer science research, technology and applications. Each contribution has been carefully selected for inclusion based on the significance of the research to this fast-moving and diverse field. Some topics included are: network topology; agile programming; virtualization; and reconfigurable computing.

  14. Applications of NMR in biological metabolic research

    International Nuclear Information System (INIS)

    Nie Jiarui; Li Xiuqin; He Chunjian

    1989-01-01

    Nuclear magnetic resonance has become a powerful means of studying biological metabolism in a non-invasive and non-destructive way. Because it can be used to study the metabolic processes of living systems under normal physiological conditions and at the molecular level, the method has advantages over other conventional approaches. Using important parameters such as NMR chemical shifts, longitudinal relaxation time and transverse relaxation time, it is possible to probe metabolic processes as well as the conformation, concentration, transport and distribution of reacting and resulting substances. The NMR spectroscopy of 1H, 31P and 13C nuclei has already been widely used in metabolic research.

  15. Synergy between medicinal chemistry and biological research.

    Science.gov (United States)

    Moncada, Salvador; Coaker, Hannah

    2014-09-01

    Salvador Moncada studied medicine at the University of El Salvador (El Salvador) before coming to the UK in 1971 to work on a PhD with Professor John Vane at the Institute of Basic Medical Sciences, Royal College of Surgeons (UK). After a short period of research at the University of Honduras (Honduras), he joined the Wellcome Research Laboratories (UK) where he became Head of the Department of Prostaglandin Research and later, Director of Research. He returned to academic life in 1996 as founder and director of the Wolfson Institute for Biomedical Research at University College London (UK). Moncada played a role in the discovery of the mechanism of action of aspirin-like drugs and later led the teams that discovered prostacyclin and identified nitric oxide as a biological mediator. In his role as Director of Research of the Wellcome Laboratories, he oversaw the discovery and development of medicines for epilepsy, migraine, malaria and cancer. Currently, he is working on the regulation of cell proliferation as Director of the Institute of Cancer Sciences at the University of Manchester (UK). Moncada has won numerous awards from the international scientific community and in 2010, he received a knighthood from Her Majesty Queen Elizabeth II for his services to science.

  16. Computational Biomechanics Theoretical Background and BiologicalBiomedical Problems

    CERN Document Server

    Tanaka, Masao; Nakamura, Masanori

    2012-01-01

    Rapid developments have taken place in biological/biomedical measurement and imaging technologies as well as in computer analysis and information technologies. The increase in data obtained with such technologies invites the reader into a virtual world that represents realistic biological tissue or organ structures in digital form and allows for simulation and what is called “in silico medicine.” This volume is the third in a textbook series and covers both the basics of continuum mechanics of biosolids and biofluids and the theoretical core of computational methods for continuum mechanics analyses. Several biomechanics problems are provided for better understanding of computational modeling and analysis. Topics include the mechanics of solid and fluid bodies, fundamental characteristics of biosolids and biofluids, computational methods in biomechanics analysis/simulation, practical problems in orthopedic biomechanics, dental biomechanics, ophthalmic biomechanics, cardiovascular biomechanics, hemodynamics...

  17. Revision history aware repositories of computational models of biological systems.

    Science.gov (United States)

    Miller, Andrew K; Yu, Tommy; Britten, Randall; Cooling, Mike T; Lawson, James; Cowan, Dougal; Garny, Alan; Halstead, Matt D B; Hunter, Peter J; Nickerson, David P; Nunns, Geo; Wimalaratne, Sarala M; Nielsen, Poul M F

    2011-01-14

    Building repositories of computational models of biological systems ensures that published models are available for both education and further research, and can provide a source of smaller, previously verified models to integrate into a larger model. One problem with earlier repositories has been the limitations in facilities to record the revision history of models. Often, these facilities are limited to a linear series of versions which were deposited in the repository. This is problematic for several reasons. Firstly, there are many instances in the history of biological systems modelling where an 'ancestral' model is modified by different groups to create many different models. With a linear series of versions, if the changes made to one model are merged into another model, the merge appears as a single item in the history. This hides useful revision history information, and also makes further merges much more difficult, as there is no record of which changes have or have not already been merged. In addition, a long series of individual changes made outside of the repository are also all merged into a single revision when they are put back into the repository, making it difficult to separate out individual changes. Furthermore, many earlier repositories only retain the revision history of individual files, rather than of a group of files. This is an important limitation to overcome, because some types of models, such as CellML 1.1 models, can be developed as a collection of modules, each in a separate file. The need for revision history is widely recognised for computer software, and a lot of work has gone into developing version control systems and distributed version control systems (DVCSs) for tracking the revision history. However, to date, there has been no published research on how DVCSs can be applied to repositories of computational models of biological systems. We have extended the Physiome Model Repository software to be fully revision history aware
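
    The merge-tracking problem described here can be illustrated with a toy revision graph: once each revision records its parents, the set of changes already contained in a model is simply its ancestor set. The revision identifiers and history below are invented for illustration and are not the Physiome Model Repository's actual data model.

        # Toy revision DAG: revision -> parent revisions (all identifiers hypothetical).
        ancestor_of = {
            "r1": [],
            "r2a": ["r1"],        # one group modifies the ancestral model...
            "r2b": ["r1"],        # ...another group modifies it independently
            "r3": ["r2a", "r2b"]  # a merge revision records both parents
        }

        def ancestors(rev: str) -> set:
            """All revisions already contained in (merged into) the given revision."""
            seen, stack = set(), [rev]
            while stack:
                r = stack.pop()
                for p in ancestor_of[r]:
                    if p not in seen:
                        seen.add(p)
                        stack.append(p)
            return seen

        if __name__ == "__main__":
            print("r3 already contains:", sorted(ancestors("r3")))                    # r1, r2a, r2b
            print("r2b still lacks:", sorted(ancestors("r3") - ancestors("r2b") - {"r2b"}))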

  18. Revision history aware repositories of computational models of biological systems

    Directory of Open Access Journals (Sweden)

    Nickerson David P

    2011-01-01

    Abstract. Background: Building repositories of computational models of biological systems ensures that published models are available for both education and further research, and can provide a source of smaller, previously verified models to integrate into a larger model. One problem with earlier repositories has been the limitations in facilities to record the revision history of models. Often, these facilities are limited to a linear series of versions which were deposited in the repository. This is problematic for several reasons. Firstly, there are many instances in the history of biological systems modelling where an 'ancestral' model is modified by different groups to create many different models. With a linear series of versions, if the changes made to one model are merged into another model, the merge appears as a single item in the history. This hides useful revision history information, and also makes further merges much more difficult, as there is no record of which changes have or have not already been merged. In addition, a long series of individual changes made outside of the repository are also all merged into a single revision when they are put back into the repository, making it difficult to separate out individual changes. Furthermore, many earlier repositories only retain the revision history of individual files, rather than of a group of files. This is an important limitation to overcome, because some types of models, such as CellML 1.1 models, can be developed as a collection of modules, each in a separate file. The need for revision history is widely recognised for computer software, and a lot of work has gone into developing version control systems and distributed version control systems (DVCSs) for tracking the revision history. However, to date, there has been no published research on how DVCSs can be applied to repositories of computational models of biological systems. Results: We have extended the Physiome Model

  19. DOE EPSCoR Initiative in Structural and computational Biology/Bioinformatics

    Energy Technology Data Exchange (ETDEWEB)

    Wallace, Susan S.

    2008-02-21

    The overall goal of the DOE EPSCoR Initiative in Structural and Computational Biology was to enhance the competitiveness of Vermont research in these scientific areas. To develop self-sustaining infrastructure, we increased the critical mass of faculty, developed shared resources that made junior researchers more competitive for federal research grants, implemented programs to train graduate and undergraduate students who participated in these research areas, and provided seed money for research projects. During the time period funded by this DOE initiative: (1) four new faculty were recruited to the University of Vermont using DOE resources, three in Computational Biology and one in Structural Biology; (2) technical support was provided for the Computational and Structural Biology facilities; (3) twenty-two graduate students were directly funded by fellowships; (4) fifteen undergraduate students were supported during the summer; and (5) twenty-eight pilot projects were supported. Taken together, these funds resulted in a plethora of published papers, many in high-profile journals in the fields, and directly impacted competitive extramural funding based on structural or computational biology, resulting in 49 million dollars awarded in grants (Appendix I), a 600% return on the investment by DOE, the State and the University.

  20. Activity report of Computing Research Center

    Energy Technology Data Exchange (ETDEWEB)

    1997-07-01

    In April 1997, the National Laboratory for High Energy Physics (KEK), the Institute for Nuclear Study, University of Tokyo (INS), and the Meson Science Laboratory, Faculty of Science, University of Tokyo were reorganized into the High Energy Accelerator Research Organization, with the aim of further developing the wide field of accelerator science based on high-energy accelerators. Within this Research Organization, the Applied Research Laboratory comprises four Centers, formed by integrating the previous four centers and their related sections in Tanashi, which support research activities common to the whole Organization and carry out the related research and development (R and D). This support is expected to cover not only general assistance but also the preparation, and the R and D, of the systems required to promote the research and its future plans. Computer technology is essential to the development of this research and can be shared across the various research programmes of the Organization. In response to these expectations, the new Computing Research Center is to fulfil its duties by working and cooperating with researchers on everything from R and D on data analysis for the various experiments to computational physics driven by powerful computing capacity such as supercomputers. The first chapter reports on the work and present state of the KEK Data Processing Center, the second chapter covers the INS computer room, and future problems for the Computing Research Center are then discussed. (G.K.)

  1. Computational intelligence techniques for biological data mining: An overview

    Science.gov (United States)

    Faye, Ibrahima; Iqbal, Muhammad Javed; Said, Abas Md; Samir, Brahim Belhaouari

    2014-10-01

    Computational techniques have been successfully utilized for highly accurate analysis and modeling of the multifaceted, raw biological data gathered from various genome sequencing projects. These techniques are proving far more effective than traditional in-vitro experiments at keeping pace with the constantly increasing volume of sequence data. The most critical problems that have attracted researchers' attention include, but are not limited to, accurate structure and function prediction for unknown proteins, protein subcellular localization prediction, identification of protein-protein interactions, protein fold recognition, and analysis of microarray gene expression data. To address these problems, a variety of machine-learning classification and clustering techniques have been used extensively in the published literature, including neural networks, genetic algorithms, fuzzy ARTMAP, K-Means, K-NN, SVM, rough-set classifiers, decision trees, and HMM-based algorithms. The major difficulties in applying these algorithms lie in the limitations of existing feature encoding and selection methods for extracting the best features, in increasing classification accuracy, and in reducing the running-time overheads of the learning algorithms. This research is potentially useful in drug design and in the diagnosis of some diseases. This paper presents a concise overview of the well-known protein classification techniques.
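
    As a minimal sketch of the classification step surveyed above, the snippet below trains two of the listed classifiers (k-NN and SVM) with scikit-learn. The feature vectors are synthetic stand-ins, since the feature encoding and selection step is precisely the difficulty the overview highlights.

```python
# A minimal sketch, not the paper's method: classify proteins from
# precomputed numeric feature vectors with two of the algorithms named in
# the overview (k-NN and SVM). The feature matrix here is random noise
# standing in for real encoded sequence features.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 20))        # 200 proteins, 20 hypothetical features
y = rng.integers(0, 3, size=200)      # 3 hypothetical fold/function classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

for name, clf in [("k-NN", KNeighborsClassifier(n_neighbors=5)),
                  ("SVM", SVC(kernel="rbf", C=1.0))]:
    clf.fit(X_tr, y_tr)
    print(name, "accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```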

  2. 2010 Plant Molecular Biology Gordon Research Conference

    Energy Technology Data Exchange (ETDEWEB)

    Michael Sussman

    2010-07-23

    The Plant Molecular Biology Conference has traditionally covered a breadth of exciting topics and the 2010 conference will continue in that tradition. Emerging concerns about food security have inspired a program with three main themes: (1) genomics, natural variation and breeding to understand adaptation and crop improvement, (2) hormonal cross talk, and (3) plant/microbe interactions. There are also sessions on epigenetics and proteomics/metabolomics. Thus this conference will bring together a range of disciplines, will foster the exchange of ideas and enable participants to learn of the latest developments and ideas in diverse areas of plant biology. The conference provides an excellent opportunity for individuals to discuss their research because additional speakers in each session will be selected from submitted abstracts. There will also be a poster session each day for a two-hour period prior to dinner. In particular, this conference plays a key role in enabling students and postdocs (the next generation of research leaders) to mingle with pioneers in multiple areas of plant science.

  3. The ISCB Student Council Internship Program: Expanding computational biology capacity worldwide.

    Science.gov (United States)

    Anupama, Jigisha; Francescatto, Margherita; Rahman, Farzana; Fatima, Nazeefa; DeBlasio, Dan; Shanmugam, Avinash Kumar; Satagopam, Venkata; Santos, Alberto; Kolekar, Pandurang; Michaut, Magali; Guney, Emre

    2018-01-01

    Education and training are two essential ingredients for a successful career. On one hand, universities provide students a curriculum for specializing in one's field of study, and on the other, internships complement coursework and provide invaluable training experience for a fruitful career. Consequently, undergraduates and graduates are encouraged to undertake an internship during the course of their degree. The opportunity to explore one's research interests in the early stages of their education is important for students because it improves their skill set and gives their career a boost. In the long term, this helps to close the gap between skills and employability among students across the globe and balance the research capacity in the field of computational biology. However, training opportunities are often scarce for computational biology students, particularly for those who reside in less-privileged regions. Aimed at helping students develop research and academic skills in computational biology and alleviating the divide across countries, the Student Council of the International Society for Computational Biology introduced its Internship Program in 2009. The Internship Program is committed to providing access to computational biology training, especially for students from developing regions, and improving competencies in the field. Here, we present how the Internship Program works and the impact of the internship opportunities so far, along with the challenges associated with this program.

  4. The ISCB Student Council Internship Program: Expanding computational biology capacity worldwide.

    Directory of Open Access Journals (Sweden)

    Jigisha Anupama

    2018-01-01

    Full Text Available Education and training are two essential ingredients for a successful career. On one hand, universities provide students a curriculum for specializing in one's field of study, and on the other, internships complement coursework and provide invaluable training experience for a fruitful career. Consequently, undergraduates and graduates are encouraged to undertake an internship during the course of their degree. The opportunity to explore one's research interests in the early stages of their education is important for students because it improves their skill set and gives their career a boost. In the long term, this helps to close the gap between skills and employability among students across the globe and balance the research capacity in the field of computational biology. However, training opportunities are often scarce for computational biology students, particularly for those who reside in less-privileged regions. Aimed at helping students develop research and academic skills in computational biology and alleviating the divide across countries, the Student Council of the International Society for Computational Biology introduced its Internship Program in 2009. The Internship Program is committed to providing access to computational biology training, especially for students from developing regions, and improving competencies in the field. Here, we present how the Internship Program works and the impact of the internship opportunities so far, along with the challenges associated with this program.

  5. Division of Biological and Medical Research annual research summary, 1983

    Energy Technology Data Exchange (ETDEWEB)

    Barr, S.H. (ed.)

    1984-08-01

    This research summary contains brief descriptions of research in the following areas: (1) mechanisms of hepatocarcinogenesis; (2) role of metals in cocarcinogenesis and the use of liposomes for metal mobilization; (3) control of mutagenesis and cell differentiation in cultured cells by tumor promoters; (4) radiation effects in mammalian cells; (5) radiation carcinogenesis and radioprotectors; (6) life shortening, tumor induction, and tissue dose for fission-neutron and gamma-ray irradiations; (7) mammalian genetics and biostatistics; (8) radiation toxicity studies; (9) hematopoiesis in chronic toxicity; (10) molecular biology studies; (11) chemical toxicology; (12) carcinogen identification and metabolism; (13) metal metabolism and toxicity; and (14) neurobehavioral chronobiology. (ACR)

  6. Division of Biological and Medical Research annual research summary, 1983

    International Nuclear Information System (INIS)

    Barr, S.H.

    1984-08-01

    This research summary contains brief descriptions of research in the following areas: (1) mechanisms of hepatocarcinogenesis; (2) role of metals in cocarcinogenesis and the use of liposomes for metal mobilization; (3) control of mutagenesis and cell differentiation in cultured cells by tumor promoters; (4) radiation effects in mammalian cells; (5) radiation carcinogenesis and radioprotectors; (6) life shortening, tumor induction, and tissue dose for fission-neutron and gamma-ray irradiations; (7) mammalian genetics and biostatistics; (8) radiation toxicity studies; (9) hematopoiesis in chronic toxicity; (10) molecular biology studies; (11) chemical toxicology; (12) carcinogen identification and metabolism; (13) metal metabolism and toxicity; and (14) neurobehavioral chronobiology

  7. Inter-level relations in computer science, biology, and psychology

    NARCIS (Netherlands)

    Boogerd, F.; Bruggeman, F.; Jonker, C.M.; Looren de Jong, H.; Tamminga, A.; Treur, J.; Westerhoff, H.V.; Wijngaards, W.C.A.

    2002-01-01

    Investigations into inter-level relations in computer science, biology and psychology call for an empirical turn in the philosophy of mind. Rather than concentrate on a priori discussions of inter-level relations between 'completed' sciences, a case is made for the actual study of the way

  8. Inter-level relations in computer science, biology and psychology

    NARCIS (Netherlands)

    Boogerd, F.C.; Bruggeman, F.J.; Jonker, C.M.; Looren De Jong, H.; Tamminga, A.M.; Treur, J.; Westerhoff, H.V.; Wijngaards, W.C.A.

    2002-01-01

    Investigations into inter-level relations in computer science, biology and psychology call for an empirical turn in the philosophy of mind. Rather than concentrate on a priori discussions of inter-level relations between "completed" sciences, a case is made for the actual study of the way

  9. Inter-level relations in computer science, biology, and psychology

    NARCIS (Netherlands)

    Boogerd, Fred; Bruggeman, Frank; Jonker, Catholijn; Looren de Jong, Huib; Tamminga, Allard; Treur, Jan; Westerhoff, Hans; Wijngaards, Wouter

    2002-01-01

    Investigations into inter-level relations in computer science, biology and psychology call for an *empirical* turn in the philosophy of mind. Rather than concentrate on *a priori* discussions of inter-level relations between “completed” sciences, a case is made for the actual study of the way

  10. Biology Students Building Computer Simulations Using StarLogo TNG

    Science.gov (United States)

    Smith, V. Anne; Duncan, Ishbel

    2011-01-01

    Confidence is an important issue for biology students in handling computational concepts. This paper describes a practical in which honours-level bioscience students simulate complex animal behaviour using StarLogo TNG, a freely-available graphical programming environment. The practical consists of two sessions, the first of which guides students…

  11. 2nd Colombian Congress on Computational Biology and Bioinformatics

    CERN Document Server

    Cristancho, Marco; Isaza, Gustavo; Pinzón, Andrés; Rodríguez, Juan

    2014-01-01

    This volume compiles accepted contributions for the 2nd Edition of the Colombian Computational Biology and Bioinformatics Congress CCBCOL, after a rigorous review process in which 54 papers were accepted for publication from 119 submitted contributions. Bioinformatics and Computational Biology are areas of knowledge that have emerged from advances in the Biological Sciences and their integration with the Information Sciences. The expansion of genome-study projects has led to the production of vast amounts of sequence data, which must be organized, analyzed and stored in order to understand phenomena associated with living organisms, including their evolution and behavior in different ecosystems, and to develop applications derived from this analysis.

  12. Graphics supercomputer for computational fluid dynamics research

    Science.gov (United States)

    Liaw, Goang S.

    1994-11-01

    The objective of this project is to purchase a state-of-the-art graphics supercomputer to improve the Computational Fluid Dynamics (CFD) research capability at Alabama A & M University (AAMU) and to support Air Force research projects. A cutting-edge graphics supercomputer system, Onyx VTX, from Silicon Graphics Computer Systems (SGI), was purchased and installed. Other equipment, including a desktop personal computer, a PC-486 DX2 with a built-in 10-BaseT Ethernet card, a 10-BaseT hub, an Apple Laser Printer Select 360, and a notebook computer from Zenith, was also purchased. A reading room has been converted to a research computer lab by adding some furniture and an air conditioning unit in order to provide an appropriate working environment for researchers and the purchased equipment. All the purchased equipment was successfully installed and is fully functional. Several research projects, including two existing Air Force projects, are being performed using these facilities.

  13. The Learning of Biology: A Structural Basis for Future Research

    Science.gov (United States)

    Murray, Darrel L.

    1977-01-01

    This article reviews recent research studies and experiences relating the learning theories of Ausubel to biology instruction. Also some suggestions are made for future research on the learning of biology. (MR)

  14. Molecular biology approaches in bioadhesion research

    Directory of Open Access Journals (Sweden)

    Marcelo Rodrigues

    2014-07-01

    Full Text Available The use of molecular biology tools in the field of bioadhesion is still in its infancy. For new research groups who are considering taking a molecular approach, the techniques presented here are essential to unravelling the sequence of a gene, its expression and its biological function. Here we provide an outline for addressing adhesion-related genes in diverse organisms. We show how to gradually narrow down the number of candidate transcripts that are involved in adhesion by (1) generating a transcriptome and a differentially expressed cDNA list enriched for adhesion-related transcripts, (2) setting up a BLAST search facility, (3) performing an in situ hybridization screen, and (4) functionally analysing selected genes using RNA interference knock-down. Furthermore, the latest developments in genome editing are presented as new tools to study gene function. By using this iterative, multi-technology approach, the identification, isolation, expression and function of adhesion-related genes can be studied in most organisms. These tools will improve our understanding of the diversity of molecules used for adhesion in different organisms, and these findings will help to develop innovative bio-inspired adhesives.

  15. Computational Biology Methods for Characterization of Pluripotent Cells.

    Science.gov (United States)

    Araúzo-Bravo, Marcos J

    2016-01-01

    Pluripotent cells are a powerful tool for regenerative medicine and drug discovery. Several techniques have been developed to induce pluripotency, or to extract pluripotent cells from different tissues and biological fluids. However, the characterization of pluripotency requires tedious, expensive, time-consuming, and not always reliable wet-lab experiments; thus, an easy, standard quality-control protocol for pluripotency assessment remains to be established. Here, high-throughput techniques can help, in particular gene expression microarrays, which have become a complementary technique for cellular characterization. Research has shown that transcriptomic comparison with a reference Embryonic Stem Cell (ESC) is a good approach to assessing pluripotency. Under the premise that the best protocol is computer software source code, here I propose and explain, line by line, a software protocol coded in R/Bioconductor for pluripotency assessment based on comparing the transcriptomics data of pluripotent cells with a reference ESC. I provide advice on experimental design, warnings about possible pitfalls, and guidance for interpreting the results.
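
    The published protocol is written in R/Bioconductor; purely as a sketch of the underlying idea, the following Python fragment correlates a sample's expression profile with a reference ESC profile, which is the core comparison behind the assessment. The expression values are synthetic and assumed to be already normalized.

```python
# Sketch of the core comparison only: correlate a sample transcriptome with a
# reference ESC profile. Values are synthetic and assumed already normalized;
# the published protocol (in R/Bioconductor) involves many additional steps.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
n_genes = 5000
esc_reference = rng.lognormal(mean=2.0, sigma=1.0, size=n_genes)

# A pluripotent-like sample: a noisy copy of the reference profile.
pluripotent_like = esc_reference * rng.lognormal(mean=0.0, sigma=0.2, size=n_genes)
# A differentiated sample: an unrelated expression profile.
differentiated = rng.lognormal(mean=2.0, sigma=1.0, size=n_genes)

for label, sample in [("pluripotent-like", pluripotent_like),
                      ("differentiated", differentiated)]:
    r, _ = pearsonr(np.log2(sample + 1), np.log2(esc_reference + 1))
    print(f"{label}: correlation with ESC reference = {r:.2f}")
```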

  16. Discovery of novel bacterial toxins by genomics and computational biology.

    Science.gov (United States)

    Doxey, Andrew C; Mansfield, Michael J; Montecucco, Cesare

    2018-06-01

    Hundreds and hundreds of bacterial protein toxins are presently known. Traditionally, toxin identification begins with pathological studies of bacterial infectious disease. Following identification and cultivation of a bacterial pathogen, the protein toxin is purified from the culture medium and its pathogenic activity is studied using the methods of biochemistry and structural biology, cell biology, tissue and organ biology, and appropriate animal models, supplemented by bioimaging techniques. The ongoing and explosive development of high-throughput DNA sequencing and bioinformatic approaches has set in motion a revolution in many fields of biology, including microbiology. One consequence is that genes encoding novel bacterial toxins can be identified by bioinformatic and computational methods based on previous knowledge accumulated from studies of the biology and pathology of thousands of known bacterial protein toxins. Starting from the paradigmatic cases of diphtheria toxin, tetanus and botulinum neurotoxins, this review discusses traditional experimental approaches as well as bioinformatics and genomics-driven approaches that facilitate the discovery of novel bacterial toxins. We discuss recent work on the identification of novel botulinum-like toxins from genera such as Weissella, Chryseobacterium, and Enterococcus, and the implications of these computationally identified toxins in the field. Finally, we discuss the promise of metagenomics in the discovery of novel toxins and their ecological niches, and present data suggesting the existence of uncharacterized, botulinum-like toxin genes in insect gut metagenomes. Copyright © 2018. Published by Elsevier Ltd.

  17. Computer research in teaching geometry future bachelors

    Directory of Open Access Journals (Sweden)

    Aliya V. Bukusheva

    2017-12-01

    Full Text Available The article is devoted to the problem of using educational studies and experiments in the geometric education of IT specialists. We consider the research method as applied to teaching Computer Geometry to Bachelors studying `Mathematics and Computer Science` 02.03.01. Examples are given of educational and research geometric problems that require computer tools to be solved. These tasks are treated as variations of educational and research tasks that pose problems demanding experiments with dynamic models of mathematical objects in order to be solved.

  18. Biological research for the radiation protection

    International Nuclear Information System (INIS)

    Kim, In Gyu; Kim, Chan Kug; Shim, Hae Won; Jung, Il Lae; Byun, Hee Sun; Moon, Myung Sook; Cho, Hye Jeong; Kim, Jin Sik

    2003-04-01

    The scope of work of 'Biological Research for Radiation Protection' covered research on the effect of polyamines on cell death triggered by ionizing radiation, H2O2 and toxic agents. In this paper, to elucidate the role of polyamines as mediators of lysosomal damage and stress (H2O2)-induced apoptosis, we utilized α-DiFluoroMethylOrnithine (DFMO), which inhibits ornithine decarboxylase and depletes intracellular putrescine, and investigated the effects of polyamines on the apoptosis caused by H2O2, ionizing radiation and paraquat. We also showed that treatment with MGBG, an inhibitor of polyamine biosynthesis, affected intracellular redox steady states, intracellular ROS levels and protein oxidation. We then investigated whether MGBG can enhance the cytotoxic effect of ionizing radiation or H2O2 on tumor cells, because such compounds are able to potentiate cell-killing effects. In addition, ceruloplasmin and thioredoxin, possible antioxidant proteins, were shown to have a protective effect against radiation- or H2O2- (or chemical-) induced macromolecular damage and cell death

  19. A Systems Biology Approach to Infectious Disease Research: Innovating the Pathogen-Host Research Paradigm

    Energy Technology Data Exchange (ETDEWEB)

    Aderem, Alan; Adkins, Joshua N.; Ansong, Charles; Galagan, James; Kaiser, Shari; Korth, Marcus J.; Law, G. L.; McDermott, Jason E.; Proll, Sean; Rosenberger, Carrie; Schoolnik, Gary; Katze, Michael G.

    2011-02-01

    The 20th century was marked by extraordinary advances in our understanding of microbes and infectious disease, but pandemics remain, food and water borne illnesses are frequent, multi-drug resistant microbes are on the rise, and the needed drugs and vaccines have not been developed. The scientific approaches of the past—including the intense focus on individual genes and proteins typical of molecular biology—have not been sufficient to address these challenges. The first decade of the 21st century has seen remarkable innovations in technology and computational methods. These new tools provide nearly comprehensive views of complex biological systems and can provide a correspondingly deeper understanding of pathogen-host interactions. To take full advantage of these innovations, the National Institute of Allergy and Infectious Diseases recently initiated the Systems Biology Program for Infectious Disease Research. As participants of the Systems Biology Program we think that the time is at hand to redefine the pathogen-host research paradigm.

  20. Research progress on space radiation biology

    International Nuclear Information System (INIS)

    Li Wenjian; Dang Bingrong; Wang Zhuanzi; Wei Wei; Jing Xigang; Wang Biqian; Zhang Bintuan

    2010-01-01

    Space radiation, particularly that from high-energy charged particles, may cause serious injury to living organisms, and it is therefore a critical limiting factor for manned spaceflight. Studies have shown that the biological effects of charged particles depend on particle quality, dose and the biological end point considered. In addition, microgravity conditions may modify the biological effects of space radiation. In this paper we review the biological damage caused by space radiation, and the combined biological effects of space radiation coupled with microgravity, based on results from space flight and ground simulation experiments. (authors)

  1. Radiation chemistry in development and research of radiation biology

    International Nuclear Information System (INIS)

    Min Rui

    2010-01-01

    Throughout the establishment and development of radiation biology, radiation chemistry has acted as a bridge uniting the spatial and temporal insight coming from radiation physics with radiation biology. The theory, models and methodology of radiation chemistry play an important role in promoting research and development in radiation biology. As research on radiation biology effects moves towards systems radiation biology, illustrating and exploring both the diversity of biological responses and the complex processes by which biological effects occur will continue to require the theory, models and methodology of radiation chemistry. (authors)

  2. [Animal experimentation, computer simulation and surgical research].

    Science.gov (United States)

    Carpentier, Alain

    2009-11-01

    We live in a digital world. In medicine, computers are providing new tools for data collection, imaging, and treatment. During research and development of complex technologies and devices such as artificial hearts, computer simulation can provide more reliable information than experimentation on large animals. In these specific settings, animal experimentation should serve more to validate computer models of complex devices than to demonstrate their reliability.

  3. 10 years for the Journal of Bioinformatics and Computational Biology (2003-2013) -- a retrospective.

    Science.gov (United States)

    Eisenhaber, Frank; Sherman, Westley Arthur

    2014-06-01

    The Journal of Bioinformatics and Computational Biology (JBCB) started publishing scientific articles in 2003. It has established itself as a home for solid research articles in the field (~60 per year) that are surprisingly well cited. JBCB has an important function as an alternative publishing channel in addition to other, bigger journals.

  4. Has Modern Biology Entered the Mouth? The Clinical Impact of Biological Research.

    Science.gov (United States)

    Baum, Bruce J.

    1991-01-01

    Three areas of biological research that are beginning to have an impact on clinical medicine are examined, including molecular biology, cell biology, and biotechnology. It is concluded that oral biologists and educators must work cooperatively to bring rapid biological and biomedical advances into dental training in a meaningful way. (MSE)

  5. Application of computational systems biology to explore environmental toxicity hazards

    DEFF Research Database (Denmark)

    Audouze, Karine Marie Laure; Grandjean, Philippe

    2011-01-01

    Background: Computer-based modeling is part of a new approach to predictive toxicology. Objectives: We investigated the usefulness of an integrated computational systems biology approach in a case study involving the isomers and metabolites of the pesticide dichlorodiphenyltrichloroethane (DDT......) to ascertain their possible links to relevant adverse effects. Methods: We extracted chemical-protein association networks for each DDT isomer and its metabolites using ChemProt, a disease chemical biology database that includes both binding and gene expression data, and we explored protein-protein interactions...... using a human interactome network. To identify associated dysfunctions and diseases, we integrated protein-disease annotations into the protein complexes using the Online Mendelian Inheritance in Man database and the Comparative Toxicogenomics Database. Results: We found 175 human proteins linked to p,p´-DDT
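
    As a toy sketch of the integration strategy described (chemical-protein associations expanded through a protein-protein interactome and annotated with diseases), the fragment below uses networkx with invented node names; the study itself relied on ChemProt, a human interactome, OMIM and the Comparative Toxicogenomics Database.

```python
# Toy sketch of the integration strategy; every node name and edge below is
# invented. Real inputs would come from ChemProt, a human interactome, and
# OMIM / Comparative Toxicogenomics Database annotations.
import networkx as nx

chem_protein = {"p,p'-DDT": ["PROT_A", "PROT_B"]}                 # chemical-protein associations
ppi = nx.Graph([("PROT_A", "PROT_C"), ("PROT_B", "PROT_D"),
                ("PROT_C", "PROT_E")])                             # protein-protein interactions
disease_annotations = {"PROT_C": ["Disorder X"], "PROT_D": ["Disorder Y"]}

for chemical, targets in chem_protein.items():
    linked = set(targets)
    for p in targets:                      # one-step expansion through the interactome
        if p in ppi:
            linked.update(ppi.neighbors(p))
    diseases = {d for p in linked for d in disease_annotations.get(p, [])}
    print(chemical, "->", sorted(linked), "->", sorted(diseases))
```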

  6. A comprehensive approach to decipher biological computation to achieve next generation high-performance exascale computing.

    Energy Technology Data Exchange (ETDEWEB)

    James, Conrad D.; Schiess, Adrian B.; Howell, Jamie; Baca, Michael J.; Partridge, L. Donald; Finnegan, Patrick Sean; Wolfley, Steven L.; Dagel, Daryl James; Spahn, Olga Blum; Harper, Jason C.; Pohl, Kenneth Roy; Mickel, Patrick R.; Lohn, Andrew; Marinella, Matthew

    2013-10-01

    The human brain (volume = 1200 cm^3) consumes 20 W and is capable of performing > 10^16 operations/s. Current supercomputer technology has reached 10^15 operations/s, yet it requires 1500 m^3 and 3 MW, giving the brain a 10^12 advantage in operations/s/W/cm^3. Thus, to reach exascale computation, two achievements are required: 1) improved understanding of computation in biological tissue, and 2) a paradigm shift towards neuromorphic computing where hardware circuits mimic properties of neural tissue. To address 1), we will interrogate corticostriatal networks in mouse brain tissue slices, specifically with regard to their frequency filtering capabilities as a function of input stimulus. To address 2), we will instantiate biological computing characteristics such as multi-bit storage into hardware devices with future computational and memory applications. Resistive memory devices will be modeled, designed, and fabricated in the MESA facility in consultation with our internal and external collaborators.
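
    The quoted 10^12 advantage follows directly from the stated figures; a quick back-of-envelope check, using only the numbers given in the abstract:

```python
# Back-of-envelope check of the quoted ~10^12 advantage in operations/s/W/cm^3,
# using only the figures stated in the abstract.
brain = 1e16 / (20 * 1200)            # 10^16 ops/s, 20 W, 1200 cm^3
machine = 1e15 / (3e6 * 1.5e9)        # 10^15 ops/s, 3 MW, 1500 m^3 = 1.5e9 cm^3
print(f"advantage ~ {brain / machine:.1e}")   # ~1.9e12, i.e. on the order of 10^12
```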

  7. AHPCRC - Army High Performance Computing Research Center

    Science.gov (United States)

    2010-01-01

    Of particular interest is the ability of a distributed jamming network (DJN) to jam signals in all or part of a sensor or communications net...

  8. Computing paths and cycles in biological interaction graphs

    Directory of Open Access Journals (Sweden)

    von Kamp Axel

    2009-06-01

    Full Text Available Abstract Background Interaction graphs (signed directed graphs) provide an important qualitative modeling approach for Systems Biology. They enable the analysis of causal relationships in cellular networks and can even be useful for predicting qualitative aspects of systems dynamics. Fundamental issues in the analysis of interaction graphs are the enumeration of paths and cycles (feedback loops) and the calculation of shortest positive/negative paths. These computational problems have been discussed only to a minor extent in the context of Systems Biology and in particular the shortest signed paths problem requires algorithmic developments. Results We first review algorithms for the enumeration of paths and cycles and show that these algorithms are superior to a recently proposed enumeration approach based on elementary-modes computation. The main part of this work deals with the computation of shortest positive/negative paths, an NP-complete problem for which only very few algorithms are described in the literature. We propose extensions and several new algorithm variants for computing either exact results or approximations. Benchmarks with various concrete biological networks show that exact results can sometimes be obtained in networks with several hundred nodes. A class of even larger graphs can still be treated exactly by a new algorithm combining exhaustive and simple search strategies. For graphs where the computation of exact solutions becomes time-consuming or infeasible, we devised an approximative algorithm with polynomial complexity. Strikingly, in realistic networks (where a comparison with exact results was possible) this algorithm delivered results that are very close or equal to the exact values. This phenomenon can probably be attributed to the particular topology of cellular signaling and regulatory networks which contain a relatively low number of negative feedback loops. Conclusion The calculation of shortest positive
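
    The abstract does not reproduce the authors' algorithms, but one standard way to obtain exact shortest positive/negative paths in an unweighted signed digraph is to split each node by the parity of negative edges traversed and run a breadth-first search on the expanded graph; the sketch below illustrates only that generic idea.

```python
# Generic illustration (not the paper's algorithms): shortest positive and
# negative paths in an unweighted signed digraph via a parity-expanded BFS.
from collections import deque

def shortest_signed_paths(edges, source):
    """edges: iterable of (u, v, sign) with sign in {+1, -1}.
    Returns a dict mapping (node, +1/-1) to the length of the shortest path
    from source with an even (+1) or odd (-1) number of negative edges."""
    adjacency = {}
    for u, v, s in edges:
        adjacency.setdefault(u, []).append((v, s))

    dist = {(source, 0): 0}               # state = (node, parity of negative edges)
    queue = deque([(source, 0)])
    while queue:
        node, parity = queue.popleft()
        for nxt, sign in adjacency.get(node, []):
            state = (nxt, parity ^ (1 if sign < 0 else 0))
            if state not in dist:
                dist[state] = dist[(node, parity)] + 1
                queue.append(state)
    return {(n, +1 if p == 0 else -1): d for (n, p), d in dist.items()}

# Toy network: A activates B, B inhibits C, A activates C.
paths = shortest_signed_paths([("A", "B", +1), ("B", "C", -1), ("A", "C", +1)], "A")
print(paths[("C", +1)], paths[("C", -1)])   # 1 (shortest positive), 2 (shortest negative)
```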

  9. Exploiting graphics processing units for computational biology and bioinformatics.

    Science.gov (United States)

    Payne, Joshua L; Sinnott-Armstrong, Nicholas A; Moore, Jason H

    2010-09-01

    Advances in the video gaming industry have led to the production of low-cost, high-performance graphics processing units (GPUs) that possess more memory bandwidth and computational capability than central processing units (CPUs), the standard workhorses of scientific computing. With the recent release of general-purpose GPUs and NVIDIA's GPU programming language, CUDA, graphics engines are being adopted widely in scientific computing applications, particularly in the fields of computational biology and bioinformatics. The goal of this article is to concisely present an introduction to GPU hardware and programming, aimed at the computational biologist or bioinformaticist. To this end, we discuss the primary differences between GPU and CPU architecture, introduce the basics of the CUDA programming language, and discuss important CUDA programming practices, such as the proper use of coalesced reads, data types, and memory hierarchies. We highlight each of these topics in the context of computing the all-pairs distance between instances in a dataset, a common procedure in numerous disciplines of scientific computing. We conclude with a runtime analysis of the GPU and CPU implementations of the all-pairs distance calculation. We show our final GPU implementation to outperform the CPU implementation by a factor of 1700.
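
    The all-pairs distance computation used as the running example is easy to state; the NumPy sketch below expresses the same calculation that a CUDA kernel would parallelize, one thread (or tile) per (i, j) pair. The dataset size and contents here are arbitrary.

```python
# The review's running example, expressed on the CPU with NumPy broadcasting:
# all-pairs Euclidean distances between instances in a dataset. A CUDA kernel
# would assign one thread (or tile) per (i, j) pair of this same grid.
import numpy as np

rng = np.random.default_rng(2)
data = rng.normal(size=(500, 16))               # 500 instances, 16 features

diff = data[:, None, :] - data[None, :, :]      # shape (500, 500, 16)
distances = np.sqrt((diff ** 2).sum(axis=-1))   # shape (500, 500)
print(distances.shape, float(distances[0, 1]))
```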

  10. An interdepartmental Ph.D. program in computational biology and bioinformatics: the Yale perspective.

    Science.gov (United States)

    Gerstein, Mark; Greenbaum, Dov; Cheung, Kei; Miller, Perry L

    2007-02-01

    Computational biology and bioinformatics (CBB), terms that are often used interchangeably, represent a rapidly evolving biological discipline. With the clear potential for discovery and innovation, and the need to deal with the deluge of biological data, many academic institutions are committing significant resources to develop CBB research and training programs. Yale formally established an interdepartmental Ph.D. program in CBB in May 2003. This paper describes Yale's program, discussing the scope of the field, the program's goals and curriculum, as well as a number of issues that arose in implementing the program. (Further updated information is available from the program's website, www.cbb.yale.edu.)

  11. Natural computing for mechanical systems research: A tutorial overview

    Science.gov (United States)

    Worden, Keith; Staszewski, Wieslaw J.; Hensman, James J.

    2011-01-01

    A great many computational algorithms developed over the past half-century have been motivated or suggested by biological systems or processes, the most well-known being the artificial neural networks. These algorithms are commonly grouped together under the terms soft or natural computing. A property shared by most natural computing algorithms is that they allow exploration of, or learning from, data. This property has proved extremely valuable in the solution of many diverse problems in science and engineering. The current paper is intended as a tutorial overview of the basic theory of some of the most common methods of natural computing as they are applied in the context of mechanical systems research. The application of some of the main algorithms is illustrated using case studies. The paper also attempts to give some indication as to which of the algorithms emerging now from the machine learning community are likely to be important for mechanical systems research in the future.

  12. Research Institute for Advanced Computer Science

    Science.gov (United States)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2000-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth; (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to a

  13. Computer applications in controlled fusion research

    International Nuclear Information System (INIS)

    Killeen, J.

    1975-01-01

    The application of computers to controlled thermonuclear research (CTR) is essential. In the near future the use of computers in the numerical modeling of fusion systems should increase substantially. A recent panel has identified five categories of computational models to study the physics of magnetically confined plasmas. A comparable number of types of models for engineering studies is called for. The development and application of computer codes to implement these models is a vital step in reaching the goal of fusion power. To meet the needs of the fusion program the National CTR Computer Center has been established at the Lawrence Livermore Laboratory. A large central computing facility is linked to smaller computing centers at each of the major CTR Laboratories by a communication network. The crucial element needed for success is trained personnel. The number of people with knowledge of plasma science and engineering trained in numerical methods and computer science must be increased substantially in the next few years. Nuclear engineering departments should encourage students to enter this field and provide the necessary courses and research programs in fusion computing

  14. Computer applications in controlled fusion research

    International Nuclear Information System (INIS)

    Killeen, J.

    1975-02-01

    The role of Nuclear Engineering Education in the application of computers to controlled fusion research can be a very important one. In the near future the use of computers in the numerical modelling of fusion systems should increase substantially. A recent study group has identified five categories of computational models to study the physics of magnetically confined plasmas. A comparable number of types of models for engineering studies are called for. The development and application of computer codes to implement these models is a vital step in reaching the goal of fusion power. In order to meet the needs of the fusion program the National CTR Computer Center has been established at the Lawrence Livermore Laboratory. A large central computing facility is linked to smaller computing centers at each of the major CTR laboratories by a communications network. The crucial element that is needed for success is trained personnel. The number of people with knowledge of plasma science and engineering that are trained in numerical methods and computer science is quite small, and must be increased substantially in the next few years. Nuclear Engineering departments should encourage students to enter this field and provide the necessary courses and research programs in fusion computing. (U.S.)

  15. Delivering The Benefits of Chemical-Biological Integration in Computational Toxicology at the EPA (ACS Fall meeting)

    Science.gov (United States)

    Abstract: Researchers at the EPA’s National Center for Computational Toxicology integrate advances in biology, chemistry, and computer science to examine the toxicity of chemicals and help prioritize chemicals for further research based on potential human health risks. The intent...

  16. pClone: Synthetic Biology Tool Makes Promoter Research Accessible to Beginning Biology Students

    Science.gov (United States)

    Campbell, A. Malcolm; Eckdahl, Todd; Cronk, Brian; Andresen, Corinne; Frederick, Paul; Huckuntod, Samantha; Shinneman, Claire; Wacker, Annie; Yuan, Jason

    2014-01-01

    The "Vision and Change" report recommended genuine research experiences for undergraduate biology students. Authentic research improves science education, increases the number of scientifically literate citizens, and encourages students to pursue research. Synthetic biology is well suited for undergraduate research and is a growing area…

  17. Plant biology research and training for the 21st century

    Energy Technology Data Exchange (ETDEWEB)

    Kelly, K. [ed.

    1992-12-31

    The committee was assembled in response to a request from the National Science Foundation (NSF), the US Department of Agriculture (USDA), and the US Department of Energy (DoE). The leadership of these agencies asked the National Academy of Sciences through the National Research Council (NRC) to assess the status of plant-science research in the United States in light of the opportunities arising from advances in other areas of biology. NRC was asked to suggest ways of accelerating the application of these new biologic concepts and tools to research in plant science with the aim of enhancing the acquisition of new knowledge about plants. The charge to the committee was to examine the following: Organizations, departments, and institutions conducting plant biology research; human resources involved in plant biology research; graduate training programs in plant biology; federal, state, and private sources of support for plant-biology research; the role of industry in conducting and supporting plant-biology research; the international status of US plant-biology research; and the relationship of plant biology to leading-edge research in biology.

  18. Plant biology research and training for the 21st century

    Energy Technology Data Exchange (ETDEWEB)

    Kelly, K. (ed.)

    1992-01-01

    The committee was assembled in response to a request from the National Science Foundation (NSF), the US Department of Agriculture (USDA), and the US Department of Energy (DoE). The leadership of these agencies asked the National Academy of Sciences through the National Research Council (NRC) to assess the status of plant-science research in the United States in light of the opportunities arising from advances in other areas of biology. NRC was asked to suggest ways of accelerating the application of these new biologic concepts and tools to research in plant science with the aim of enhancing the acquisition of new knowledge about plants. The charge to the committee was to examine the following: Organizations, departments, and institutions conducting plant biology research; human resources involved in plant biology research; graduate training programs in plant biology; federal, state, and private sources of support for plant-biology research; the role of industry in conducting and supporting plant-biology research; the international status of US plant-biology research; and the relationship of plant biology to leading-edge research in biology.

  19. Convolutional Deep Belief Networks for Single-Cell/Object Tracking in Computational Biology and Computer Vision

    OpenAIRE

    Zhong, Bineng; Pan, Shengnan; Zhang, Hongbo; Wang, Tian; Du, Jixiang; Chen, Duansheng; Cao, Liujuan

    2016-01-01

    In this paper, we propose a deep architecture to dynamically learn the most discriminative features from data for both single-cell and object tracking in computational biology and computer vision. Firstly, the discriminative features are automatically learned via a convolutional deep belief network (CDBN). Secondly, we design a simple yet effective method to transfer features learned from CDBNs on generic-purpose source tasks to the object tracking tasks using only a limited amount of tra...

  20. Application of Computational Methods in Planaria Research: A Current Update

    Directory of Open Access Journals (Sweden)

    Ghosh Shyamasree

    2017-07-01

    Full Text Available Planaria is a member of the phylum Platyhelminthes, the flatworms. Planarians possess the unique ability to regenerate from adult stem cells, or neoblasts, and are important as a model organism for regeneration and developmental studies. Although research is being actively carried out worldwide, using conventional methods, to understand the process of regeneration from neoblasts and the developmental biology, neurobiology and immunology of Planaria, many thought-provoking questions remain about stem cell plasticity and the uniqueness of the regenerative potential of Planarians among other members of the Phylum Platyhelminthes. The complexity of receptors and signalling mechanisms, the immune system network, the biology of repair and the responses to injury are yet to be understood in Planaria. Genomic and transcriptomic studies have generated a vast repository of data, but their availability and analysis remain challenging. Data mining, computational approaches to gene curation, bioinformatics tools for the analysis of transcriptomic data, the design of databases, and the application of algorithms to deciphering morphological changes produced by RNA interference (RNAi) approaches and to understanding regeneration experiments form a new venture in Planaria research that is helping researchers across the globe to understand the biology. We highlight the application of Hidden Markov models (HMMs) to the design of computational tools and their use in decoding the complex biology of Planaria.
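
    As a concrete, deliberately toy illustration of the HMM machinery such tools build on, the following fragment implements the forward algorithm for a two-state model with invented probabilities; it is not taken from any Planaria pipeline.

```python
# Minimal, generic HMM forward algorithm of the kind HMM-based annotation
# tools build on. States, symbols and probabilities are toy values and are
# not taken from any Planaria pipeline.
import numpy as np

symbols = {"A": 0, "C": 1, "G": 2, "T": 3}
start = np.array([0.5, 0.5])                  # two hidden states
trans = np.array([[0.9, 0.1],
                  [0.2, 0.8]])                # state transition probabilities
emit = np.array([[0.2, 0.3, 0.3, 0.2],        # emission probabilities, state 0
                 [0.3, 0.2, 0.2, 0.3]])       # emission probabilities, state 1

def forward_likelihood(sequence):
    obs = [symbols[ch] for ch in sequence]
    alpha = start * emit[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ trans) * emit[:, o]  # standard forward recursion
    return alpha.sum()

print(forward_likelihood("GATTACA"))
```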

  1. Center for Biologics Evaluation and Research (CBER)

    Data.gov (United States)

    Federal Laboratory Consortium — CBER is the Center within FDA that regulates biological products for human use under applicable federal laws, including the Public Health Service Act and the Federal...

  2. Postharvest biology and technology research and development ...

    African Journals Online (AJOL)

    STORAGESEVER

    2008-09-03

    Sep 3, 2008 ... The applications of biological control agents in pre- and post-harvest operations and .... production, with regards to food safety, operator health and the ... and to work out sustainable compliance modalities for small-scale ...

  3. WE-DE-202-00: Connecting Radiation Physics with Computational Biology

    International Nuclear Information System (INIS)

    2016-01-01

    processes are too complex for a mechanistic approach. Can computer simulations be used to guide future biological research? We will debate the feasibility of explaining biology from a physicist's perspective. Learning Objectives: Understand the potential applications and limitations of computational methods for dose-response modeling at the molecular, cellular and tissue levels; Learn about the mechanisms of action underlying the induction, repair and biological processing of damage to DNA and other constituents; Understand how effects and processes at one biological scale impact on biological processes and outcomes at other scales. J. Schuemann, NCI/NIH grants; S. McMahon, Funding: European Commission FP7 (grant EC FP7 MC-IOF-623630)

  4. WE-DE-202-00: Connecting Radiation Physics with Computational Biology

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2016-06-15

    processes are too complex for a mechanistic approach. Can computer simulations be used to guide future biological research? We will debate the feasibility of explaining biology from a physicist's perspective. Learning Objectives: Understand the potential applications and limitations of computational methods for dose-response modeling at the molecular, cellular and tissue levels; Learn about the mechanisms of action underlying the induction, repair and biological processing of damage to DNA and other constituents; Understand how effects and processes at one biological scale impact on biological processes and outcomes at other scales. J. Schuemann, NCI/NIH grants; S. McMahon, Funding: European Commission FP7 (grant EC FP7 MC-IOF-623630)

  5. A framework to establish credibility of computational models in biology.

    Science.gov (United States)

    Patterson, Eann A; Whelan, Maurice P

    2017-10-01

    Computational models in biology and biomedical science are often constructed to aid people's understanding of phenomena or to inform decisions with socioeconomic consequences. Model credibility is the willingness of people to trust a model's predictions and is often difficult to establish for computational biology models. A 3 × 3 matrix has been proposed to allow such models to be categorised with respect to their testability and epistemic foundation in order to guide the selection of an appropriate process of validation to supply evidence to establish credibility. Three approaches to validation are identified that can be deployed depending on whether a model is deemed untestable, testable or lies somewhere in between. In the latter two cases, the validation process involves the quantification of uncertainty which is a key output. The issues arising due to the complexity and inherent variability of biological systems are discussed and the creation of 'digital twins' proposed as a means to alleviate the issues and provide a more robust, transparent and traceable route to model credibility and acceptance. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  6. Accelerating cancer systems biology research through Semantic Web technology.

    Science.gov (United States)

    Wang, Zhihui; Sagotsky, Jonathan; Taylor, Thomas; Shironoshita, Patrick; Deisboeck, Thomas S

    2013-01-01

    Cancer systems biology is an interdisciplinary, rapidly expanding research field in which collaborations are a critical means to advance the field. Yet the prevalent database technologies often isolate data rather than making it easily accessible. The Semantic Web has the potential to help facilitate web-based collaborative cancer research by presenting data in a manner that is self-descriptive, human and machine readable, and easily sharable. We have created a semantically linked online Digital Model Repository (DMR) for storing, managing, executing, annotating, and sharing computational cancer models. Within the DMR, distributed, multidisciplinary, and inter-organizational teams can collaborate on projects, without forfeiting intellectual property. This is achieved by the introduction of a new stakeholder to the collaboration workflow, the institutional licensing officer, part of the Technology Transfer Office. Furthermore, the DMR has achieved silver level compatibility with the National Cancer Institute's caBIG, so users can interact with the DMR not only through a web browser but also through a semantically annotated and secure web service. We also discuss the technology behind the DMR leveraging the Semantic Web, ontologies, and grid computing to provide secure inter-institutional collaboration on cancer modeling projects, online grid-based execution of shared models, and the collaboration workflow protecting researchers' intellectual property. Copyright © 2012 Wiley Periodicals, Inc.

  7. Final report for Conference Support Grant "From Computational Biophysics to Systems Biology - CBSB12"

    Energy Technology Data Exchange (ETDEWEB)

    Hansmann, Ulrich H.E.

    2012-07-02

    This report summarizes the outcome of the international workshop From Computational Biophysics to Systems Biology (CBSB12) which was held June 3-5, 2012, at the University of Tennessee Conference Center in Knoxville, TN, and supported by DOE through the Conference Support Grant 120174. The purpose of CBSB12 was to provide a forum for the interaction between a data-mining interested systems biology community and a simulation and first-principle oriented computational biophysics/biochemistry community. CBSB12 was the sixth in a series of workshops of the same name organized in recent years, and the second that has been held in the USA. As in previous years, it gave researchers from physics, biology, and computer science an opportunity to acquaint each other with current trends in computational biophysics and systems biology, to explore venues of cooperation, and to establish together a detailed understanding of cells at a molecular level. The conference grant of $10,000 was used to cover registration fees and provide travel fellowships to selected students and postdoctoral scientists. By educating graduate students and providing a forum for young scientists to perform research into the working of cells at a molecular level, the workshop adds to DOE's mission of paving the way to exploit the abilities of living systems to capture, store and utilize energy.

  8. Research Directions for AI in Computer Games

    OpenAIRE

    Fairclough, Chris; Fagan, Michael; Cunningham, Padraig; Mac Namee, Brian

    2001-01-01

    The computer games industry is now bigger than the film industry. Until recently, technology in games was driven by a desire to achieve real-time, photo-realistic graphics. To a large extent, this has now been achieved. As game developers look for new and innovative technologies to drive games development, AI is coming to the fore. This paper will examine how sophisticated AI techniques, such as those being used in mainstream academic research, can be applied to computer games ...

  9. Computer modeling in developmental biology: growing today, essential tomorrow.

    Science.gov (United States)

    Sharpe, James

    2017-12-01

    D'Arcy Thompson was a true pioneer, applying mathematical concepts and analyses to the question of morphogenesis over 100 years ago. The centenary of his famous book, On Growth and Form, is therefore a great occasion on which to review the types of computer modeling now being pursued to understand the development of organs and organisms. Here, I present some of the latest modeling projects in the field, covering a wide range of developmental biology concepts, from molecular patterning to tissue morphogenesis. Rather than classifying them according to scientific question, or scale of problem, I focus instead on the different ways that modeling contributes to the scientific process and discuss the likely future of modeling in developmental biology. © 2017. Published by The Company of Biologists Ltd.

  10. iTools: a framework for classification, categorization and integration of computational biology resources.

    Directory of Open Access Journals (Sweden)

    Ivo D Dinov

    2008-05-01

    Full Text Available The advancement of the computational biology field hinges on progress in three fundamental directions--the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources--data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long

  11. Computational biology in the cloud: methods and new insights from computing at scale.

    Science.gov (United States)

    Kasson, Peter M

    2013-01-01

    The past few years have seen both explosions in the size of biological data sets and the proliferation of new, highly flexible on-demand computing capabilities. The sheer amount of information available from genomic and metagenomic sequencing, high-throughput proteomics, experimental and simulation datasets on molecular structure and dynamics affords an opportunity for greatly expanded insight, but it creates new challenges of scale for computation, storage, and interpretation of petascale data. Cloud computing resources have the potential to help solve these problems by offering a utility model of computing and storage: near-unlimited capacity, the ability to burst usage, and cheap and flexible payment models. Effective use of cloud computing on large biological datasets requires dealing with non-trivial problems of scale and robustness, since performance-limiting factors can change substantially when a dataset grows by a factor of 10,000 or more. New computing paradigms are thus often needed. The use of cloud platforms also creates new opportunities to share data, reduce duplication, and to provide easy reproducibility by making the datasets and computational methods easily available.

  12. Parallel computing and molecular dynamics of biological membranes

    International Nuclear Information System (INIS)

    La Penna, G.; Letardi, S.; Minicozzi, V.; Morante, S.; Rossi, G.C.; Salina, G.

    1998-01-01

    In this talk I discuss the general question of the portability of molecular dynamics codes for diffusive systems on parallel computers of the APE family. The intrinsic single precision of the platforms available today does not seem to affect the numerical accuracy of the simulations, while the absence of integer addressing from the CPU to individual nodes puts strong constraints on possible programming strategies. Liquids can be satisfactorily simulated using the "systolic" method. For more complex systems, such as the biological ones in which we are ultimately interested, the "domain decomposition" approach is best suited to beat the quadratic growth of the inter-molecular computational time with the number of atoms in the system. The promising prospects of using this strategy for extensive simulations of lipid bilayers are briefly reviewed. (orig.)
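
    The "domain decomposition" strategy mentioned above can be illustrated with a toy cell-list sketch: particles are binned into spatial cells no smaller than the interaction cut-off, so candidate interacting pairs are only sought within the same or neighbouring cells rather than over all O(N²) pairs, and the cells can then be distributed over processors. The sketch below is illustrative only; the box size, cut-off, and pure-Python style are assumptions, not the APE implementation discussed in the abstract.

```python
# Minimal cell-list (domain decomposition) sketch; parameters are illustrative.
import random
from collections import defaultdict

BOX = 10.0      # edge length of a cubic, periodic simulation box (assumed)
CUTOFF = 2.5    # interaction cut-off radius (assumed)

def build_cells(positions, box=BOX, cutoff=CUTOFF):
    """Assign each particle index to a 3D cell whose edge is >= the cut-off."""
    ncell = max(1, int(box // cutoff))
    size = box / ncell
    cells = defaultdict(list)
    for i, (x, y, z) in enumerate(positions):
        cells[(int(x // size) % ncell,
               int(y // size) % ncell,
               int(z // size) % ncell)].append(i)
    return cells, ncell

def neighbour_pairs(positions):
    """Yield candidate interacting pairs using only same/neighbouring cells."""
    cells, ncell = build_cells(positions)
    for (cx, cy, cz), members in cells.items():
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for dz in (-1, 0, 1):
                    other = ((cx + dx) % ncell, (cy + dy) % ncell, (cz + dz) % ncell)
                    for i in members:
                        for j in cells.get(other, []):
                            if i < j:           # each pair reported once
                                yield i, j

if __name__ == "__main__":
    pts = [tuple(random.uniform(0, BOX) for _ in range(3)) for _ in range(100)]
    n_candidates = sum(1 for _ in neighbour_pairs(pts))
    print("candidate pairs:", n_candidates, "out of", 100 * 99 // 2, "total pairs")
```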

  13. TORCH Computational Reference Kernels - A Testbed for Computer Science Research

    Energy Technology Data Exchange (ETDEWEB)

    Kaiser, Alex; Williams, Samuel Webb; Madduri, Kamesh; Ibrahim, Khaled; Bailey, David H.; Demmel, James W.; Strohmaier, Erich

    2010-12-02

    For decades, computer scientists have sought guidance on how to evolve architectures, languages, and programming models in order to improve application performance, efficiency, and productivity. Unfortunately, without overarching advice about future directions in these areas, individual guidance is inferred from the existing software/hardware ecosystem, and each discipline often conducts its research independently, assuming all other technologies remain fixed. In today's rapidly evolving world of on-chip parallelism, isolated and iterative improvements to performance may miss superior solutions in the same way gradient descent optimization techniques may get stuck in local minima. To combat this, we present TORCH: A Testbed for Optimization ResearCH. These computational reference kernels define the core problems of interest in scientific computing without mandating a specific language, algorithm, programming model, or implementation. To complement the kernel (problem) definitions, we provide a set of algorithmically-expressed verification tests that can be used to verify that a hardware/software co-designed solution produces an acceptable answer. Finally, to provide some illumination as to how researchers have implemented solutions to these problems in the past, we provide a set of reference implementations in C and MATLAB.
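
    The abstract's separation between a kernel's problem definition and an algorithmically expressed verification test can be sketched as follows. This is not TORCH code; the matrix-vector kernel, the candidate implementation, and the tolerance are invented for illustration.

```python
# Sketch of a reference kernel plus a verification test (assumed, not TORCH code).
import random

def reference_matvec(A, x):
    """Straightforward statement of the kernel: y_i = sum_j A[i][j] * x[j]."""
    return [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) for row in A]

def candidate_matvec(A, x):
    """Stand-in for a co-designed implementation (here: loop-reordered)."""
    y = [0.0] * len(A)
    for j, x_j in enumerate(x):
        for i, row in enumerate(A):
            y[i] += row[j] * x_j
    return y

def verify(A, x, candidate, rel_tol=1e-12):
    """Verification test: relative error of the candidate vs. the reference."""
    y_ref = reference_matvec(A, x)
    y_new = candidate(A, x)
    err = max(abs(a - b) for a, b in zip(y_ref, y_new))
    scale = max(abs(v) for v in y_ref) or 1.0
    return err / scale <= rel_tol

if __name__ == "__main__":
    n = 64
    A = [[random.random() for _ in range(n)] for _ in range(n)]
    x = [random.random() for _ in range(n)]
    print("candidate accepted:", verify(A, x, candidate_matvec))
```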

  14. MACBenAbim: A Multi-platform Mobile Application for searching keyterms in Computational Biology and Bioinformatics.

    Science.gov (United States)

    Oluwagbemi, Olugbenga O; Adewumi, Adewole; Esuruoso, Abimbola

    2012-01-01

    Computational biology and bioinformatics are gradually gaining ground in Africa and other developing nations of the world. However, in these countries, the challenges for computational biology and bioinformatics education include inadequate infrastructure and a lack of readily available complementary and motivational tools to support learning as well as research. This has discouraged many promising undergraduates, postgraduates and researchers from aspiring to undertake future study in these fields. In this paper, we develop and describe MACBenAbim (Multi-platform Mobile Application for Computational Biology and Bioinformatics), a flexible, user-friendly tool to search for, define and describe the meanings of key terms in computational biology and bioinformatics, thus expanding the frontiers of knowledge of its users. The tool can also visualize results in a mobile, multi-platform context. MACBenAbim is available from the authors for non-commercial purposes.

  15. A Computational Architecture for Programmable Automation Research

    Science.gov (United States)

    Taylor, Russell H.; Korein, James U.; Maier, Georg E.; Durfee, Lawrence F.

    1987-03-01

    This short paper describes recent work at the IBM T. J. Watson Research Center directed at developing a highly flexible computational architecture for research on sensor-based programmable automation. The system described here has been designed with a focus on dynamic configurability, layered user interfaces and incorporation of sensor-based real-time operations into new commands. It is these features which distinguish it from earlier work. The system is currently being implemented at IBM for research purposes and internal use and is an outgrowth of programmable automation research which has been ongoing since 1972 [e.g., 1, 2, 3, 4, 5, 6].

  16. Architecture, systems research and computational sciences

    CERN Document Server

    2012-01-01

    The Winter 2012 (vol. 14 no. 1) issue of the Nexus Network Journal is dedicated to the theme “Architecture, Systems Research and Computational Sciences”. This is an outgrowth of the session by the same name which took place during the eighth international, interdisciplinary conference “Nexus 2010: Relationships between Architecture and Mathematics”, held in Porto, Portugal, in June 2010. Today computer science is an integral part of even strictly historical investigations, such as those concerning the construction of vaults, where the computer is used to survey the existing building, analyse the data and draw the ideal solution. What the papers in this issue make especially evident is that information technology has had an impact at a much deeper level as well: architecture itself can now be considered as a manifestation of information and as a complex system. The issue is completed with other research papers, conference reports and book reviews.

  17. Cloud Computing Technologies Facilitate Earth Research

    Science.gov (United States)

    2015-01-01

    Under a Space Act Agreement, NASA partnered with Seattle-based Amazon Web Services to make the agency's climate and Earth science satellite data publicly available on the company's servers. Users can access the data for free, but they can also pay to use Amazon's computing services to analyze and visualize information using the same software available to NASA researchers.

  18. Research Collaboration Workshop for Women in Mathematical Biology

    CERN Document Server

    Miller, Laura

    2017-01-01

    Inspired by the Research Collaboration Workshop for Women in Mathematical Biology, this volume contains research and review articles that cover topics ranging from models of animal movement to the flow of blood cells in the embryonic heart. Hosted by the National Institute for Mathematics and Biological Synthesis (NIMBioS), the workshop brought together women working in biology and mathematics to form four research groups that encouraged multidisciplinary collaboration and lifetime connections in the STEM field. This volume introduces many of the topics from the workshop, including the aerodynamics of spider ballooning; sleep, circadian rhythms, and pain; blood flow regulation in the kidney; and the effects of antimicrobial therapy on gut microbiota and Clostridium difficile. Perfect for students and researchers in mathematics and biology, the papers included in this volume offer an introductory glimpse at recent research in mathematical biology.

  19. Research computing in a distributed cloud environment

    International Nuclear Information System (INIS)

    Fransham, K; Agarwal, A; Armstrong, P; Bishop, A; Charbonneau, A; Desmarais, R; Hill, N; Gable, I; Gaudet, S; Goliath, S; Impey, R; Leavett-Brown, C; Ouellete, J; Paterson, M; Pritchet, C; Penfold-Brown, D; Podaima, W; Schade, D; Sobie, R J

    2010-01-01

    The recent increase in availability of Infrastructure-as-a-Service (IaaS) computing clouds provides a new way for researchers to run complex scientific applications. However, using cloud resources for a large number of research jobs requires significant effort and expertise. Furthermore, running jobs on many different clouds presents even more difficulty. In order to make it easy for researchers to deploy scientific applications across many cloud resources, we have developed a virtual machine resource manager (Cloud Scheduler) for distributed compute clouds. In response to a user's job submission to a batch system, the Cloud Scheduler manages the distribution and deployment of user-customized virtual machines across multiple clouds. We describe the motivation for and implementation of a distributed cloud using the Cloud Scheduler that is spread across both commercial and dedicated private sites, and present some early results of scientific data analysis using the system.
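
    The scheduling idea described above, booting user-customized virtual machines on whichever cloud has spare capacity in response to queued jobs, can be caricatured in a few lines. The sketch below uses invented class and cloud names and a simple greedy placement rule; it is not the Cloud Scheduler source.

```python
# Toy illustration of VM placement across IaaS clouds (names and rule assumed).
from dataclasses import dataclass

@dataclass
class Cloud:
    name: str
    capacity: int        # maximum number of VMs this cloud can host
    running: int = 0     # VMs currently booted on this cloud

    def free_slots(self) -> int:
        return self.capacity - self.running

@dataclass
class Job:
    job_id: int
    vm_image: str        # user-customized VM image the job requires

def schedule(jobs, clouds):
    """Greedy placement: each job goes to the cloud with the most free slots."""
    placements = []
    for job in jobs:
        target = max(clouds, key=lambda c: c.free_slots())
        if target.free_slots() <= 0:
            placements.append((job.job_id, None))   # job waits in the batch queue
            continue
        target.running += 1                          # "boot" a VM (simulated)
        placements.append((job.job_id, target.name))
    return placements

if __name__ == "__main__":
    clouds = [Cloud("research-cloud", 3), Cloud("commercial-cloud", 2)]
    jobs = [Job(i, "analysis-image") for i in range(7)]
    for job_id, where in schedule(jobs, clouds):
        print(f"job {job_id} -> {where or 'queued (no free slots)'}")
```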

  20. Convolutional Deep Belief Networks for Single-Cell/Object Tracking in Computational Biology and Computer Vision.

    Science.gov (United States)

    Zhong, Bineng; Pan, Shengnan; Zhang, Hongbo; Wang, Tian; Du, Jixiang; Chen, Duansheng; Cao, Liujuan

    2016-01-01

    In this paper, we propose a deep architecture that dynamically learns the most discriminative features from data for both single-cell and object tracking in computational biology and computer vision. Firstly, discriminative features are automatically learned via a convolutional deep belief network (CDBN). Secondly, we design a simple yet effective method to transfer features learned by CDBNs on generic source tasks to object tracking tasks using only a limited amount of training data. Finally, to alleviate the tracker drifting problem caused by model updating, we jointly consider three different types of positive samples. Extensive experiments validate the robustness and effectiveness of the proposed method.

  1. Complex fluids in biological systems experiment, theory, and computation

    CERN Document Server

    2015-01-01

    This book serves as an introduction to the continuum mechanics and mathematical modeling of complex fluids in living systems. The form and function of living systems are intimately tied to the nature of surrounding fluid environments, which commonly exhibit nonlinear and history dependent responses to forces and displacements. With ever-increasing capabilities in the visualization and manipulation of biological systems, research on the fundamental phenomena, models, measurements, and analysis of complex fluids has taken a number of exciting directions. In this book, many of the world’s foremost experts explore key topics such as: Macro- and micro-rheological techniques for measuring the material properties of complex biofluids and the subtleties of data interpretation Experimental observations and rheology of complex biological materials, including mucus, cell membranes, the cytoskeleton, and blood The motility of microorganisms in complex fluids and the dynamics of active suspensions Challenges and solut...

  2. A research program in empirical computer science

    Science.gov (United States)

    Knight, J. C.

    1991-01-01

    During the grant reporting period our primary activities have been to begin preparation for the establishment of a research program in experimental computer science. The focus of research in this program will be safety-critical systems. Many questions that arise in the effort to improve software dependability can only be addressed empirically. For example, there is no way to predict the performance of the various proposed approaches to building fault-tolerant software. Performance models, though valuable, are parameterized and cannot be used to make quantitative predictions without experimental determination of underlying distributions. In the past, experimentation has been able to shed some light on the practical benefits and limitations of software fault tolerance. It is common, also, for experimentation to reveal new questions or new aspects of problems that were previously unknown. A good example is the Consistent Comparison Problem that was revealed by experimentation and subsequently studied in depth. The result was a clear understanding of a previously unknown problem with software fault tolerance. The purpose of a research program in empirical computer science is to perform controlled experiments in the area of real-time, embedded control systems. The goal of the various experiments will be to determine better approaches to the construction of the software for computing systems that have to be relied upon. As such it will validate research concepts from other sources, provide new research results, and facilitate the transition of research results from concepts to practical procedures that can be applied with low risk to NASA flight projects. The target of experimentation will be the production software development activities undertaken by any organization prepared to contribute to the research program. Experimental goals, procedures, data analysis and result reporting will be performed for the most part by the University of Virginia.
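
    The Consistent Comparison Problem mentioned above can be demonstrated with a small, contrived numerical example: two independently written, individually correct program versions accumulate the same terms in a different order and then compare the result against a shared threshold, so tiny floating-point differences make them branch differently and defeat simple majority voting. The terms and threshold below are invented for illustration and are not taken from the experiments described in the abstract.

```python
# Invented demonstration of the Consistent Comparison Problem in N-version software.
THRESHOLD = 0.5  # shared decision threshold (arbitrary for this sketch)

# Terms chosen so that the floating-point rounding depends on summation order.
TERMS = [1.0, 1e16, -1e16]

def version_a(terms):
    """Sums left to right, then compares against the threshold."""
    total = 0.0
    for t in terms:
        total += t
    return total > THRESHOLD

def version_b(terms):
    """Sums right to left -- algebraically identical, numerically not."""
    total = 0.0
    for t in reversed(terms):
        total += t
    return total > THRESHOLD

if __name__ == "__main__":
    print("version A decision:", version_a(TERMS))  # False (sum rounds to 0.0)
    print("version B decision:", version_b(TERMS))  # True  (sum rounds to 1.0)
```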

  3. Current research in Radiation Biology and Biochemistry Division

    International Nuclear Information System (INIS)

    Tarachand, U.; Singh, B.B.

    1995-01-01

    The Radiation Biology and Biochemistry Division, Bhabha Atomic Research Centre, Bombay has been engaged in research in the frontier areas of (i) radiation biology related to tumour therapy and injury caused by free radicals; (ii) the molecular basis of diseases of physiological origin; (iii) molecular aspects of chemical carcinogenesis and (iv) the structure of the genome and genome-related functions. The gist of the research and development activities carried out in the Division during the last two years is documented.

  4. Current research in Radiation Biology and Biochemistry Division

    Energy Technology Data Exchange (ETDEWEB)

    Tarachand, U; Singh, B B [eds.; Bhabha Atomic Research Centre, Bombay (India). Radiation Biology and Biochemistry Div.

    1996-12-31

    The Radiation Biology and Biochemistry Division, Bhabha Atomic Research Centre, Bombay has been engaged in research in the frontier areas of (i) radiation biology related to tumour therapy and injury caused by free radicals; (ii) the molecular basis of diseases of physiological origin; (iii) molecular aspects of chemical carcinogenesis and (iv) the structure of the genome and genome-related functions. The gist of the research and development activities carried out in the Division during the last two years is documented.

  5. Complex network problems in physics, computer science and biology

    Science.gov (United States)

    Cojocaru, Radu Ionut

    There is a close relation between physics and mathematics, and the exchange of ideas between these two sciences is well established. However, until a few years ago there was no such close relation between physics and computer science. Moreover, only recently have biologists started to use methods and tools from statistical physics to study the behavior of complex systems. In this thesis we concentrate on applying and analyzing several methods borrowed from computer science in biology, and we also use methods from statistical physics to solve hard problems from computer science. In recent years physicists have been interested in studying the behavior of complex networks. Physics is an experimental science in which theoretical predictions are compared to experiments. In this definition, the term prediction plays a very important role: although the system is complex, it is still possible to get predictions for its behavior, but these predictions are of a probabilistic nature. Spin glasses, lattice gases and the Potts model are a few examples of complex systems in physics. Spin glasses and many frustrated antiferromagnets map exactly to computer science problems in the NP-hard class defined in Chapter 1. In Chapter 1 we discuss a common result from artificial intelligence (AI) which shows that some problems are NP-complete, with the implication that these problems are difficult to solve. We introduce a few well-known hard problems from computer science (Satisfiability, Coloring, Vertex Cover together with Maximum Independent Set, and Number Partitioning) and then discuss their mapping to problems from physics. In Chapter 2 we provide a short review of combinatorial optimization algorithms and their applications to ground state problems in disordered systems. We discuss the cavity method initially developed for studying the Sherrington-Kirkpatrick model of spin glasses. We extend this model to the study of a specific case of a spin glass on the Bethe
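
    One of the mappings alluded to above, from an NP-hard computer science problem to a spin-glass ground-state problem, can be made concrete with Number Partitioning: assign each number a_i a spin s_i = ±1 and minimize the Ising-style energy E = (Σ_i s_i a_i)², whose ground state is the most balanced two-way split. The brute-force sketch below is illustrative only, works for tiny instances, and is not code from the thesis.

```python
# Number Partitioning as an Ising-style ground-state search (illustrative only).
from itertools import product

def energy(spins, numbers):
    """Ising-style cost: squared difference between the two partition sums."""
    return sum(s * a for s, a in zip(spins, numbers)) ** 2

def ground_state(numbers):
    """Brute-force search over all 2^N spin configurations (tiny N only)."""
    best = None
    for spins in product((-1, 1), repeat=len(numbers)):
        e = energy(spins, numbers)
        if best is None or e < best[0]:
            best = (e, spins)
    return best

if __name__ == "__main__":
    numbers = [3, 1, 1, 2, 2, 1]           # invented instance
    e, spins = ground_state(numbers)
    part_a = [a for s, a in zip(spins, numbers) if s == +1]
    part_b = [a for s, a in zip(spins, numbers) if s == -1]
    print("ground-state energy:", e)        # 0 means a perfect partition exists
    print("partitions:", part_a, part_b)
```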

  6. CERR: A computational environment for radiotherapy research

    International Nuclear Information System (INIS)

    Deasy, Joseph O.; Blanco, Angel I.; Clark, Vanessa H.

    2003-01-01

    A software environment is described, called the computational environment for radiotherapy research (CERR, pronounced 'sir'). CERR partially addresses four broad needs in treatment planning research: (a) it provides a convenient and powerful software environment to develop and prototype treatment planning concepts, (b) it serves as a software integration environment to combine treatment planning software written in multiple languages (MATLAB, FORTRAN, C/C++, JAVA, etc.), together with treatment plan information (computed tomography scans, outlined structures, dose distributions, digital films, etc.), (c) it provides the ability to extract treatment plans from disparate planning systems using the widely available AAPM/RTOG archiving mechanism, and (d) it provides a convenient and powerful tool for sharing and reproducing treatment planning research results. The functional components currently being distributed, including source code, include: (1) an import program which converts the widely available AAPM/RTOG treatment planning format into a MATLAB cell-array data object, facilitating manipulation; (2) viewers which display axial, coronal, and sagittal computed tomography images, structure contours, digital films, and isodose lines or dose colorwash, (3) a suite of contouring tools to edit and/or create anatomical structures, (4) dose-volume and dose-surface histogram calculation and display tools, and (5) various predefined commands. CERR allows the user to retrieve any AAPM/RTOG key word information about the treatment plan archive. The code is relatively self-describing, because it relies on MATLAB structure field name definitions based on the AAPM/RTOG standard. New structure field names can be added dynamically or permanently. New components of arbitrary data type can be stored and accessed without disturbing system operation. CERR has been applied to aid research in dose-volume-outcome modeling, Monte Carlo dose calculation, and treatment planning optimization
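
    One of the analysis tools listed above, the cumulative dose-volume histogram, is simple enough to sketch. The snippet below is written in Python rather than CERR's MATLAB, uses an invented per-voxel dose list, and is meant only to show what a DVH computation does, not how CERR implements it.

```python
# Cumulative dose-volume histogram sketch (assumed example, not CERR code).
def cumulative_dvh(doses_gy, bin_width_gy=0.5):
    """Return (dose levels, fraction of voxels receiving at least that dose)."""
    if not doses_gy:
        return [], []
    max_dose = max(doses_gy)
    n_vox = len(doses_gy)
    levels, fractions = [], []
    d = 0.0
    while d <= max_dose:
        levels.append(d)
        fractions.append(sum(1 for v in doses_gy if v >= d) / n_vox)
        d += bin_width_gy
    return levels, fractions

if __name__ == "__main__":
    # Hypothetical per-voxel doses (Gy) for one outlined structure.
    voxel_doses = [0.0, 1.2, 3.4, 10.0, 22.5, 30.1, 30.2, 45.0, 59.8, 60.0]
    for dose, frac in zip(*cumulative_dvh(voxel_doses, bin_width_gy=15.0)):
        print(f"V{dose:>5.1f} Gy: {frac:5.1%} of structure volume")
```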

  7. Current research in Canada on biological effects of ionizing radiation

    International Nuclear Information System (INIS)

    Marko, A.M.

    1980-05-01

    A survey of current research in Canada on the biological effects of ionizing radiation has been compiled. The list of projects has been classified according to structure (organizational state of the test system) as well as according to the type of effects. Using several assumptions, ballpark estimates of expenditures on these activities have been made. Agencies funding these research activities have been tabulated and the break-down of research in government laboratories and in academic institutions has been designated. Wherever possible, comparisons have been made outlining differences or similarities that exist between the United States and Canada concerning biological radiation research. It has been concluded that relevant research in this area in Canada is inadequate. Wherever possible, strengths and weaknesses in radiation biology programs have been indicated. The most promising course for Canada to follow is to support adequately fundamental studies of the biological effects of radiation. (auth)

  8. Synthesis on biological soil crust research

    Science.gov (United States)

    Weber, Bettina; Belnap, Jayne; Buedel, Burkhard

    2016-01-01

    In this closing chapter, we summarize the advances in biocrust research made during the last 1.5 decades. In the first part of the chapter, we discuss how particularly large achievements have been made in some research fields, such as the microbial diversity of fungi, bacteria, and microfauna; the interaction between biocrusts and vascular plants; and the rehabilitation of biocrusts. In other fields, previously established knowledge of overall patterns has been corroborated and refined by additional studies, e.g., in the fields of soil stabilization and disturbance effects. In the second part of the chapter, we outline the research gaps and challenges we foresee. We identify multiple knowledge gaps, including many understudied geographic regions, the largely missing link between genetic and morphological species identification data, and the answers to some mechanistic questions, such as the overall role of biocrusts in hydrology and nutrient cycles. We close this chapter, and the overall book, with some ideas on promising new research questions and approaches.

  9. Contribution to researches in biophysics and biology

    International Nuclear Information System (INIS)

    Luccioni, Catherine

    2000-01-01

    In this accreditation to supervise research, the author presents her curriculum and scientific work, which mainly dealt with the different agents used in chemotherapy. The scientific work addressed anti-carcinogenic pharmacology, applied biophysics, and research in oncology and radiobiology. Current research projects deal with mechanisms of cellular transformation and with the implication of antioxidant metabolism and nucleotide metabolism in cellular radio-sensitivity. Teaching and research supervision activities are also described. Several articles are provided in the appendix: Average quality factor and dose equivalent meter based on microdosimetry techniques; Activity of thymidylate synthetase, thymidine kinase and galactokinase in primary and xenografted human colorectal cancers in relation to their chromosomal patterns; Nucleotide metabolism in human gliomas, relation to the chromosomal profile; Pyrimidine nucleotide metabolism in human colon carcinomas: comparison of normal tissues, primary tumors and xenografts; Modifications of the antioxidant metabolism during proliferation and differentiation of colon tumour cell lines; Modulation of the antioxidant enzymes, p21 and p53 expression during proliferation and differentiation of human melanoma cell lines; Purine metabolism in 2 human melanoma cell lines, relation with proliferation and differentiation; Radiation-induced changes in nucleotide metabolism of 2 colon cancer cell lines with different radio-sensitivities

  10. NCI RNA Biology 2017 symposium recap | Center for Cancer Research

    Science.gov (United States)

    The recent discovery of new classes of RNAs and the demonstration that alterations in RNA metabolism underlie numerous human cancers have resulted in enormous interest among CCR investigators in RNA biology. In order to share the latest research in this exciting field, the CCR Initiative in RNA Biology held its second international symposium April 23-24, 2017, in Natcher Auditorium. Learn more...

  12. Cloud Computing : Research Issues and Implications

    OpenAIRE

    Marupaka Rajenda Prasad; R. Lakshman Naik; V. Bapuji

    2013-01-01

    Cloud computing is a rapidly developing and highly promising technology. It has attracted the attention of the computing community worldwide. Cloud computing is Internet-based computing, whereby shared information, resources, and software are provided to terminals and portable devices on demand, like the energy grid. Cloud computing is the product of the combination of grid computing, distributed computing, parallel computing, and ubiquitous computing. It aims to build and forecast sophisti...

  13. International Research and Development in Systems Biology

    Science.gov (United States)

    2005-10-01

    Max Planck Institute for Molecular Genetics, Berlin, Germany (Hans Lehrach, Edda Klipp, Silke Sperling): yeast stress response and mitochondrial damage; Down syndrome; cardiac ... molgen.mpg.de; Dr. Edda Klipp, Axel Kowald, Christoph Wierling, Dr. Silke Sperling. BACKGROUND: The Max Planck Institute for Molecular Genetics was ... the cardiovascular genetics group. RESEARCH PROJECTS: Dr. Edda Klipp is the head of the kinetic modeling group. She described her group's ...

  14. Stable isotopes: essential tools in biological and medical research

    Energy Technology Data Exchange (ETDEWEB)

    Klein, P. D.; Hachey, D. L.; Kreek, M. J.; Schoeller, D. A.

    1977-01-01

    Recent developments in the use of the stable isotopes ¹³C, ¹⁵N, ¹⁷O, and ¹⁸O as tracers in research studies in the fields of biology, medicine, pharmacology, and agriculture are briefly reviewed. (CH)

  15. Social justice and research using human biological material: A ...

    African Journals Online (AJOL)

    Social justice and research using human biological material: A response to Mahomed, Nöthling-Slabbert and Pepper. ... South African Medical Journal ... In a recent article, Mahomed, Nöthling-Slabbert and Pepper proposed that research participants should be entitled to share in the profits emanating from such research ...

  16. Biological Research in Canisters (BRIC) - Light Emitting Diode (LED)

    Science.gov (United States)

    Levine, Howard G.; Caron, Allison

    2016-01-01

    The Biological Research in Canisters - LED (BRIC-LED) is a biological research system that is being designed to complement the capabilities of the existing BRIC-Petri Dish Fixation Unit (PDFU) for the Space Life and Physical Sciences (SLPS) Program. A diverse range of organisms can be supported, including plant seedlings, callus cultures, Caenorhabditis elegans, microbes, and others. In the event of a launch scrub, the entire assembly can be replaced with an identical back-up unit containing freshly loaded specimens.

  17. Connecting Biology and Organic Chemistry Introductory Laboratory Courses through a Collaborative Research Project

    Science.gov (United States)

    Boltax, Ariana L.; Armanious, Stephanie; Kosinski-Collins, Melissa S.; Pontrello, Jason K.

    2015-01-01

    Modern research often requires collaboration of experts in fields such as math, chemistry, biology, physics, and computer science to develop unique solutions to common problems. Traditional introductory undergraduate laboratory curricula in the sciences often do not emphasize connections possible between the various disciplines. We designed an…

  18. Parallel computing in genomic research: advances and applications.

    Science.gov (United States)

    Ocaña, Kary; de Oliveira, Daniel

    2015-01-01

    Today's genomic experiments have to process so-called "biological big data" that is now reaching the size of terabytes and petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analysis of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires expertise from scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article presents a systematic review of the literature surveying the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that can be considered by scientists when running their genomic experiments to benefit from parallelism techniques and HPC capabilities.
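
    The basic parallelism pattern behind many of the surveyed genomic pipelines, partitioning a large collection of sequences and mapping an analysis task over the pieces with several workers, can be sketched as below. The GC-content task, read set, and worker count are invented stand-ins for real analyses and datasets.

```python
# Embarrassingly parallel per-sequence analysis with a process pool (assumed example).
from multiprocessing import Pool

def gc_content(seq: str) -> float:
    """Per-sequence task: fraction of G/C bases (a stand-in for real analysis)."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq) if seq else 0.0

def analyse_parallel(sequences, workers=4):
    """Map the per-sequence task over all sequences using several processes."""
    with Pool(processes=workers) as pool:
        return pool.map(gc_content, sequences)

if __name__ == "__main__":
    # Hypothetical reads; a real run would stream millions from FASTQ/FASTA files.
    reads = ["ACGTACGGCC", "TTTTAAAACG", "GGGCCCGGCC", "ATATATATGC"] * 1000
    results = analyse_parallel(reads)
    print("mean GC content:", sum(results) / len(results))
```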

  19. RISE OF BIOINFORMATICS AND COMPUTATIONAL BIOLOGY IN INDIA: A LOOK THROUGH PUBLICATIONS

    Directory of Open Access Journals (Sweden)

    Anjali Srivastava

    2017-09-01

    Full Text Available Computational biology and bioinformatics have been part and parcel of biomedical research for a few decades now. However, the institutionalization of bioinformatics research took place with the establishment of Distributed Information Centres (DISCs) in reputed research institutions across various disciplines by the Department of Biotechnology (DBT), Government of India. Though this endeavor was initially focused on providing infrastructure for information technology and internet-based communication, and tools for carrying out computational biology and in-silico-assisted research in varied areas ranging from disease biology to agricultural crops, spices, veterinary science and many more, the natural outcome of establishing such facilities was new experiments with bioinformatics tools. Thus, the Biotechnology Information System (BTIS) grew into a solid movement and a large number of publications started coming out of these centres. By the end of the last century, bioinformatics had started developing into a full-fledged research subject. In the last decade, a need was felt to make a factual estimation of the results of this endeavor of the DBT, which had by then established about two hundred centres in almost all disciplines of biomedical research. In a bid to evaluate the efforts and outcomes of these centres, the BTIS Centre at CSIR-CDRI, Lucknow was entrusted with collecting and collating their publications. However, when the full data was compiled, the DBT task force felt that the study must also include non-BTIS centres, so as to expand the report and give a glimpse of bioinformatics publications from the country.

  20. Computer technology and computer programming research and strategies

    CERN Document Server

    Antonakos, James L

    2011-01-01

    Covering a broad range of new topics in computer technology and programming, this volume discusses encryption techniques, SQL generation, Web 2.0 technologies, and visual sensor networks. It also examines reconfigurable computing, video streaming, animation techniques, and more. Readers will learn about an educational tool and game to help students learn computer programming. The book also explores a new medical technology paradigm centered on wireless technology and cloud computing designed to overcome the problems of increasing health technology costs.

  1. Elastic Multi-scale Mechanisms: Computation and Biological Evolution.

    Science.gov (United States)

    Diaz Ochoa, Juan G

    2018-01-01

    Explanations based on low-level interacting elements are valuable and powerful, since they contribute to identifying the key mechanisms of biological functions. However, many dynamic systems based on low-level interacting elements with unambiguous, finite, and complete information about initial states generate future states that cannot be predicted, implying an increase of complexity and open-ended evolution. Such systems are like Turing machines, overlapping with dynamical systems that cannot halt. We argue that organisms find halting conditions by distorting these mechanisms, creating conditions for a constant creativity that drives evolution. We introduce a modulus of elasticity to measure the changes in these mechanisms in response to changes in the computed environment. We test this concept in a population of predator and prey cells with chemotactic mechanisms and demonstrate how the selection of a given mechanism depends on the entire population. We finally explore this concept in different frameworks and postulate that the identification of predictive mechanisms is only successful with a small elasticity modulus.

  2. Computer simulations for biological aging and sexual reproduction

    Directory of Open Access Journals (Sweden)

    DIETRICH STAUFFER

    2001-03-01

    Full Text Available The sexual version of the Penna model of biological aging, simulated since 1996, is compared here with alternative forms of reproduction as well as with models not involving aging. In particular we want to check how sexual forms of life could have evolved and won over earlier asexual forms hundreds of millions of years ago. This computer model is based on the mutation-accumulation theory of aging, using bit-strings to represent the genome. Its population dynamics is studied by Monte Carlo methods.
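
    The bit-string genome and Monte Carlo dynamics mentioned above can be illustrated with a stripped-down asexual Penna model: each bit of the genome is a deleterious mutation that becomes active at the corresponding age, an individual dies once the active count reaches a threshold, and survivors past the reproduction age produce mutated offspring, with a Verhulst crowding factor limiting the population. The parameter values and simplifications below are assumptions for illustration, not the sexual model simulated in the paper.

```python
# Stripped-down asexual Penna bit-string model (illustrative parameters).
import random

GENOME_BITS = 32   # one bit per year of life
T = 3              # death when this many deleterious mutations are active
R = 8              # minimum reproduction age
M = 1              # new deleterious mutations per offspring
N_MAX = 10_000     # carrying capacity used by the Verhulst factor

def step(population):
    """One Monte Carlo time step: ageing, deaths, and births."""
    survivors, births = [], []
    for genome, age in population:
        age += 1
        if age >= GENOME_BITS:
            continue                                    # died of old age
        active = bin(genome & ((1 << age) - 1)).count("1")
        if active >= T:
            continue                                    # died of mutations
        if random.random() < len(population) / N_MAX:
            continue                                    # Verhulst (crowding) death
        survivors.append((genome, age))
        if age >= R:                                    # reproduce
            child = genome
            for _ in range(M):
                child |= 1 << random.randrange(GENOME_BITS)
            births.append((child, 0))
    return survivors + births

if __name__ == "__main__":
    population = [(0, 0) for _ in range(1000)]          # mutation-free newborns
    for _ in range(200):
        population = step(population)
    print("population after 200 steps:", len(population))
```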

  3. A cyber-linked undergraduate research experience in computational biomolecular structure prediction and design.

    Science.gov (United States)

    Alford, Rebecca F; Leaver-Fay, Andrew; Gonzales, Lynda; Dolan, Erin L; Gray, Jeffrey J

    2017-12-01

    Computational biology is an interdisciplinary field, and many computational biology research projects involve distributed teams of scientists. To accomplish their work, these teams must overcome both disciplinary and geographic barriers. Introducing new training paradigms is one way to facilitate research progress in computational biology. Here, we describe a new undergraduate program in biomolecular structure prediction and design in which students conduct research at labs located at geographically-distributed institutions while remaining connected through an online community. This 10-week summer program begins with one week of training on computational biology methods development, transitions to eight weeks of research, and culminates in one week at the Rosetta annual conference. To date, two cohorts of students have participated, tackling research topics including vaccine design, enzyme design, protein-based materials, glycoprotein modeling, crowd-sourced science, RNA processing, hydrogen bond networks, and amyloid formation. Students in the program report outcomes comparable to students who participate in similar in-person programs. These outcomes include the development of a sense of community and increases in their scientific self-efficacy, scientific identity, and science values, all predictors of continuing in a science research career. Furthermore, the program attracted students from diverse backgrounds, which demonstrates the potential of this approach to broaden the participation of young scientists from backgrounds traditionally underrepresented in computational biology.

  4. A cyber-linked undergraduate research experience in computational biomolecular structure prediction and design.

    Directory of Open Access Journals (Sweden)

    Rebecca F Alford

    2017-12-01

    Full Text Available Computational biology is an interdisciplinary field, and many computational biology research projects involve distributed teams of scientists. To accomplish their work, these teams must overcome both disciplinary and geographic barriers. Introducing new training paradigms is one way to facilitate research progress in computational biology. Here, we describe a new undergraduate program in biomolecular structure prediction and design in which students conduct research at labs located at geographically-distributed institutions while remaining connected through an online community. This 10-week summer program begins with one week of training on computational biology methods development, transitions to eight weeks of research, and culminates in one week at the Rosetta annual conference. To date, two cohorts of students have participated, tackling research topics including vaccine design, enzyme design, protein-based materials, glycoprotein modeling, crowd-sourced science, RNA processing, hydrogen bond networks, and amyloid formation. Students in the program report outcomes comparable to students who participate in similar in-person programs. These outcomes include the development of a sense of community and increases in their scientific self-efficacy, scientific identity, and science values, all predictors of continuing in a science research career. Furthermore, the program attracted students from diverse backgrounds, which demonstrates the potential of this approach to broaden the participation of young scientists from backgrounds traditionally underrepresented in computational biology.

  5. The Air Force "In Silico" -- Computational Biology in 2025

    National Research Council Canada - National Science Library

    Coates, Christopher

    2007-01-01

    The biological sciences have recently experienced remarkable advances and there are now frequent claims that "we are on the advent of being able to model or simulate biological systems to the smallest, molecular detail...

  6. Parameterized algorithmics for computational social choice : nine research challenges

    NARCIS (Netherlands)

    Bredereck, R.; Chen, J.; Faliszewski, P.; Guo, J.; Niedermeier, R.; Woeginger, G.J.

    2014-01-01

    Computational Social Choice is an interdisciplinary research area involving Economics, Political Science, and Social Science on the one side, and Mathematics and Computer Science (including Artificial Intelligence and Multiagent Systems) on the other side. Typical computational problems studied in

  7. A practical workflow for making anatomical atlases for biological research.

    Science.gov (United States)

    Wan, Yong; Lewis, A Kelsey; Colasanto, Mary; van Langeveld, Mark; Kardon, Gabrielle; Hansen, Charles

    2012-01-01

    The anatomical atlas has been at the intersection of science and art for centuries. These atlases are essential to biological research, but high-quality atlases are often scarce. Recent advances in imaging technology have made high-quality 3D atlases possible. However, until now there has been a lack of practical workflows using standard tools to generate atlases from images of biological samples. With certain adaptations, CG artists' workflow and tools, traditionally used in the film industry, are practical for building high-quality biological atlases. Researchers have developed a workflow for generating a 3D anatomical atlas using accessible artists' tools. They used this workflow to build a mouse limb atlas for studying the musculoskeletal system's development. This research aims to raise the awareness of using artists' tools in scientific research and promote interdisciplinary collaborations between artists and scientists. This video (http://youtu.be/g61C-nia9ms) demonstrates a workflow for creating an anatomical atlas.

  8. Quarterly report of Biological and Medical Research Division, April 1955

    Energy Technology Data Exchange (ETDEWEB)

    Brues, A.M.

    1955-04-01

    This report is a compilation of 48 investigator-prepared summaries of recent progress in individual research programs of the Biology and Medical Division of the Argonne National Laboratory for the quarterly period ending April 1955. Individual reports are about 3-6 pages in length and often contain research data.

  9. How computational models can help unlock biological systems.

    Science.gov (United States)

    Brodland, G Wayne

    2015-12-01

    With computational models playing an ever-increasing role in the advancement of science, it is important that researchers understand what it means to model something; recognize the implications of the conceptual, mathematical and algorithmic steps of model construction; and comprehend what models can and cannot do. Here, we use examples to show that models can serve a wide variety of roles, including hypothesis testing, generating new insights, deepening understanding, suggesting and interpreting experiments, tracing chains of causation, doing sensitivity analyses, integrating knowledge, and inspiring new approaches. We show that models can bring together information of different kinds and do so across a range of length scales, as they do in multi-scale, multi-faceted embryogenesis models, some of which connect gene expression, the cytoskeleton, cell properties, tissue mechanics, morphogenetic movements and phenotypes. Models cannot replace experiments nor can they prove that particular mechanisms are at work in a given situation. But they can demonstrate whether or not a proposed mechanism is sufficient to produce an observed phenomenon. Although the examples in this article are taken primarily from the field of embryo mechanics, most of the arguments and discussion are applicable to any form of computational modelling. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.

  10. Bringing the physical sciences into your cell biology research.

    Science.gov (United States)

    Robinson, Douglas N; Iglesias, Pablo A

    2012-11-01

    Historically, much of biology was studied by physicists and mathematicians. With the advent of modern molecular biology, a wave of researchers became trained in a new scientific discipline filled with the language of genes, mutants, and the central dogma. These new molecular approaches have provided volumes of information on biomolecules and molecular pathways from the cellular to the organismal level. The challenge now is to determine how this seemingly endless list of components works together to promote the healthy function of complex living systems. This effort requires an interdisciplinary approach by investigators from both the biological and the physical sciences.

  11. Interactomes to Biological Phase Space: a call to begin thinking at a new level in computational biology.

    Energy Technology Data Exchange (ETDEWEB)

    Davidson, George S.; Brown, William Michael

    2007-09-01

    Techniques for high-throughput determination of interactomes, together with high-resolution protein colocalization maps within organelles and through membranes, will soon create a vast resource. With these data, biological descriptions, akin to the high-dimensional phase spaces familiar to physicists, will become possible. These descriptions will capture sufficient information to make possible realistic, system-level models of cells. The descriptions and the computational models they enable will require powerful computing techniques. This report is offered as a call to the computational biology community to begin thinking at this scale and as a challenge to develop the required algorithms and codes to make use of the new data.

  12. The NASA computer science research program plan

    Science.gov (United States)

    1983-01-01

    A taxonomy of computer science is included, and the state of the art of each of the major computer science categories is summarized. A functional breakdown of NASA programs under Aeronautics R and D, space R and T, and institutional support is also included. These areas were assessed against the computer science categories. Concurrent processing, highly reliable computing, and information management are identified.

  13. Biological and Physical Space Research Laboratory 2002 Science Review

    Science.gov (United States)

    Curreri, P. A. (Editor); Robinson, M. B. (Editor); Murphy, K. L. (Editor)

    2003-01-01

    With the International Space Station Program approaching core complete, our NASA Headquarters sponsor, the new Code U Enterprise, Biological and Physical Research, is shifting its research emphasis from purely fundamental microgravity and biological sciences to strategic research aimed at enabling human missions beyond Earth orbit. Although we anticipate supporting microgravity research on the ISS for some time to come, our laboratory has been vigorously engaged in developing these new strategic research areas. This Technical Memorandum documents the internal science research at our laboratory as presented in a review to Dr. Ann Whitaker, MSFC Science Director, in July 2002. These presentations have been revised and updated as appropriate for this report. It provides a snapshot of the internal science capability of our laboratory as an aid to other NASA organizations and the external scientific community.

  14. The computational future for climate change research

    International Nuclear Information System (INIS)

    Washington, Warren M

    2005-01-01

    The development of climate models has a long history, starting with the building of atmospheric models and later ocean models. The early researchers were very aware of the goal of building climate models which could integrate our knowledge of complex physical interactions between atmospheric, land-vegetation, hydrology, ocean, cryospheric processes, and sea ice. The transition from climate models to earth system models is already underway with the coupling of active biochemical cycles. Progress is limited by present computer capability, which is needed for increasingly complex and higher-resolution climate model versions. It would be a mistake to make models too complex or too high resolution. Arriving at a 'feasible' and useful model is the challenge for the climate model community. Some of the climate change history, scientific successes, and difficulties encountered with supercomputers will be presented.

  15. Computer network for experimental research using ISDN

    International Nuclear Information System (INIS)

    Ida, Katsumi; Nakanishi, Hideya

    1997-01-01

    This report describes the development of a computer network that uses the Integrated Services Digital Network (ISDN) for real-time analysis in experimental plasma physics and nuclear fusion research. The communication speed, 64/128 kbps (INS64) or 1.5 Mbps (INS1500) per connection, is independent of how busy the network is. When INS1500 is used, the communication speed, which is proportional to the public telephone connection fee, can be dynamically varied from 64 kbps to 1472 kbps (depending on how much data are being transferred) using the Bandwidth-on-Demand (BOD) function in the ISDN router. On-demand dial-up and time-out disconnection reduce the public telephone connection fee by 10%-97%. (author)

  16. Proceedings of the 2013 MidSouth Computational Biology and Bioinformatics Society (MCBIOS) Conference.

    Science.gov (United States)

    Wren, Jonathan D; Dozmorov, Mikhail G; Burian, Dennis; Kaundal, Rakesh; Perkins, Andy; Perkins, Ed; Kupfer, Doris M; Springer, Gordon K

    2013-01-01

    The tenth annual conference of the MidSouth Computational Biology and Bioinformatics Society (MCBIOS 2013), "The 10th Anniversary in a Decade of Change: Discovery in a Sea of Data", took place at the Stoney Creek Inn & Conference Center in Columbia, Missouri on April 5-6, 2013. This year's Conference Chairs were Gordon Springer and Chi-Ren Shyu from the University of Missouri and Edward Perkins from the US Army Corps of Engineers Engineering Research and Development Center, who is also the current MCBIOS President (2012-3). There were 151 registrants and a total of 111 abstracts (51 oral presentations and 60 poster session abstracts).

  17. NASA Space Biology Plant Research for 2010-2020

    Science.gov (United States)

    Levine, H. G.; Tomko, D. L.; Porterfield, D. M.

    2012-01-01

    The U.S. National Research Council (NRC) recently published "Recapturing a Future for Space Exploration: Life and Physical Sciences Research for a New Era" (http://www.nap.edu/catalog.php?record_id=13048), and NASA completed a Space Biology Science Plan to develop a strategy for implementing its recommendations (http://www.nasa.gov/exploration/library/esmd documents.html). The most important recommendations of the NRC report on plant biology in space were that NASA should: (1) investigate the roles of microbial-plant systems in long-term bioregenerative life support systems, and (2) establish a robust spaceflight program of research analyzing plant growth and physiological responses to the multiple stimuli encountered in spaceflight environments. These efforts should take advantage of recently emerged analytical technologies (genomics, transcriptomics, proteomics, metabolomics) and apply modern cellular and molecular approaches in the development of a vigorous flight-based and ground-based research program. This talk will describe NASA's strategy and plans for implementing these NRC Plant Space Biology recommendations. New research capabilities for Plant Biology, optimized by providing state-of-the-art automated technology and analytical techniques to maximize scientific return, will be described. Flight experiments will use the most appropriate platform to achieve science results (e.g., ISS, free flyers, sub-orbital flights) and NASA will work closely with its international partners and other U.S. agencies to achieve its objectives. One of NASA's highest priorities in Space Biology is the development of research capabilities for use on the International Space Station and other flight platforms for studying multiple generations of large plants. NASA will issue recurring NASA Research Announcements (NRAs) that include a rapid turn-around model to more fully engage the biology community in designing experiments to respond to the NRC recommendations. In doing so, NASA

  18. Activities in biological radiation research at the AGF

    International Nuclear Information System (INIS)

    1984-01-01

    The AGF is working on a wide spectrum of biological radiation research, with the different scientific disciplines contributing different methodologies to long-term research projects. The following fields are studied: 1. Molecular and cellular modes of action of radiation. 2. Detection and characterisation of biological radiation damage, especially in humans. 3. Medical applications of radiation effects. 4. Concepts and methods of radiation protection. The studies will lead to suggestions for radiation protection and improved radiotherapy. They may also contribute to the development of environmental protection strategies. (orig./MG) [de

  19. The value of closed-circuit rebreathers for biological research

    Science.gov (United States)

    Pyle, Richard L.; Lobel, Phillip S.; Tomoleoni, Joseph

    2016-01-01

    Closed-circuit rebreathers have been used for underwater biological research since the late 1960s, but have only started to gain broader application within scientific diving organizations within the past two decades. Rebreathers offer certain specific advantages for such research, especially for research involving behavior and surveys that depend on unobtrusive observers or for a stealthy approach to wildlife for capture and tagging, research that benefits from extended durations underwater, and operations requiring access to relatively deep (>50 m) environments (especially in remote locations). Although many institutions have been slow to adopt rebreather technology within their diving programs, recent developments in rebreather technology that improve safety, standardize training requirements, and reduce costs of equipment and maintenance, will likely result in a trend of increasing utilization of rebreathers for underwater biological research.

  20. New computing techniques in physics research

    International Nuclear Information System (INIS)

    Perret-Gallix, D.; Wojcik, W.

    1990-01-01

    These proceedings relate, in a pragmatic way, the use of methods and techniques from software engineering and artificial intelligence in high energy and nuclear physics. Such fundamental research can only be done through the design, building and running of equipment and systems that are among the most complex ever undertaken by mankind. The use of these new methods is mandatory in such an environment. However, their proper integration into these real applications raises some unsolved problems. Their solution, beyond the research field, will lead to a better understanding of some fundamental aspects of software engineering and artificial intelligence. Here is a sample of subjects covered in the proceedings: software engineering in a multi-user, multi-version, multi-system environment; project management; software validation and quality control; data structures and management; object-oriented languages; multi-language applications; interactive data analysis; expert systems for diagnosis; expert systems for real-time applications; neural networks for pattern recognition; and symbolic manipulation for automatic computation of complex processes.

  1. Research on Key Technologies of Cloud Computing

    Science.gov (United States)

    Zhang, Shufen; Yan, Hongcan; Chen, Xuebin

    With the development of multi-core processors, virtualization, distributed storage, broadband Internet and automatic management, a new computing mode named cloud computing has emerged. It distributes computation tasks over a resource pool consisting of a large number of computers, so that application systems can obtain computing power, storage space and software services on demand. It can concentrate all the computing resources and manage them automatically through software, without human intervention. This frees application providers from tedious details and lets them concentrate on their business, which is advantageous to innovation and reduces cost. The ultimate goal of cloud computing is to provide computation, services and applications as a public facility, so that people can use computing resources just as they use water, electricity, gas and the telephone. Currently, the understanding of cloud computing is still developing and changing, and cloud computing has no unanimous definition yet. This paper describes the three main service forms of cloud computing, SaaS, PaaS and IaaS, compares the definitions of cloud computing given by Google, Amazon, IBM and other companies, summarizes the basic characteristics of cloud computing, and emphasizes key technologies such as data storage, data management, virtualization and the programming model.

  2. Usage of Cloud Computing Simulators and Future Systems For Computational Research

    OpenAIRE

    Lakshminarayanan, Ramkumar; Ramalingam, Rajasekar

    2016-01-01

    Cloud Computing is Internet-based computing, whereby shared resources, software and information are provided to computers and devices on demand, like the electricity grid. Currently, IaaS (Infrastructure as a Service), PaaS (Platform as a Service) and SaaS (Software as a Service) are used as business models for Cloud Computing. Nowadays, the adoption and deployment of Cloud Computing is increasing in various domains, forcing researchers to conduct research in the area of Cloud Computing ...

  3. Computer Presentation Programs and Teaching Research Methodologies

    OpenAIRE

    Motamedi, Vahid

    2015-01-01

    Supplementing traditional chalk-and-board instruction with computer delivery has been viewed positively by students, who have reported increased understanding and more interaction with the instructor when computer presentations are used in the classroom. Some problems contributing to student errors while taking class notes, such as the transcription of numbers on the board and the instructor's handwriting, can be resolved through careful construction of computer presentations. The use of computer pres...

  4. Spacecraft computer technology at Southwest Research Institute

    Science.gov (United States)

    Shirley, D. J.

    1993-01-01

    Southwest Research Institute (SwRI) has developed and delivered spacecraft computers for a number of different near-Earth-orbit spacecraft including shuttle experiments and SDIO free-flyer experiments. We describe the evolution of the basic SwRI spacecraft computer design from those weighing in at 20 to 25 lb and using 20 to 30 W to newer models weighing less than 5 lb and using only about 5 W, yet delivering twice the processing throughput. Because of their reduced size, weight, and power, these newer designs are especially applicable to planetary instrument requirements. The basis of our design evolution has been the availability of more powerful processor chip sets and the development of higher density packaging technology, coupled with more aggressive design strategies in incorporating high-density FPGA technology and use of high-density memory chips. In addition to reductions in size, weight, and power, the newer designs also address the necessity of survival in the harsh radiation environment of space. Spurred by participation in such programs as MSTI, LACE, RME, Delta 181, Delta Star, and RADARSAT, our designs have evolved in response to program demands to be small, low-powered units, radiation tolerant enough to be suitable for both Earth-orbit microsats and for planetary instruments. Present designs already include MIL-STD-1750 and Multi-Chip Module (MCM) technology with near-term plans to include RISC processors and higher-density MCM's. Long term plans include development of whole-core processors on one or two MCM's.

  5. Bibliographical review on the teaching of Biology and research

    Directory of Open Access Journals (Sweden)

    Mª Luz Rodríguez Palmero

    2000-09-01

    Full Text Available This review complements an earlier one by the same author, from 1997, on the role of understanding the cell concept in the learning of Biology. In addition, some general papers on science education that provide a better understanding of the research approaches used to investigate this topic have been included. The reviewed papers have been organized into categories according to the object of study, the relevance assigned to the cell concept, and the framework of analysis. The review shows that the cell concept is very important in biological conceptualization; however, it also shows the need for additional research on this matter, from theoretical frameworks that pay more attention to the psychological level, in order to provide guidance for improving the teaching and learning of biological content that presupposes an understanding of living beings.

  6. Dispensing processes impact apparent biological activity as determined by computational and statistical analyses.

    Directory of Open Access Journals (Sweden)

    Sean Ekins

    Full Text Available Dispensing and dilution processes may profoundly influence estimates of biological activity of compounds. Published data show Ephrin type-B receptor 4 IC50 values obtained via tip-based serial dilution and dispensing versus acoustic dispensing with direct dilution differ by orders of magnitude with no correlation or ranking of datasets. We generated computational 3D pharmacophores based on data derived by both acoustic and tip-based transfer. The computed pharmacophores differ significantly depending upon dispensing and dilution methods. The acoustic dispensing-derived pharmacophore correctly identified active compounds in a subsequent test set where the tip-based method failed. Data from acoustic dispensing generates a pharmacophore containing two hydrophobic features, one hydrogen bond donor and one hydrogen bond acceptor. This is consistent with X-ray crystallography studies of ligand-protein interactions and automatically generated pharmacophores derived from this structural data. In contrast, the tip-based data suggest a pharmacophore with two hydrogen bond acceptors, one hydrogen bond donor and no hydrophobic features. This pharmacophore is inconsistent with the X-ray crystallographic studies and automatically generated pharmacophores. In short, traditional dispensing processes are another important source of error in high-throughput screening that impacts computational and statistical analyses. These findings have far-reaching implications in biological research.
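
    The comparison described above amounts to asking whether two dispensing methods rank the same compounds in the same order. A minimal sketch of such a rank-correlation check is given below; the IC50 values are invented placeholders, not the published Ephrin type-B receptor 4 data.

```python
import numpy as np
from scipy import stats

# Hypothetical IC50 values (micromolar) for the same five compounds,
# measured with two dispensing methods; values are illustrative only.
tip_based = np.array([12.0, 3.5, 48.0, 0.9, 22.0])
acoustic  = np.array([0.15, 0.02, 1.10, 0.01, 0.40])

# Potency comparisons are usually done on a log scale (pIC50).
log_tip = -np.log10(tip_based * 1e-6)
log_acoustic = -np.log10(acoustic * 1e-6)

# Spearman rank correlation: a low rho would indicate that the two
# methods do not even agree on the ranking of compounds.
rho, p_value = stats.spearmanr(log_tip, log_acoustic)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```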

  7. 16th International Conference on Hybrid Intelligent Systems and the 8th World Congress on Nature and Biologically Inspired Computing

    CERN Document Server

    Haqiq, Abdelkrim; Alimi, Adel; Mezzour, Ghita; Rokbani, Nizar; Muda, Azah

    2017-01-01

    This book presents the latest research in hybrid intelligent systems. It includes 57 carefully selected papers from the 16th International Conference on Hybrid Intelligent Systems (HIS 2016) and the 8th World Congress on Nature and Biologically Inspired Computing (NaBIC 2016), held on November 21–23, 2016 in Marrakech, Morocco. HIS - NaBIC 2016 was jointly organized by the Machine Intelligence Research Labs (MIR Labs), USA; Hassan 1st University, Settat, Morocco and University of Sfax, Tunisia. Hybridization of intelligent systems is a promising research field in modern artificial/computational intelligence and is concerned with the development of the next generation of intelligent systems. The conference’s main aim is to inspire further exploration of the intriguing potential of hybrid intelligent systems and bio-inspired computing. As such, the book is a valuable resource for practicing engineers/scientists and researchers working in the field of computational intelligence and artificial intelligence.

  8. Parallel metaheuristics in computational biology: an asynchronous cooperative enhanced scatter search method

    OpenAIRE

    Penas, David R.; González, Patricia; Egea, José A.; Banga, Julio R.; Doallo, Ramón

    2015-01-01

    Metaheuristics are gaining increased attention as efficient solvers for hard global optimization problems arising in bioinformatics and computational systems biology. Scatter Search (SS) is one of the recent outstanding algorithms in that class. However, its application to very hard problems, like those considering parameter estimation in dynamic models of systems biology, still results in excessive computation times. In order to reduce the computational cost of the SS and improve its success...

  9. Norwegian computers in European energy research project

    International Nuclear Information System (INIS)

    Anon.

    1979-01-01

    16 NORD computers have been ordered for the JET data acquisition and storage system. The computers will be arranged in a 'double star' configuration developed by CERN. Two control consoles each have their own computer. All computers for communication, control, diagnostics, consoles and testing are NORD-100s, while the computer for data storage and analysis is a NORD-500. The operating system is SINTRAN; the CAMAC Serial Highway, with fibre optics, is to be used for long communications paths. The programming languages FORTRAN, NODAL, NORD PL, PASCAL and BASIC may be used. The JET project and TOKAMAK-type machines are briefly described. (JIW)

  10. Effectiveness of computer-assisted learning in biology teaching in primary schools in Serbia

    Directory of Open Access Journals (Sweden)

    Županec Vera

    2013-01-01

    Full Text Available The paper analyzes the comparative effectiveness of Computer-Assisted Learning (CAL) and traditional teaching of biology with primary school pupils. A stratified random sample consisted of 214 pupils from two primary schools in Novi Sad. The pupils in the experimental group learned the biology content (Chordates) using CAL, whereas the pupils in the control group learned the same content using traditional teaching. The research design was a pretest-posttest equivalent-groups design. All instruments (the pretest, the posttest and the retest) contained questions belonging to three cognitive domains: knowing, applying, and reasoning. Arithmetic means, standard deviations, and standard errors were computed using the software package SPSS 14.0, and t-tests were used to establish the differences between these statistical indicators. The analysis of the results of the posttest and the retest showed that the pupils from the CAL group achieved significantly higher quantity and quality of knowledge in all three cognitive domains than the pupils from the traditional group. The results achieved by the pupils from the CAL group suggest that individual CAL should be more present in biology teaching in primary schools, with the aim of raising the quality of biology education. [Project of the Ministry of Science of the Republic of Serbia, No. 179010: Quality of Educational System in Serbia in the European Perspective]
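
    A minimal sketch of the kind of group comparison described above (an independent-samples t-test on posttest scores) is shown below; the scores are fabricated placeholders and the analysis is simplified relative to the study's full pretest-posttest-retest design.

```python
import numpy as np
from scipy import stats

# Hypothetical posttest scores for the two groups (placeholder values only).
cal_group = np.array([78, 85, 90, 72, 88, 81, 79, 93])
traditional_group = np.array([70, 74, 68, 77, 72, 65, 71, 69])

# Independent-samples t-test on the group means.
t_stat, p_value = stats.ttest_ind(cal_group, traditional_group)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
print("mean difference =", cal_group.mean() - traditional_group.mean())
```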

  11. Biomedical Research Experiences for Biology Majors at a Small College

    Science.gov (United States)

    Stover, Shawn K.; Mabry, Michelle L.

    2010-01-01

    A program-level assessment of the biology curriculum at a small liberal arts college validates a previous study demonstrating success in achieving learning outcomes related to content knowledge and communication skills. Furthermore, research opportunities have been provided to complement pedagogical strategies and give students a more complete…

  12. Biologically Enhanced Carbon Sequestration: Research Needs and Opportunities

    Energy Technology Data Exchange (ETDEWEB)

    Oldenburg, Curtis; Oldenburg, Curtis M.; Torn, Margaret S.

    2008-03-21

    Fossil fuel combustion, deforestation, and biomass burning are the dominant contributors to increasing atmospheric carbon dioxide (CO{sub 2}) concentrations and global warming. Many approaches to mitigating CO{sub 2} emissions are being pursued, and among the most promising are terrestrial and geologic carbon sequestration. Recent advances in ecology and microbial biology offer promising new possibilities for enhancing terrestrial and geologic carbon sequestration. A workshop was held October 29, 2007, at Lawrence Berkeley National Laboratory (LBNL) on Biologically Enhanced Carbon Sequestration (BECS). The workshop participants (approximately 30 scientists from California, Illinois, Oregon, Montana, and New Mexico) developed a prioritized list of research needed to make progress in the development of biological enhancements to improve terrestrial and geologic carbon sequestration. The workshop participants also identified a number of areas of supporting science that are critical to making progress in the fundamental research areas. The purpose of this position paper is to summarize and elaborate upon the findings of the workshop. The paper considers terrestrial and geologic carbon sequestration separately. First, we present a summary in outline form of the research roadmaps for terrestrial and geologic BECS. This outline is elaborated upon in the narrative sections that follow. The narrative sections start with the focused research priorities in each area followed by critical supporting science for biological enhancements as prioritized during the workshop. Finally, Table 1 summarizes the potential significance or 'materiality' of advances in these areas for reducing net greenhouse gas emissions.

  13. Biological field stations: research legacies and sites for serendipity

    Science.gov (United States)

    William K. Michener; Keith L. Bildstein; Arthur McKee; Robert R. Parmenter; William W. Hargrove; Deedra McClearn; Mark Stromberg

    2009-01-01

    Biological field stations are distributed throughout North America, capturing much of the ecological variability present at the continental scale and encompassing many unique habitats. In addition to their role in supporting research and education, field stations offer legacies of data, specimens, and accumulated knowledge. Such legacies often provide the only...

  14. Researchers study decontamination of chemical, biological warfare agents

    OpenAIRE

    Trulove, Susan

    2007-01-01

    The U.S. Army Research Office has awarded Virginia Tech a $680,000 grant over two years to build an instrument that can be used to study the chemistry of gases that will decompose both chemical and biological warfare agents on surfaces.

  15. Computational tools for high-throughput discovery in biology

    OpenAIRE

    Jones, Neil Christopher

    2007-01-01

    High throughput data acquisition technology has inarguably transformed the landscape of the life sciences, in part by making possible---and necessary---the computational disciplines of bioinformatics and biomedical informatics. These fields focus primarily on developing tools for analyzing data and generating hypotheses about objects in nature, and it is in this context that we address three pressing problems in the fields of the computational life sciences which each require computing capaci...

  16. COMPUTATIONAL MODELING AND SIMULATION IN BIOLOGY TEACHING: A MINIMALLY EXPLORED FIELD OF STUDY WITH A LOT OF POTENTIAL

    Directory of Open Access Journals (Sweden)

    Sonia López

    2016-09-01

    Full Text Available This study is part of a research project that aims to characterize the epistemological, psychological and didactic presuppositions of science teachers (Biology, Physics, Chemistry) who implement Computational Modeling and Simulation (CMS) activities as part of their teaching practice. We present here a synthesis of a literature review on the subject, showing how in the last two decades this form of computer use for science teaching has boomed in disciplines such as Physics and Chemistry, but to a lesser degree in Biology. Additionally, in the works that dwell on the use of CMS in Biology, we identified a lack of theoretical bases supporting their epistemological, psychological and/or didactic postures. This finding has significant implications for research and teacher education in Science Education.

  17. Computer-based modeling in exact sciences research - III ...

    African Journals Online (AJOL)

    Molecular modeling techniques have found wide application in the study of the biological sciences and other exact-science fields such as agriculture, mathematics and computer science. In this write-up, a list of computer programs for predicting, for instance, the structure of proteins has been provided. Discussions on ...

  18. Affective computing: A reverence for a century of research

    NARCIS (Netherlands)

    Broek, E.L. van den

    2012-01-01

    To bring affective computing a leap forward, it is best to start with a step back. A century of research has been conducted on topics that are crucial for affective computing. Understanding this vast body of research will accelerate progress on affective computing. Therefore, this article

  19. Affective computing: a reverence for a century of research

    NARCIS (Netherlands)

    van den Broek, Egon; Esposito, Anna; Esposito, Antonietta M.; Vinciarelli, Alessandro; Hoffmann, Rüdiger; Müller, Vincent C.

    2012-01-01

    To bring affective computing a leap forward, it is best to start with a step back. A century of research has been conducted on topics that are crucial for affective computing. Understanding this vast body of research will accelerate progress on affective computing. Therefore, this article

  20. Current dichotomy between traditional molecular biological and omic research in cancer biology and pharmacology.

    Science.gov (United States)

    Reinhold, William C

    2015-12-10

    There is currently a split within the cancer research community between traditional molecular biological, hypothesis-driven research and the more recent "omic" forms of research. While the molecular biological approach employs the tried and true single alteration-single response formulations of experimentation, the omic approach employs broad-based assay or sample collection methods that generate large volumes of data. How to integrate the benefits of these two approaches in an efficient and productive fashion remains an outstanding issue. Ideally, one would merge the understandability, exactness, simplicity, and testability of the molecular biological approach with the larger amounts of data, simultaneous consideration of multiple alterations, consideration of both genes of known interest and novel genes, cross-sample comparisons among cell lines and patient samples, and consideration of directed questions alongside exposure to the novel provided by the omic approach. While at the current time integration of the two disciplines remains problematic, attempts to do so are ongoing and will be necessary for understanding the large cell line screens, including the Developmental Therapeutics Program's NCI-60, the Broad Institute's Cancer Cell Line Encyclopedia, and the Wellcome Trust Sanger Institute's Cancer Genome Project, as well as The Cancer Genome Atlas clinical samples project. Going forward, there is significant benefit to be had from the integration of the molecular biological and the omic forms of research, with the desired goal being improved translational understanding and application.

  1. Applications of computer modeling to fusion research

    International Nuclear Information System (INIS)

    Dawson, J.M.

    1989-01-01

    Progress achieved during this report period is presented on the following topics: development and application of gyrokinetic particle codes to tokamak transport; development of techniques to take advantage of parallel computers; modeling of dynamo and bootstrap current drive; and, in general, maintenance of our broad-based program in basic plasma physics and computer modeling.

  2. pClone: Synthetic Biology Tool Makes Promoter Research Accessible to Beginning Biology Students

    Science.gov (United States)

    Eckdahl, Todd; Cronk, Brian; Andresen, Corinne; Frederick, Paul; Huckuntod, Samantha; Shinneman, Claire; Wacker, Annie; Yuan, Jason

    2014-01-01

    The Vision and Change report recommended genuine research experiences for undergraduate biology students. Authentic research improves science education, increases the number of scientifically literate citizens, and encourages students to pursue research. Synthetic biology is well suited for undergraduate research and is a growing area of science. We developed a laboratory module called pClone that empowers students to use advances in molecular cloning methods to discover new promoters for use by synthetic biologists. Our educational goals are consistent with Vision and Change and emphasize core concepts and competencies. pClone is a family of three plasmids that students use to clone a new transcriptional promoter or mutate a canonical promoter and measure promoter activity in Escherichia coli. We also developed the Registry of Functional Promoters, an open-access database of student promoter research results. Using pre- and posttests, we measured significant learning gains among students using pClone in introductory biology and genetics classes. Student posttest scores were significantly better than scores of students who did not use pClone. pClone is an easy and affordable mechanism for large-enrollment labs to meet the high standards of Vision and Change. PMID:26086659

  3. Invited Review Article: Advanced light microscopy for biological space research

    Science.gov (United States)

    De Vos, Winnok H.; Beghuin, Didier; Schwarz, Christian J.; Jones, David B.; van Loon, Jack J. W. A.; Bereiter-Hahn, Juergen; Stelzer, Ernst H. K.

    2014-10-01

    As commercial space flights have become feasible and long-term extraterrestrial missions are planned, it is imperative that the impact of space travel and the space environment on human physiology be thoroughly characterized. Scrutinizing the effects of potentially detrimental factors such as ionizing radiation and microgravity at the cellular and tissue level demands adequate visualization technology. Advanced light microscopy (ALM) is the leading tool for non-destructive structural and functional investigation of static as well as dynamic biological systems. In recent years, technological developments and advances in photochemistry and genetic engineering have boosted all aspects of resolution, readout and throughput, rendering ALM ideally suited for biological space research. While various microscopy-based studies have addressed cellular response to space-related environmental stressors, biological endpoints have typically been determined only after the mission, leaving an experimental gap that is prone to bias results. An on-board, real-time microscopical monitoring device can bridge this gap. Breadboards and even fully operational microscope setups have been conceived, but they need to be rendered more compact and versatile. Most importantly, they must allow addressing the impact of gravity, or the lack thereof, on physiologically relevant biological systems in space and in ground-based simulations. In order to delineate the essential functionalities for such a system, we have reviewed the pending questions in space science, the relevant biological model systems, and the state-of-the art in ALM. Based on a rigorous trade-off, in which we recognize the relevance of multi-cellular systems and the cellular microenvironment, we propose a compact, but flexible concept for space-related cell biological research that is based on light sheet microscopy.

  4. Invited Review Article: Advanced light microscopy for biological space research

    International Nuclear Information System (INIS)

    De Vos, Winnok H.; Beghuin, Didier; Schwarz, Christian J.; Jones, David B.; Loon, Jack J. W. A. van; Bereiter-Hahn, Juergen; Stelzer, Ernst H. K.

    2014-01-01

    As commercial space flights have become feasible and long-term extraterrestrial missions are planned, it is imperative that the impact of space travel and the space environment on human physiology be thoroughly characterized. Scrutinizing the effects of potentially detrimental factors such as ionizing radiation and microgravity at the cellular and tissue level demands adequate visualization technology. Advanced light microscopy (ALM) is the leading tool for non-destructive structural and functional investigation of static as well as dynamic biological systems. In recent years, technological developments and advances in photochemistry and genetic engineering have boosted all aspects of resolution, readout and throughput, rendering ALM ideally suited for biological space research. While various microscopy-based studies have addressed cellular response to space-related environmental stressors, biological endpoints have typically been determined only after the mission, leaving an experimental gap that is prone to bias results. An on-board, real-time microscopical monitoring device can bridge this gap. Breadboards and even fully operational microscope setups have been conceived, but they need to be rendered more compact and versatile. Most importantly, they must allow addressing the impact of gravity, or the lack thereof, on physiologically relevant biological systems in space and in ground-based simulations. In order to delineate the essential functionalities for such a system, we have reviewed the pending questions in space science, the relevant biological model systems, and the state-of-the art in ALM. Based on a rigorous trade-off, in which we recognize the relevance of multi-cellular systems and the cellular microenvironment, we propose a compact, but flexible concept for space-related cell biological research that is based on light sheet microscopy

  5. Invited Review Article: Advanced light microscopy for biological space research

    Energy Technology Data Exchange (ETDEWEB)

    De Vos, Winnok H., E-mail: winnok.devos@uantwerpen.be [Laboratory of Cell Biology and Histology, Department of Veterinary Sciences, University of Antwerp, Antwerp (Belgium); Cell Systems and Imaging Research Group, Department of Molecular Biotechnology, Ghent University, Ghent (Belgium); Beghuin, Didier [Lambda-X, Nivelles (Belgium); Schwarz, Christian J. [European Space Agency (ESA), ESTEC, TEC-MMG, Noordwijk (Netherlands); Jones, David B. [Institute for Experimental Orthopaedics and Biomechanics, Philipps University, Marburg (Germany); Loon, Jack J. W. A. van [Department of Oral and Maxillofacial Surgery/Oral Pathology, VU University Medical Center and Department of Oral Cell Biology, Academic Centre for Dentistry Amsterdam, Amsterdam (Netherlands); Bereiter-Hahn, Juergen; Stelzer, Ernst H. K. [Physical Biology, BMLS (FB15, IZN), Goethe University, Frankfurt am Main (Germany)

    2014-10-15

    As commercial space flights have become feasible and long-term extraterrestrial missions are planned, it is imperative that the impact of space travel and the space environment on human physiology be thoroughly characterized. Scrutinizing the effects of potentially detrimental factors such as ionizing radiation and microgravity at the cellular and tissue level demands adequate visualization technology. Advanced light microscopy (ALM) is the leading tool for non-destructive structural and functional investigation of static as well as dynamic biological systems. In recent years, technological developments and advances in photochemistry and genetic engineering have boosted all aspects of resolution, readout and throughput, rendering ALM ideally suited for biological space research. While various microscopy-based studies have addressed cellular response to space-related environmental stressors, biological endpoints have typically been determined only after the mission, leaving an experimental gap that is prone to bias results. An on-board, real-time microscopical monitoring device can bridge this gap. Breadboards and even fully operational microscope setups have been conceived, but they need to be rendered more compact and versatile. Most importantly, they must allow addressing the impact of gravity, or the lack thereof, on physiologically relevant biological systems in space and in ground-based simulations. In order to delineate the essential functionalities for such a system, we have reviewed the pending questions in space science, the relevant biological model systems, and the state-of-the art in ALM. Based on a rigorous trade-off, in which we recognize the relevance of multi-cellular systems and the cellular microenvironment, we propose a compact, but flexible concept for space-related cell biological research that is based on light sheet microscopy.

  6. Research in Applied Mathematics, Fluid Mechanics and Computer Science

    Science.gov (United States)

    1999-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1998 through March 31, 1999.

  7. [Research activities in applied mathematics, fluid mechanics, and computer science

    Science.gov (United States)

    1995-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period April 1, 1995 through September 30, 1995.

  8. A Community-Building Framework for Collaborative Research Coordination across the Education and Biology Research Disciplines

    Science.gov (United States)

    Pelaez, Nancy; Anderson, Trevor R.; Gardner, Stephanie M.; Yin, Yue; Abraham, Joel K.; Barlett, Edward L.; Gormally, Cara; Hurney, Carol A.; Long, Tammy M.; Newman, Dina L.; Sirum, Karen; Stevens, Michael T.

    2018-01-01

    Since 2009, the U.S. National Science Foundation Directorate for Biological Sciences has funded Research Coordination Networks (RCN) aimed at collaborative efforts to improve participation, learning, and assessment in undergraduate biology education (UBE). RCN-UBE projects focus on coordination and communication among scientists and educators who…

  9. 16 NORD computers for Europeen fusion research

    International Nuclear Information System (INIS)

    Anon.

    1979-01-01

    16 computers of the NORD type, manufactured by Nordata's French subsidiary, have been ordered for the JET data acquisition and storage system. 15 of the computers are of the NORD-100 model and the 16th is a NORD-500. The computers are to be arranged in a 'double star' configuration, developed by CERN. The CAMAC Serial Highway is to be used for long communications paths, using fibre optics. The operating system is SINTRAN, and the programming languages FORTRAN, NODAL, NORD PL, PASCAL and BASIC may be used. (JIW)

  10. The Human Genome Project: Biology, Computers, and Privacy.

    Science.gov (United States)

    Cutter, Mary Ann G.; Drexler, Edward; Gottesman, Kay S.; Goulding, Philip G.; McCullough, Laurence B.; McInerney, Joseph D.; Micikas, Lynda B.; Mural, Richard J.; Murray, Jeffrey C.; Zola, John

    This module, for high school teachers, is the second of two modules about the Human Genome Project (HGP) produced by the Biological Sciences Curriculum Study (BSCS). The first section of this module provides background information for teachers about the structure and objectives of the HGP, aspects of the science and technology that underlie the…

  11. Computer Presentation Programs and Teaching Research Methodologies

    Directory of Open Access Journals (Sweden)

    Vahid Motamedi

    2015-05-01

    Full Text Available Supplementing traditional chalk-and-board instruction with computer delivery has been viewed positively by students, who have reported increased understanding and more interaction with the instructor when computer presentations are used in the classroom. Some problems contributing to student errors while taking class notes, such as the transcription of numbers on the board and the instructor's handwriting, can be resolved through careful construction of computer presentations. The use of computer presentation programs promises to increase the effectiveness of learning by making content more readily available, by reducing the cost and effort of producing quality content, and by allowing content to be more easily shared. This paper describes how these problems can be overcome by using presentation packages for instruction.

  12. Hybrid soft computing approaches research and applications

    CERN Document Server

    Dutta, Paramartha; Chakraborty, Susanta

    2016-01-01

    The book provides a platform for dealing with the flaws and failings of the soft computing paradigm through different manifestations. The different chapters highlight the necessity of the hybrid soft computing methodology in general with emphasis on several application perspectives in particular. Typical examples include (a) Study of Economic Load Dispatch by Various Hybrid Optimization Techniques, (b) An Application of Color Magnetic Resonance Brain Image Segmentation by ParaOptiMUSIG activation Function, (c) Hybrid Rough-PSO Approach in Remote Sensing Imagery Analysis,  (d) A Study and Analysis of Hybrid Intelligent Techniques for Breast Cancer Detection using Breast Thermograms, and (e) Hybridization of 2D-3D Images for Human Face Recognition. The elaborate findings of the chapters enhance the exhibition of the hybrid soft computing paradigm in the field of intelligent computing.

  13. Research Applications of Proteolytic Enzymes in Molecular Biology

    Directory of Open Access Journals (Sweden)

    József Tőzsér

    2013-11-01

    Full Text Available Proteolytic enzymes (also termed peptidases, proteases and proteinases) are capable of hydrolyzing peptide bonds in proteins. They can be found in all living organisms, from viruses to animals and humans. Proteolytic enzymes have great medical and pharmaceutical importance due to their key role in biological processes and in the life-cycle of many pathogens. Proteases are extensively applied enzymes in several sectors of industry and biotechnology; furthermore, numerous research applications require their use, including production of Klenow fragments, peptide synthesis, digestion of unwanted proteins during nucleic acid purification, cell culturing and tissue dissociation, preparation of recombinant antibody fragments for research, diagnostics and therapy, exploration of structure-function relationships by structural studies, removal of affinity tags from fusion proteins in recombinant protein techniques, peptide sequencing, and proteolytic digestion of proteins in proteomics. The aim of this paper is to review the molecular biological aspects of proteolytic enzymes and summarize their applications in the life sciences.

  14. Eye-tracking research in computer-mediated language learning

    NARCIS (Netherlands)

    Michel, Marije; Smith, Bryan

    2017-01-01

    Though eye-tracking technology has been used in reading research for over 100 years, researchers have only recently begun to use it in studies of computer-assisted language learning (CALL). This chapter provides an overview of eye-tracking research to date, which is relevant to computer-mediated

  15. FOREWORD: Third Nordic Symposium on Computer Simulation in Physics, Chemistry, Biology and Mathematics

    Science.gov (United States)

    Kaski, K.; Salomaa, M.

    1990-01-01

    These are Proceedings of the Third Nordic Symposium on Computer Simulation in Physics, Chemistry, Biology, and Mathematics, held August 25-26, 1989, at Lahti (Finland). The Symposium belongs to an annual series of Meetings, the first one of which was arranged in 1987 at Lund (Sweden) and the second one in 1988 at Kolle-Kolle near Copenhagen (Denmark). Although these Symposia have thus far been essentially Nordic events, their international character has increased significantly; the trend is vividly reflected through contributions in the present Topical Issue. The interdisciplinary nature of Computational Science is central to the activity; this fundamental aspect is also responsible, in an essential way, for its rapidly increasing impact. Crucially important to a wide spectrum of superficially disparate fields is the common need for extensive - and often quite demanding - computational modelling. For such theoretical models, no closed-form (analytical) solutions are available or they would be extremely difficult to find; hence one must rather resort to the Art of performing computational investigations. Among the unifying features in the computational research are the methods of simulation employed; methods which frequently are quite closely related with each other even for faculties of science that are quite unrelated. Computer simulation in Natural Sciences is presently apprehended as a discipline on its own right, occupying a broad region somewhere between the experimental and theoretical methods, but also partially overlapping with and complementing them. - Whichever its proper definition may be, the computational approach serves as a novel and an extremely versatile tool with which one can equally well perform "pure" experimental modelling and conduct "computational theory". Computational studies that have earlier been made possible only through supercomputers have opened unexpected, as well as exciting, novel frontiers equally in mathematics (e.g., fractals

  16. The Systems Biology Research Tool: evolvable open-source software

    OpenAIRE

    Wright, J; Wagner, A

    2008-01-01

    Background: Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions on the system level. Results: We introduce a free, easy-to-use, open-source, integrated software platform calle...

  17. BRIC-60: Biological Research in Canisters (BRIC)-60

    Science.gov (United States)

    Richards, Stephanie E. (Compiler); Levine, Howard G.; Romero, Vergel

    2016-01-01

    The Biological Research in Canisters (BRIC) is an anodized-aluminum cylinder used to provide passive stowage for investigations evaluating the effects of space flight on small organisms. Specimens flown in the BRIC 60 mm petri dish (BRIC-60) hardware include Lycopersicon esculentum (tomato), Arabidopsis thaliana (thale cress), Glycine max (soybean) seedlings, Physarum polycephalum (slime mold) cells, Porthetria dispar (gypsy moth) eggs and Ceratodon purpureus (moss).

  18. Research Applications of Proteolytic Enzymes in Molecular Biology

    OpenAIRE

    Mótyán, János András; Tóth, Ferenc; Tőzsér, József

    2013-01-01

    Proteolytic enzymes (also termed peptidases, proteases and proteinases) are capable of hydrolyzing peptide bonds in proteins. They can be found in all living organisms, from viruses to animals and humans. Proteolytic enzymes have great medical and pharmaceutical importance due to their key role in biological processes and in the life-cycle of many pathogens. Proteases are extensively applied enzymes in several sectors of industry and biotechnology, furthermore, numerous research applications ...

  19. Current trends and new challenges of databases and web applications for systems driven biological research

    Directory of Open Access Journals (Sweden)

    Pradeep Kumar eSreenivasaiah

    2010-12-01

    Full Text Available The dynamic and rapidly evolving nature of systems-driven research imposes special requirements on the technology, approach, design and architecture of the computational infrastructure, including databases and web applications. Several solutions have been proposed to meet these expectations, and novel methods have been developed to address the persisting problems of data integration. It is important for researchers to understand the different technologies and approaches. Having become familiar with the pros and cons of the existing technologies, researchers can exploit their capabilities to the maximum potential for integrating data. In this review we discuss the architecture, design and key technologies underlying some of the prominent databases (DBs) and web applications. We mention their roles in the integration of biological data and investigate some of the emerging design concepts and computational technologies that are likely to play a key role in the future of systems-driven biomedical research.

  20. Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology (Final Report)

    Science.gov (United States)

    EPA announced the release of the final report, Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology. This report describes new approaches that are faster, less resource-intensive, and more robust, and that can help ...

  1. Where mathematics, computer science, linguistics and biology meet essays in honour of Gheorghe Păun

    CERN Document Server

    Mitrana, Victor

    2001-01-01

    In recent years, computer scientists have shown increasing interest in the structure of biological molecules and the ways they can be manipulated in vitro in order to define theoretical models of computation based on genetic engineering tools. Along the same lines, a parallel interest is growing in the process of evolution of living organisms. Much of the current genome data is expressed in the form of maps, which are now becoming available and permit the study of the evolution of organisms at the genome scale for the first time. On the other hand, there is an active trend nowadays throughout the field of computational biology toward abstracted, hierarchical views of biological sequences, very much in the spirit of computational linguistics. In recent decades, results and methods in the field of formal language theory that might be applied to the description of biological sequences have been pointed out.

  2. Recent progress in structural biology: lessons from our research history.

    Science.gov (United States)

    Nitta, Ryo; Imasaki, Tsuyoshi; Nitta, Eriko

    2018-05-16

    The recent 'resolution revolution' in structural analyses of cryo-electron microscopy (cryo-EM) has drastically changed the research strategy for structural biology. In addition to X-ray crystallography and nuclear magnetic resonance spectroscopy, cryo-EM has achieved the structural analysis of biological molecules at near-atomic resolution, resulting in the Nobel Prize in Chemistry 2017. The effect of this revolution has spread within the biology and medical science fields affecting everything from basic research to pharmaceutical development by visualizing atomic structure. As we have used cryo-EM as well as X-ray crystallography since 2000 to elucidate the molecular mechanisms of the fundamental phenomena in the cell, here we review our research history and summarize our findings. In the first half of the review, we describe the structural mechanisms of microtubule-based motility of molecular motor kinesin by using a joint cryo-EM and X-ray crystallography method. In the latter half, we summarize our structural studies on transcriptional regulation by X-ray crystallography of in vitro reconstitution of a multi-protein complex.

  3. Biological Visualization, Imaging and Simulation(Bio-VIS) at NASA Ames Research Center: Developing New Software and Technology for Astronaut Training and Biology Research in Space

    Science.gov (United States)

    Smith, Jeffrey

    2003-01-01

    The Bio-Visualization, Imaging and Simulation (BioVIS) Technology Center at NASA's Ames Research Center is dedicated to developing and applying advanced visualization, computation and simulation technologies to support NASA Space Life Sciences research and the objectives of the Fundamental Biology Program. Research ranges from high resolution 3D cell imaging and structure analysis, virtual environment simulation of fine sensory-motor tasks, computational neuroscience and biophysics to biomedical/clinical applications. Computer simulation research focuses on the development of advanced computational tools for astronaut training and education. Virtual Reality (VR) and Virtual Environment (VE) simulation systems have become important training tools in many fields from flight simulation to, more recently, surgical simulation. The type and quality of training provided by these computer-based tools ranges widely, but the value of real-time VE computer simulation as a method of preparing individuals for real-world tasks is well established. Astronauts routinely use VE systems for various training tasks, including Space Shuttle landings, robot arm manipulations and extravehicular activities (space walks). Currently, there are no VE systems to train astronauts for basic and applied research experiments which are an important part of many missions. The Virtual Glovebox (VGX) is a prototype VE system for real-time physically-based simulation of the Life Sciences Glovebox where astronauts will perform many complex tasks supporting research experiments aboard the International Space Station. The VGX consists of a physical display system utilizing dual LCD projectors and circular polarization to produce a desktop-sized 3D virtual workspace. Physically-based modeling tools (Arachi Inc.) provide real-time collision detection, rigid body dynamics, physical properties and force-based controls for objects. The human-computer interface consists of two magnetic tracking devices

  4. Biological and chemical technologies research. FY 1995 annual summary report

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-03-01

    The annual summary report presents the fiscal year (FY) 1995 research activities and accomplishments for the United States Department of Energy (DOE) Biological and Chemical Technologies Research (BCTR) Program. This BCTR program resides within the Office of Industrial Technologies (OIT) of the Office of Energy Efficiency and Renewable Energy (EE). The annual summary report for 1995 (ASR 95) contains the following: program description (including BCTR program mission statement, historical background, relevance, goals and objectives); program structure and organization, selected technical and programmatic highlights for 1995; detailed descriptions of individual projects; a listing of program output, including a bibliography of published work; patents; and awards arising from work supported by the BCTR.

  5. Division of Biological and Medical Research annual technical report, 1981

    International Nuclear Information System (INIS)

    Rosenthal, M.W.

    1982-06-01

    This report summarizes research during 1981 in the Division of Biological and Medical Research, Argonne National Laboratory. Studies in Low Level Radiation include comparison of lifetime effects in mice of low level neutron and gamma irradiation, delineation of the responses of dogs to continuous low level gamma irradiation, elucidation of mechanisms of radiation damage and repair in mammalian cells, and study of the genetic effects of high LET radiations. Carcinogenesis research addresses mechanisms of tumor initiation and promotion in rat liver, chemical carcinogenesis in cultured mammalian cells, and molecular and genetic mechanisms of chemical and ultraviolet mutagenesis in bacteria. Research in Toxicology uses a variety of cellular, whole animal, and chronobiological end points, chemical separations, and statistical models to evaluate the hazards and mechanisms of actions of metals, coal gasification by products, and other energy-related pollutants. Human Protein Index studies develop two-dimensional electrophoresis systems for diagnosis and detection of cancer and other disease. Biophysics research includes fundamental structural and biophysical investigations of immunoglobulins and key biological molecules using NMR, crystallographic, and x-ray and neutron small-angle scattering techniques. The final sections cover support facilities, educational activities, seminars, staff talks, staff, and funding agencies

  6. Division of Biological and Medical Research annual technical report, 1981

    Energy Technology Data Exchange (ETDEWEB)

    Rosenthal, M.W. (ed.)

    1982-06-01

    This report summarizes research during 1981 in the Division of Biological and Medical Research, Argonne National Laboratory. Studies in Low Level Radiation include comparison of lifetime effects in mice of low level neutron and gamma irradiation, delineation of the responses of dogs to continuous low level gamma irradiation, elucidation of mechanisms of radiation damage and repair in mammalian cells, and study of the genetic effects of high LET radiations. Carcinogenesis research addresses mechanisms of tumor initiation and promotion in rat liver, chemical carcinogenesis in cultured mammalian cells, and molecular and genetic mechanisms of chemical and ultraviolet mutagenesis in bacteria. Research in Toxicology uses a variety of cellular, whole animal, and chronobiological end points, chemical separations, and statistical models to evaluate the hazards and mechanisms of actions of metals, coal gasification by products, and other energy-related pollutants. Human Protein Index studies develop two-dimensional electrophoresis systems for diagnosis and detection of cancer and other disease. Biophysics research includes fundamental structural and biophysical investigations of immunoglobulins and key biological molecules using NMR, crystallographic, and x-ray and neutron small-angle scattering techniques. The final sections cover support facilities, educational activities, seminars, staff talks, staff, and funding agencies.

  7. Computer simulation of heating of biological tissue during laser radiation

    International Nuclear Information System (INIS)

    Bojanic, S.; Sreckovic, M.

    1995-01-01

    The computer model is based on an implicit finite difference scheme to solve the diffusion equation for light distribution and the bio-heat equation. A practical application of the model is to calculate the temperature distributions during thermal coagulation of the prostate by radiative heating. (author)
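
    As an illustration of the implicit finite-difference approach mentioned above, the sketch below advances a one-dimensional heat/diffusion equation with a backward-Euler step; the grid, tissue properties, and initial heated spot are illustrative assumptions, and the full bio-heat model also includes terms (e.g., perfusion and the optical source) that are omitted here.

```python
import numpy as np

# Illustrative parameters (not taken from the cited model).
nx, dx, dt = 51, 1e-3, 0.1          # grid points, spacing [m], time step [s]
alpha = 1.4e-7                       # thermal diffusivity of soft tissue [m^2/s]
steps = 100

# Initial temperature: 37 C everywhere, with a heated spot in the middle
# standing in for absorbed laser energy.
T = np.full(nx, 37.0)
T[nx // 2] = 60.0

# Backward-Euler system matrix (I - dt*alpha*D2), Dirichlet boundaries at 37 C.
r = alpha * dt / dx**2
A = np.eye(nx)
for i in range(1, nx - 1):
    A[i, i - 1] = -r
    A[i, i]     = 1 + 2 * r
    A[i, i + 1] = -r

for _ in range(steps):
    b = T.copy()
    b[0] = b[-1] = 37.0              # boundary temperatures held fixed
    T = np.linalg.solve(A, b)

print(f"peak temperature after {steps * dt:.0f} s: {T.max():.2f} C")
```

    A production solver would exploit the tridiagonal structure of the matrix (e.g., the Thomas algorithm) rather than a dense solve, but the implicit update itself is the same.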

  8. The Effects of 3D Computer Simulation on Biology Students' Achievement and Memory Retention

    Science.gov (United States)

    Elangovan, Tavasuria; Ismail, Zurida

    2014-01-01

    A quasi-experimental study was conducted for six weeks to determine the effectiveness of two different 3D computer-simulation-based teaching methods, namely realistic simulation and non-realistic simulation, on Form Four Biology students' achievement and memory retention in Perak, Malaysia. A sample of 136 Form Four Biology students in Perak,…

  9. Division of Biological and Medical Research research summary 1984-1985

    Energy Technology Data Exchange (ETDEWEB)

    Barr, S.H. (ed.)

    1985-08-01

    The Division of Biological and Medical Research at Argonne National Laboratory conducts multidisciplinary research aimed at defining the biological and medical hazards to man from energy technologies and new energy options. These technically oriented studies have a strong base in fundamental research in a variety of scientific disciplines, including molecular and cellular biology, biophysics, genetics, radiobiology, pharmacology, biochemistry, chemistry, environmental toxicology, and epidemiology. This research summary is organized into six parts. The first five parts reflect the Divisional structure and contain the scientific program chapters, which summarize the activities of the individual groups during the calendar year 1984 and the first half of 1985. To provide better continuity and perspective, previous work is sometimes briefly described. Although the summaries are short, efforts have been made to indicate the range of research activities for each group.

  10. Division of Biological and Medical Research research summary 1984-1985

    International Nuclear Information System (INIS)

    Barr, S.H.

    1985-08-01

    The Division of Biological and Medical Research at Argonne National Laboratory conducts multidisciplinary research aimed at defining the biological and medical hazards to man from energy technologies and new energy options. These technically oriented studies have a strong base in fundamental research in a variety of scientific disciplines, including molecular and cellular biology, biophysics, genetics, radiobiology, pharmacology, biochemistry, chemistry, environmental toxicology, and epidemiology. This research summary is organized into six parts. The first five parts reflect the Divisional structure and contain the scientific program chapters, which summarize the activities of the individual groups during the calendar year 1984 and the first half of 1985. To provide better continuity and perspective, previous work is sometimes briefly described. Although the summaries are short, efforts have been made to indicate the range of research activities for each group

  11. Social things : design research on social computing

    NARCIS (Netherlands)

    Hu, J.; Luen, P.; Rau, P.

    2016-01-01

    In the era of social networking and computing, things and people are more and more interconnected, giving rise to not only new opportunities but also new challenges in designing new products that are networked, and services that are adaptive to their human users and context aware in their physical

  12. Computers in Language Testing: Present Research and Some Future Directions.

    Science.gov (United States)

    Brown, James Dean

    1997-01-01

    Explores recent developments in the use of computers in language testing in four areas: (1) item banking; (2) computer-assisted language testing; (3) computerized-adaptive language testing; and (4) research on the effectiveness of computers in language testing. Examines educational measurement literature in an attempt to forecast the directions…

  13. Amorphous Computing: A Research Agenda for the Near Future

    Czech Academy of Sciences Publication Activity Database

    Wiedermann, Jiří

    2012-01-01

    Vol. 11, No. 1 (2012), pp. 59-63. ISSN 1567-7818. R&D Projects: GA ČR GAP202/10/1333. Institutional research plan: CEZ:AV0Z10300504. Keywords: amorphous computing * nano-machines * flying amorphous computer. Subject RIV: IN - Informatics, Computer Science. Impact factor: 0.683, year: 2012

  14. Inferring biological functions of guanylyl cyclases with computational methods

    KAUST Repository

    Alquraishi, May Majed; Meier, Stuart Kurt

    2013-01-01

    A number of studies have shown that functionally related genes are often co-expressed and that computation-based co-expression analysis can be used to accurately identify functional relationships between genes and, by inference, their encoded proteins. Here we describe how a computation-based co-expression analysis can be used to link the function of a specific gene of interest to a defined cellular response. Using a worked example, we demonstrate how this methodology is used to link the function of the Arabidopsis Wall-Associated Kinase-Like 10 gene, which encodes a functional guanylyl cyclase, to host responses to pathogens. © Springer Science+Business Media New York 2013.

  15. Inferring biological functions of guanylyl cyclases with computational methods

    KAUST Repository

    Alquraishi, May Majed

    2013-09-03

    A number of studies have shown that functionally related genes are often co-expressed and that computation-based co-expression analysis can be used to accurately identify functional relationships between genes and, by inference, their encoded proteins. Here we describe how a computation-based co-expression analysis can be used to link the function of a specific gene of interest to a defined cellular response. Using a worked example, we demonstrate how this methodology is used to link the function of the Arabidopsis Wall-Associated Kinase-Like 10 gene, which encodes a functional guanylyl cyclase, to host responses to pathogens. © Springer Science+Business Media New York 2013.
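
    A minimal sketch of the co-expression idea described above: correlate the expression profile of a gene of interest with every other gene across a set of samples, then rank the candidates. The expression matrix and gene names below are fabricated placeholders, not data from the cited work.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical expression matrix: rows = genes, columns = samples.
genes = ["WAKL10", "GeneA", "GeneB", "GeneC", "GeneD"]
expr = rng.normal(size=(len(genes), 12))
expr[1] = expr[0] * 0.9 + rng.normal(scale=0.2, size=12)  # make GeneA co-expressed

# Pearson correlation of every gene with the gene of interest (row 0).
target = expr[0]
corr = [np.corrcoef(target, profile)[0, 1] for profile in expr]

# Rank candidate genes by absolute correlation, excluding the target itself.
ranked = sorted(zip(genes[1:], corr[1:]), key=lambda g: -abs(g[1]))
for name, r in ranked:
    print(f"{name}: r = {r:.2f}")
```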

  16. Research in applied mathematics, numerical analysis, and computer science

    Science.gov (United States)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  17. Division of Biological and Medical Research annual report 1978

    Energy Technology Data Exchange (ETDEWEB)

    Rosenthal, M.W. (ed.)

    1978-01-01

    The research during 1978 in the Division of Biological and Medical Research, Argonne National Laboratory, is summarized. Studies related to nuclear energy include responses of beagles to continuous low-level /sup 60/Co gamma radiation, and development of leukemic indicators; comparison of lifetime effects in mice of low-level neutron and /sup 60/Co gamma radiation; genetic effects of high LET radiations; and metabolic and therapeutic studies of heavy metals. Studies of nonnuclear energy sources deal with characterization and toxicological evaluation of effluents of fluidized bed combustion and coal gasification; electrical storage systems; electric fields associated with energy transmission; and development of population projection models and assessment of human risk. Basic research studies include fundamental structural and biophysical investigations; circadian rhythms; mutagenesis in bacteria and mammalian cells; cell killing, damage, and repair in mammalian cells; carcinogenesis and cocarcinogenesis; the use of liposomes as biological carriers; and studies of environmental influences on life-span, physiological performance, and circadian cycles. In the area of medical development, proteins in urine and tissues of normal and diseased humans are analyzed, and advanced analytical procedures for use of stable isotopes in clinical research and diagnosis are developed and applied. The final sections of the report cover support facilities, educational activities, the seminar program, staff talks, and staff publications.

  18. Division of Biological and Medical Research annual technical report 1982

    International Nuclear Information System (INIS)

    Rosenthal, M.W.

    1983-05-01

    This report summarizes research during 1982 in the Division of Biological and Medical Research, Argonne National Laboratory. Studies in Carcinogenesis address mechanisms of chemical and radiation carcinogenesis including the processes of tumor initiation and promotion. The studies employ rat liver and mouse skin models as well as human rodent cell culture systems. The use of liposomes for metal mobilization is also explored. Low Level Radiation studies include delineation of the hematopoietic and other responses of dogs to continuous low level gamma irradiation, comparison of lifetime effects in mice of low level neutron and gamma irradiation, and study of the genetic effects of high LET radiation. Molecular Biology research develops two-dimensional electrophoresis systems for diagnosis and detection of cancer and other diseases. Fundamental structural and biophysical investigations of immunoglobulins and other key proteins are included, as are studies of cell growth, and of molecular and cellular effects of solar UV light. Research in Toxicology uses cellular, physiological, whole animal, and chronobiological end points and chemical separations to elucidate mechanisms and evaluate hazards of coal conversion by-products, actinides, and toxic metals. The final sections cover support facilities, educational activities, seminars, staff talks, staff, and funding agencies.

  19. Division of Biological and Medical Research annual technical report 1982

    Energy Technology Data Exchange (ETDEWEB)

    Rosenthal, M.W. (ed.)

    1983-05-01

    This report summarizes research during 1982 in the Division of Biological and Medical Research, Argonne National Laboratory. Studies in Carcinogenesis address mechanisms of chemical and radiation carcinogenesis including the processes of tumor initiation and promotion. The studies employ rat liver and mouse skin models as well as human rodent cell culture systems. The use of liposomes for metal mobilization is also explored. Low Level Radiation studies include delineation of the hematopoietic and other responses of dogs to continuous low level gamma irradiation, comparison of lifetime effects in mice of low level neutron and gamma irradiation, and study of the genetic effects of high LET radiation. Molecular Biology research develops two-dimensional electrophoresis systems for diagnosis and detection of cancer and other diseases. Fundamental structural and biophysical investigations of immunoglobulins and other key proteins are included, as are studies of cell growth, and of molecular and cellular effects of solar UV light. Research in Toxicology uses cellular, physiological, whole animal, and chronobiological end points and chemical separations to elucidate mechanisms and evaluate hazards of coal conversion by-products, actinides, and toxic metals. The final sections cover support facilities, educational activities, seminars, staff talks, staff, and funding agencies.

  20. Division of Biological and Medical Research annual report 1978

    International Nuclear Information System (INIS)

    Rosenthal, M.W.

    1978-01-01

    The research during 1978 in the Division of Biological and Medical Research, Argonne National Laboratory, is summarized. Studies related to nuclear energy include responses of beagles to continuous low-level ⁶⁰Co gamma radiation, and development of leukemic indicators; comparison of lifetime effects in mice of low-level neutron and ⁶⁰Co gamma radiation; genetic effects of high LET radiations; and metabolic and therapeutic studies of heavy metals. Studies of nonnuclear energy sources deal with characterization and toxicological evaluation of effluents of fluidized bed combustion and coal gasification; electrical storage systems; electric fields associated with energy transmission; and development of population projection models and assessment of human risk. Basic research studies include fundamental structural and biophysical investigations; circadian rhythms; mutagenesis in bacteria and mammalian cells; cell killing, damage, and repair in mammalian cells; carcinogenesis and cocarcinogenesis; the use of liposomes as biological carriers; and studies of environmental influences on life-span, physiological performance, and circadian cycles. In the area of medical development, proteins in urine and tissues of normal and diseased humans are analyzed, and advanced analytical procedures for use of stable isotopes in clinical research and diagnosis are developed and applied. The final sections of the report cover support facilities, educational activities, the seminar program, staff talks, and staff publications.

  1. Use of synchrotron radiation in radiation biology research

    International Nuclear Information System (INIS)

    Yamada, Takeshi

    1981-01-01

    Synchrotron radiation (SR) holds great promise as a new research tool in new areas of materials science, because it has a continuous spectral distribution from visible light to X-rays, and its intensity is 10² to 10³ times as strong as that of conventional radiation sources. In the National Laboratory for High Energy Physics, a synchrotron radiation experimental facility has been constructed, which will start operation in fiscal 1982. With this SR, photons having wavelengths in the undeveloped region from vacuum ultraviolet to soft X-ray are obtained as intense mono-wavelength light. The SR thus should contribute to the elucidation of the fundamentals of the biological action of radiation. The following matters are described: synchrotron radiation, experimental facility using SR, electron storage ring, features of SR, photon factory plan and synchrotron radiation experimental facility, utilization of SR in the radiation biology field. (J.P.N.)

  2. DOE research in utilization of high-performance computers

    International Nuclear Information System (INIS)

    Buzbee, B.L.; Worlton, W.J.; Michael, G.; Rodrigue, G.

    1980-12-01

    Department of Energy (DOE) and other Government research laboratories depend on high-performance computer systems to accomplish their programmatic goals. As the most powerful computer systems become available, they are acquired by these laboratories so that advances can be made in their disciplines. These advances are often the result of added sophistication to numerical models whose execution is made possible by high-performance computer systems. However, high-performance computer systems have become increasingly complex; consequently, it has become increasingly difficult to realize their potential performance. The result is a need for research on issues related to the utilization of these systems. This report gives a brief description of high-performance computers, and then addresses the use of and future needs for high-performance computers within DOE, the growing complexity of applications within DOE, and areas of high-performance computer systems warranting research. 1 figure

  3. Interim research assessment 2003-2005 - Computer Science

    NARCIS (Netherlands)

    Mouthaan, A.J.; Hartel, Pieter H.

    This report primarily serves as a source of information for the 2007 Interim Research Assessment Committee for Computer Science at the three technical universities in the Netherlands. The report also provides information for others interested in our research activities.

  4. Biologically Weighted Quantities in Radiotherapy: an EMRP Joint Research Project

    Directory of Open Access Journals (Sweden)

    Rabus Hans

    2014-01-01

    Full Text Available Funded within the European Metrology Research Programme (EMRP) [1], the joint research project “Biologically weighted quantities in radiotherapy” (BioQuaRT) [2] aims to develop measurement and simulation techniques for determining the physical properties of ionising particle tracks on different length scales (about 2 nm to 10 μm), and to investigate the correlation of these track structure characteristics with the biological effects of radiation at the cellular level. Work package 1 develops micro-calorimeter prototypes for the direct measurement of lineal energy and will characterise their response for different ion beams by experiment and modelling. Work package 2 develops techniques to measure particle track structure on different length scales in the nanometre range as well as a measurement device integrating a silicon microdosimeter and a nanodosimeter. Work package 3 investigates the indirect effects of radiation based on probes for quantifying particular radical and reactive oxygen species (ROS). Work package 4 focuses on the biological aspects of radiation damage and will produce data on initial DNA damage and late effects for radiotherapy beams of different qualities. Work package 5 provides evaluated data sets of DNA cross-sections and develops a multi-scale model to address microscopic and nanometric track structure properties. The project consortium includes three linked researchers holding so-called Researcher Excellence Grants, who carry out ancillary investigations such as developing and benchmarking a new biophysical model for induction of early radiation damage and developing methods for the translation of quantities derived from particle track structure to clinical applications in ion beam therapy.

  5. Scalable Computational Methods for the Analysis of High-Throughput Biological Data

    Energy Technology Data Exchange (ETDEWEB)

    Langston, Michael A. [Univ. of Tennessee, Knoxville, TN (United States)]

    2012-09-06

    The primary focus of this research project is elucidating genetic regulatory mechanisms that control an organism's responses to low-dose ionizing radiation. Although low doses (at most ten centigrays) are not lethal to humans, they elicit a highly complex physiological response, with the ultimate outcome in terms of risk to human health unknown. The tools of molecular biology and computational science will be harnessed to study coordinated changes in gene expression that orchestrate the mechanisms a cell uses to manage the radiation stimulus. High performance implementations of novel algorithms that exploit the principles of fixed-parameter tractability will be used to extract gene sets suggestive of co-regulation. Genomic mining will be performed to scrutinize, winnow and highlight the most promising gene sets for more detailed investigation. The overall goal is to increase our understanding of the health risks associated with exposures to low levels of radiation.
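
    The abstract above mentions fixed-parameter-tractable algorithms for extracting putatively co-regulated gene sets. One common formulation of that task, shown in the hedged Python sketch below (not necessarily this project's exact algorithm), thresholds a gene-gene correlation matrix into a graph and reports maximal cliques as candidate co-regulated sets; the threshold and minimum set size are illustrative parameters.

      # Hedged sketch of one common formulation (not necessarily the project's exact
      # algorithm): threshold a gene-gene correlation matrix into an unweighted graph
      # and report maximal cliques as candidate co-regulated gene sets.
      import itertools
      import numpy as np
      import networkx as nx

      def candidate_coregulated_sets(expr, gene_ids, threshold=0.9, min_size=3):
          corr = np.corrcoef(expr)              # pairwise Pearson correlations between genes
          g = nx.Graph()
          g.add_nodes_from(gene_ids)
          for i, j in itertools.combinations(range(len(gene_ids)), 2):
              if abs(corr[i, j]) >= threshold:  # keep only strongly co-expressed pairs
                  g.add_edge(gene_ids[i], gene_ids[j])
          # Maximal-clique enumeration is exponential in the worst case; the
          # fixed-parameter-tractable algorithms cited above are designed to make the
          # related clique/vertex-cover computations practical on real expression data.
          return [c for c in nx.find_cliques(g) if len(c) >= min_size]

    In practice the expression matrix would come from the low-dose radiation experiments described above; here any (n_genes, n_samples) array of floats can be passed to the function.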

  6. Digital computer control of a research nuclear reactor

    International Nuclear Information System (INIS)

    Crawford, Kevan

    1986-01-01

    Currently, the use of digital computers in energy producing systems has been limited to data acquisition functions. These computers have greatly reduced human involvement in the moment to moment decision process and the crisis decision process, thereby improving the safety of the dynamic energy producing systems. However, in addition to data acquisition, control of energy producing systems also includes data comparison, decision making, and control actions. The majority of the latter functions are accomplished through the use of analog computers in a distributed configuration. The lack of cooperation and, hence, inefficiency in distributed control, and the extent of human interaction in critical phases of control have provided the incentive to improve the latter three functions of energy systems control. Properly applied, centralized control by digital computers can increase efficiency by making the system react as a single unit and by implementing efficient power changes to match demand. Additionally, safety will be improved by further limiting human involvement to action only in the case of a failure of the centralized control system. This paper presents a hardware and software design for the centralized control of a research nuclear reactor by a digital computer. Current nuclear reactor control philosophies which include redundancy, inherent safety in failure, and conservative yet operational scram initiation were used as the bases of the design. The control philosophies were applied to the power monitoring system, the fuel temperature monitoring system, the area radiation monitoring system, and the overall system interaction. Unlike the single function analog computers that are currently used to control research and commercial reactors, this system will be driven by a multifunction digital computer. Specifically, the system will perform control rod movements to conform with operator requests, automatically log the required physical parameters during reactor

  7. Connecting biology and organic chemistry introductory laboratory courses through a collaborative research project.

    Science.gov (United States)

    Boltax, Ariana L; Armanious, Stephanie; Kosinski-Collins, Melissa S; Pontrello, Jason K

    2015-01-01

    Modern research often requires collaboration of experts in fields such as math, chemistry, biology, physics, and computer science to develop unique solutions to common problems. Traditional introductory undergraduate laboratory curricula in the sciences often do not emphasize connections possible between the various disciplines. We designed an interdisciplinary, medically relevant project intended to help students see connections between chemistry and biology. Second term organic chemistry laboratory students designed and synthesized potential polymer inhibitors or inducers of polyglutamine protein aggregation. The use of novel target compounds added the uncertainty of scientific research to the project. Biology laboratory students then tested the novel potential pharmaceuticals in Huntington's disease model assays, using in vitro polyglutamine peptide aggregation and in vivo lethality studies in Drosophila. Students read articles from the primary literature describing the system from both chemical and biological perspectives. Assessment revealed that students emerged from both courses with a deeper understanding of the interdisciplinary nature of biology and chemistry and a heightened interest in basic research. The design of this collaborative project for introductory biology and organic chemistry labs demonstrated how the local interests and expertise at a university can be drawn from to create an effective way to integrate these introductory courses. Rather than simply presenting a series of experiments to be replicated, we hope that our efforts will inspire other scientists to think about how some aspect of authentic work can be brought into their own courses, and we also welcome additional collaborations to extend the scope of the scientific exploration. © 2015 The International Union of Biochemistry and Molecular Biology.

  8. Proceedings of the meeting on large scale computer simulation research

    International Nuclear Information System (INIS)

    2004-04-01

    The meeting to summarize the collaboration activities for FY2003 on the Large Scale Computer Simulation Research was held January 15-16, 2004 at the Theory and Computer Simulation Research Center, National Institute for Fusion Science. Recent simulation results, methodologies and other related topics were presented. (author)

  9. Monitoring Biological Modes in a Bioreactor Process by Computer Simulation

    Directory of Open Access Journals (Sweden)

    Samia Semcheddine

    2015-12-01

    Full Text Available This paper deals with the general framework of fermentation system modeling and monitoring, focusing on the fermentation of Escherichia coli. Our main objective is to develop an algorithm for the online detection of acetate production during the culture of recombinant proteins. The analysis of the fermentation process shows that it behaves like a hybrid dynamic system with commutation (since it can be represented by 5 nonlinear models). We present a strategy of fault detection based on residual generation for detecting the different actual biological modes. The residual generation is based on nonlinear analytical redundancy relations. The simulation results show that the several modes that arise during the cultivation of the bacteria can be detected by residuals using a nonlinear dynamic model and a reduced instrumentation.
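
    To make the residual-generation idea concrete, the toy Python sketch below flags a mode switch when the discrepancy between measured and model-predicted values exceeds a threshold. This is a deliberate simplification of the paper's approach, which uses nonlinear analytical redundancy relations over a five-model hybrid system; the numbers and threshold are invented.

      # Minimal residual-detection sketch (illustrative only): flag samples where the
      # currently assumed biological mode no longer explains the measurements.
      import numpy as np

      def detect_mode_switch(measured, predicted, threshold):
          """Return indices where |measured - predicted| exceeds the threshold, i.e.
          where a switch (e.g. onset of acetate production) would be flagged."""
          residual = np.abs(np.asarray(measured) - np.asarray(predicted))
          return np.where(residual > threshold)[0]

      # Toy usage with made-up numbers: the model tracks the data until sample 3.
      measured  = [0.10, 0.12, 0.15, 0.40, 0.65]
      predicted = [0.10, 0.11, 0.14, 0.16, 0.18]
      print(detect_mode_switch(measured, predicted, threshold=0.1))  # -> [3 4]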

  10. Theoretical discussion for quantum computation in biological systems

    Science.gov (United States)

    Baer, Wolfgang

    2010-04-01

    Analysis of the brain as a physical system that has the capacity to generate a display of everyday observed experiences, and that contains some knowledge of the physical reality which stimulates those experiences, suggests that the brain executes a self-measurement process described by quantum theory. Assuming physical reality is a universe of interacting self-measurement loops, we present a model of space as a field of cells executing such self-measurement activities. Empty space is the observable associated with the measurement of this field when the mass and charge density defining the material aspect of the cells satisfy the least action principle. Content is the observable associated with the measurement of the quantum wave function ψ interpreted as mass-charge displacements. The illusion of space and its content incorporated into cognitive biological systems is evidence of self-measurement activity that can be associated with quantum operations.

  11. Research in thermal biology: Burning questions for coldwater stream fishes

    Science.gov (United States)

    McCullough, D.A.; Bartholow, J.M.; Jager, H.I.; Beschta, R.L.; Cheslak, E.F.; Deas, M.L.; Ebersole, J.L.; Foott, J.S.; Johnson, S.L.; Marine, K.R.; Mesa, M.G.; Petersen, J.H.; Souchon, Y.; Tiffan, K.F.; Wurtsbaugh, W.A.

    2009-01-01

    With the increasing appreciation of global warming impacts on ecological systems, in addition to the myriad of land management effects on water quality, the number of literature citations dealing with the effects of water temperature on freshwater fish has escalated in the past decade. Given the many biological scales at which water temperature effects have been studied, and the growing need to integrate knowledge from multiple disciplines of thermal biology to fully protect beneficial uses, we held that a survey of the most promising recent developments and an expression of some of the remaining unanswered questions with significant management implications would best be approached collectively by a diverse research community. We have identified five specific topic areas of renewed research where new techniques and critical thought could benefit coldwater stream fishes (particularly salmonids): molecular, organism, population/species, community and ecosystem, and policy issues in water quality. Our hope is that information gained through examination of recent research fronts linking knowledge at various scales will prove useful in managing water quality at a basin level to protect fish populations and whole ecosystems. Standards of the past were based largely on incipient lethal and optimum growth rate temperatures for fish species, while future standards should consider all integrated thermal impacts to the organism and ecosystem. © Taylor and Francis Group, LLC.

  12. Impact of Interdisciplinary Undergraduate Research in mathematics and biology on the development of a new course integrating five STEM disciplines.

    Science.gov (United States)

    Caudill, Lester; Hill, April; Hoke, Kathy; Lipan, Ovidiu

    2010-01-01

    Funded by innovative programs at the National Science Foundation and the Howard Hughes Medical Institute, University of Richmond faculty in biology, chemistry, mathematics, physics, and computer science teamed up to offer first- and second-year students the opportunity to contribute to vibrant, interdisciplinary research projects. The result was not only good science but also good science that motivated and informed course development. Here, we describe four recent undergraduate research projects involving students and faculty in biology, physics, mathematics, and computer science and how each contributed in significant ways to the conception and implementation of our new Integrated Quantitative Science course, a course for first-year students that integrates the material in the first course of the major in each of biology, chemistry, mathematics, computer science, and physics.

  13. Caenorhabditis elegans, a Biological Model for Research in Toxicology.

    Science.gov (United States)

    Tejeda-Benitez, Lesly; Olivero-Verbel, Jesus

    2016-01-01

    Caenorhabditis elegans is a nematode of microscopic size which, due to its biological characteristics, has been used since the 1970s as a model for research in molecular biology, medicine, pharmacology, and toxicology. It was the first animal whose genome was completely sequenced and has played a key role in the understanding of apoptosis and RNA interference. The transparency of its body, short lifespan, ability to self-fertilize and ease of culture are advantages that make it ideal as a model in toxicology. Due to the fact that some of its biochemical pathways are similar to those of humans, it has been employed in research in several fields. C. elegans' use as a biological model in environmental toxicological assessments allows the determination of multiple endpoints. Some of these utilize the effects on the biological functions of the nematode and others use molecular markers. Endpoints such as lethality, growth, reproduction, and locomotion are the most studied, and usually employ the wild type Bristol N2 strain. Other endpoints use reporter genes, such as green fluorescence protein, driven by regulatory sequences from other genes related to different mechanisms of toxicity, such as heat shock, oxidative stress, CYP system, and metallothioneins among others, allowing the study of gene expression in a manner both rapid and easy. These transgenic strains of C. elegans represent a powerful tool to assess toxicity pathways for mixtures and environmental samples, and their numbers are growing in diversity and selectivity. However, other molecular biology techniques, including DNA microarrays and MicroRNAs have been explored to assess the effects of different toxicants and samples. C. elegans has allowed the assessment of neurotoxic effects for heavy metals and pesticides, among those more frequently studied, as the nematode has a very well defined nervous system. More recently, nanoparticles are emergent pollutants whose toxicity can be explored using this nematode.
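
    Lethality endpoints of the kind mentioned above are commonly summarized by an LC50 estimated from a dose-response fit. The Python sketch below, with invented exposure data and a generic two-parameter log-logistic model (not taken from any specific C. elegans study), shows one way such an estimate can be obtained.

      # Hedged dose-response sketch: fit a generic log-logistic curve to invented
      # mortality data and report the estimated LC50.
      import numpy as np
      from scipy.optimize import curve_fit

      def log_logistic(conc, lc50, slope):
          """Fraction of dead nematodes as a function of toxicant concentration."""
          return 1.0 / (1.0 + (lc50 / conc) ** slope)

      # Invented data: concentrations (mg/L) and observed mortality fractions.
      conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
      mortality = np.array([0.02, 0.08, 0.25, 0.55, 0.85, 0.97])

      params, _ = curve_fit(log_logistic, conc, mortality, p0=[1.0, 1.0])
      print("estimated LC50 (mg/L):", params[0])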

  14. Advancing vector biology research: a community survey for future directions, research applications and infrastructure requirements

    Science.gov (United States)

    Kohl, Alain; Pondeville, Emilie; Schnettler, Esther; Crisanti, Andrea; Supparo, Clelia; Christophides, George K.; Kersey, Paul J.; Maslen, Gareth L.; Takken, Willem; Koenraadt, Constantianus J. M.; Oliva, Clelia F.; Busquets, Núria; Abad, F. Xavier; Failloux, Anna-Bella; Levashina, Elena A.; Wilson, Anthony J.; Veronesi, Eva; Pichard, Maëlle; Arnaud Marsh, Sarah; Simard, Frédéric; Vernick, Kenneth D.

    2016-01-01

    Vector-borne pathogens impact public health, animal production, and animal welfare. Research on arthropod vectors such as mosquitoes, ticks, sandflies, and midges which transmit pathogens to humans and economically important animals is crucial for development of new control measures that target transmission by the vector. While insecticides are an important part of this arsenal, appearance of resistance mechanisms is increasingly common. Novel tools for genetic manipulation of vectors, use of Wolbachia endosymbiotic bacteria, and other biological control mechanisms to prevent pathogen transmission have led to promising new intervention strategies, adding to strong interest in vector biology and genetics as well as vector–pathogen interactions. Vector research is therefore at a crucial juncture, and strategic decisions on future research directions and research infrastructure investment should be informed by the research community. A survey initiated by the European Horizon 2020 INFRAVEC-2 consortium set out to canvass priorities in the vector biology research community and to determine key activities that are needed for researchers to efficiently study vectors, vector-pathogen interactions, as well as access the structures and services that allow such activities to be carried out. We summarize the most important findings of the survey which in particular reflect the priorities of researchers in European countries, and which will be of use to stakeholders that include researchers, government, and research organizations. PMID:27677378

  15. Continuing training program in radiation protection in biological research centers

    International Nuclear Information System (INIS)

    Escudero, R.; Hidalgo, R.M.; Usera, F.; Macias, M.T.; Mirpuri, E.; Perez, J.; Sanchez, A.

    2008-01-01

    The use of ionizing radiation in biological research has many specific characteristics. A great variety of radioisotopic techniques involve unsealed radioactive sources, and their use not only carries a risk of irradiation, but also a significant risk of contamination. Moreover, a high proportion of researchers are in training and the labor mobility rate is therefore high. Furthermore, most newly incorporated personnel have little or no previous training in radiological protection, since most academic qualifications do not include training in this discipline. In a biological research center, in addition to personnel whose work is directly associated with the radioactive facility (scientific-technical personnel, operators, supervisors), there are also groups of support personnel (maintenance and instrumentation workers, cleaners, administrative personnel, etc.) who are associated with the radioactive facility indirectly. These workers are affected by the work in the radioactive facility to varying degrees, and they therefore also require information and training in radiological protection tailored to their level of interaction with the installation. The aim of this study was to design a

  16. "Biology Education"--An Emerging Interdisciplinary Area of Research

    Science.gov (United States)

    Rutledge, Michael

    2013-01-01

    The growing number of faculty positions in biology education, the formation of professional societies focused specifically on biology education, and the increasing number of publications in biology education over the past decade

  17. Chaste: an open source C++ library for computational physiology and biology.

    KAUST Repository

    Mirams, Gary R; Arthurs, Christopher J; Bernabeu, Miguel O; Bordas, Rafel; Cooper, Jonathan; Corrias, Alberto; Davit, Yohan; Dunn, Sara-Jane; Fletcher, Alexander G; Harvey, Daniel G; Marsh, Megan E; Osborne, James M; Pathmanathan, Pras; Pitt-Francis, Joe; Southern, James; Zemzemi, Nejib; Gavaghan, David J

    2013-01-01

    Chaste - Cancer, Heart And Soft Tissue Environment - is an open source C++ library for the computational simulation of mathematical models developed for physiology and biology. Code development has been driven by two initial applications: cardiac electrophysiology and cancer development. A large number of cardiac electrophysiology studies have been enabled and performed, including high-performance computational investigations of defibrillation on realistic human cardiac geometries. New models for the initiation and growth of tumours have been developed. In particular, cell-based simulations have provided novel insight into the role of stem cells in the colorectal crypt. Chaste is constantly evolving and is now being applied to a far wider range of problems. The code provides modules for handling common scientific computing components, such as meshes and solvers for ordinary and partial differential equations (ODEs/PDEs). Re-use of these components avoids the need for researchers to 're-invent the wheel' with each new project, accelerating the rate of progress in new applications. Chaste is developed using industrially-derived techniques, in particular test-driven development, to ensure code quality, re-use and reliability. In this article we provide examples that illustrate the types of problems Chaste can be used to solve, which can be run on a desktop computer. We highlight some scientific studies that have used or are using Chaste, and the insights they have provided. The source code, both for specific releases and the development version, is available to download under an open source Berkeley Software Distribution (BSD) licence at http://www.cs.ox.ac.uk/chaste, together with details of a mailing list and links to documentation and tutorials.
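
    Chaste itself is a C++ library with its own solver classes, so the code below is not Chaste code; it is a standalone Python sketch that only illustrates the kind of cell-level ODE system (here the generic FitzHugh-Nagumo excitable-cell model) that ODE solver modules of this sort are built to integrate.

      # Not Chaste code: a standalone illustration of integrating an excitable-cell
      # ODE model (FitzHugh-Nagumo) with a general-purpose solver.
      import numpy as np
      from scipy.integrate import solve_ivp

      def fitzhugh_nagumo(t, y, i_stim=0.5, a=0.7, b=0.8, tau=12.5):
          v, w = y                                 # membrane potential, recovery variable
          dv = v - v**3 / 3.0 - w + i_stim
          dw = (v + a - b * w) / tau
          return [dv, dw]

      sol = solve_ivp(fitzhugh_nagumo, (0.0, 100.0), [-1.0, 1.0],
                      t_eval=np.linspace(0.0, 100.0, 500))
      print(sol.y[0, -1])   # final membrane potential of the toy cell model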

  18. Chaste: an open source C++ library for computational physiology and biology.

    Directory of Open Access Journals (Sweden)

    Gary R Mirams

    Full Text Available Chaste - Cancer, Heart And Soft Tissue Environment - is an open source C++ library for the computational simulation of mathematical models developed for physiology and biology. Code development has been driven by two initial applications: cardiac electrophysiology and cancer development. A large number of cardiac electrophysiology studies have been enabled and performed, including high-performance computational investigations of defibrillation on realistic human cardiac geometries. New models for the initiation and growth of tumours have been developed. In particular, cell-based simulations have provided novel insight into the role of stem cells in the colorectal crypt. Chaste is constantly evolving and is now being applied to a far wider range of problems. The code provides modules for handling common scientific computing components, such as meshes and solvers for ordinary and partial differential equations (ODEs/PDEs). Re-use of these components avoids the need for researchers to 're-invent the wheel' with each new project, accelerating the rate of progress in new applications. Chaste is developed using industrially-derived techniques, in particular test-driven development, to ensure code quality, re-use and reliability. In this article we provide examples that illustrate the types of problems Chaste can be used to solve, which can be run on a desktop computer. We highlight some scientific studies that have used or are using Chaste, and the insights they have provided. The source code, both for specific releases and the development version, is available to download under an open source Berkeley Software Distribution (BSD) licence at http://www.cs.ox.ac.uk/chaste, together with details of a mailing list and links to documentation and tutorials.

  19. Chaste: an open source C++ library for computational physiology and biology.

    KAUST Repository

    Mirams, Gary R

    2013-03-14

    Chaste - Cancer, Heart And Soft Tissue Environment - is an open source C++ library for the computational simulation of mathematical models developed for physiology and biology. Code development has been driven by two initial applications: cardiac electrophysiology and cancer development. A large number of cardiac electrophysiology studies have been enabled and performed, including high-performance computational investigations of defibrillation on realistic human cardiac geometries. New models for the initiation and growth of tumours have been developed. In particular, cell-based simulations have provided novel insight into the role of stem cells in the colorectal crypt. Chaste is constantly evolving and is now being applied to a far wider range of problems. The code provides modules for handling common scientific computing components, such as meshes and solvers for ordinary and partial differential equations (ODEs/PDEs). Re-use of these components avoids the need for researchers to 're-invent the wheel' with each new project, accelerating the rate of progress in new applications. Chaste is developed using industrially-derived techniques, in particular test-driven development, to ensure code quality, re-use and reliability. In this article we provide examples that illustrate the types of problems Chaste can be used to solve, which can be run on a desktop computer. We highlight some scientific studies that have used or are using Chaste, and the insights they have provided. The source code, both for specific releases and the development version, is available to download under an open source Berkeley Software Distribution (BSD) licence at http://www.cs.ox.ac.uk/chaste, together with details of a mailing list and links to documentation and tutorials.

  20. 2010 Tetrapyrroles, Chemistry & Biology of Gordon Research Conference

    Energy Technology Data Exchange (ETDEWEB)

    Angela Wilks

    2010-07-30

    The objective of the Chemistry & Biology of Tetrapyrroles Gordon Conference is to bring together researchers from diverse disciplines who otherwise would not interact. By bringing together biologists, chemists, engineers and clinicians with a common interest in tetrapyrroles, the conference provides a forum for cross-disciplinary ideas and collaboration. The perspective provided by biologists, chemists, and clinicians working in fields such as newly discovered defects in human porphyrin metabolism, the myriad of strategies for light harvesting in photosynthetic organisms, novel tetrapyrroles that serve as auxiliary chromophores or enzyme cofactors, synthetic strategies in the design of novel tetrapyrrole scaffolds, and tetrapyrrole based cell signaling and regulatory systems, makes this conference unique in the field. Over the years the growing evidence for the role of tetrapyrroles and their reactive intermediates in cell signaling and regulation has been of increasing importance at this conference. The 2010 conference on Chemistry & Biology of Tetrapyrroles will focus on many of these new frontiers as outlined in the preliminary program listed. Speakers will emphasize unpublished results and new findings in the field. The oral sessions will be followed by the highly interactive afternoon poster sessions. The poster sessions provide all conferees with the opportunity to present their latest research and to exchange ideas in a more informal setting. As in the past, this opportunity will continue during the nightly social gathering that takes place in the poster hall following the evening lectures. All conferees are encouraged to submit and present posters. At the conference the best poster in the areas of biology, chemistry and medicine will be selected by a panel of previous conference chairs.

  1. New computing techniques in physics research

    International Nuclear Information System (INIS)

    Becks, Karl-Heinz; Perret-Gallix, Denis

    1994-01-01

    New techniques were highlighted by the "Third International Workshop on Software Engineering, Artificial Intelligence and Expert Systems for High Energy and Nuclear Physics" in Oberammergau, Bavaria, Germany, from October 4 to 8. It was the third workshop in the series; the first was held in Lyon in 1990 and the second at a France-Telecom site near La Londe les Maures in 1992. This series of workshops covers a broad spectrum of problems. New, highly sophisticated experiments demand new techniques in computing, in hardware as well as in software. Software Engineering Techniques could in principle satisfy the needs of forthcoming accelerator experiments. The growing complexity of detector systems demands new techniques in experimental error diagnosis and repair suggestions; Expert Systems seem to offer a way of assisting the experimental crew during data-taking.

  2. Research directions in computer engineering. Report of a workshop

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, H

    1982-09-01

    The results of a workshop held in November 1981 in Washington, DC, to outline research directions for computer engineering are reported upon. The purpose of the workshop was to provide guidance to government research funding agencies, as well as to universities and industry, as to the directions which computer engineering research should take for the next five to ten years. A select group of computer engineers was assembled, drawn from all over the United States and with expertise in virtually every aspect of today's computer technology. Industrial organisations and universities were represented in roughly equal numbers. The panel proceeded to provide a sharper definition of computer engineering than had been in popular use previously, to identify the social and national needs which provide the basis for encouraging research, to probe for obstacles to research and seek means of overcoming them and to delineate high-priority areas in which computer engineering research should be fostered. These included experimental software engineering, architectures in support of programming style, computer graphics, pattern recognition, VLSI design tools, machine intelligence, programmable automation, architectures for speech and signal processing, computer architecture and robotics. 13 references.

  3. Research program on the biological effects of oil pollution

    International Nuclear Information System (INIS)

    Barrett, R.T.

    1991-12-01

    A national research program on the biological effects of oil pollution (FOBO) was initiated by the Norwegian Ministry of Environment in October 1983 in the light of the increasing oil exploration and production activity in the North Sea and northern Norwegian waters. Ambitions were high and five main fields of research were suggested: Seabirds, fish (incl. salmon), marine mammals, the littoral zone and plankton. However, due to the lack of interest on the part of other potential financiers, e.g. the Ministry of Fisheries and the oil companies, to participate, the four-year programme had to be limited to the following three topics: Seabirds around breeding colonies and at sea; Higher plants along the shoreline; The littoral zone. The program ran from the autumn of 1985 to the end of 1989 and this report summarizes the main results and conclusions of each project. 95 refs., 52 figs., 9 tabs

  4. BCTR: Biological and Chemical Technologies Research 1994 annual summary report

    Energy Technology Data Exchange (ETDEWEB)

    Petersen, G.

    1995-02-01

    The annual summary report presents the fiscal year (FY) 1994 research activities and accomplishments for the United States Department of Energy (DOE) Biological and Chemical Technologies Research (BCTR) Program of the Advanced Industrial Concepts Division (AICD). This AICD program resides within the Office of Industrial Technologies (OIT) of the Office of Energy Efficiency and Renewable Energy (EE). Although the OIT was reorganized in 1991 and AICD no longer exists, this document reports on efforts conducted under the former structure. The annual summary report for 1994 (ASR 94) contains the following: program description (including BCTR program mission statement, historical background, relevance, goals and objectives); program structure and organization, selected technical and programmatic highlights for 1994; detailed descriptions of individual projects; a listing of program output, including a bibliography of published work; patents, and awards arising from work supported by BCTR.

  5. Computational research on lithium ion battery materials

    Science.gov (United States)

    Tang, Ping

    Crystals of LiFePO4 and related materials have recently received a lot of attention due to their very promising use as cathodes in rechargeable lithium ion batteries. This thesis studied the electronic structures of FePO4 and LiMPO4, where M=Mn, Fe, Co and Ni, within the framework of density-functional theory. The first study compared the electronic structures of the LiMPO4 and FePO4 materials in their electrochemically active olivine form, using the LAPW (linear augmented plane wave) method [1]. A comparison of results for various spin configurations suggested that the ferromagnetic configuration can serve as a useful approximation for studying general features of these systems. The partial densities of states for the LiMPO4 materials are remarkably similar to each other, showing the transition metal 3d states forming narrow bands above the O 2p band. By contrast, in the absence of Li, the majority spin transition metal 3d states are well-hybridized with the O 2p band in FePO4. The second study compared the electronic structures of FePO4 in several crystal structures including an olivine, monoclinic, quartz-like, and CrVO4-like form [2,3]. For this work, in addition to the LAPW method, PAW (Projector Augmented Wave) [4], and PWscf (plane-wave pseudopotential) [5] methods were used. By carefully adjusting the computational parameters, very similar results were achieved for the three independent computational methods. Results for the relative stability of the four crystal structures are reported. In addition, partial densities of state analyses show qualitative information about the crystal field splittings and bond hybridizations and help rationalize the understanding of the electrochemical and stability properties of these materials.

  6. Open-Source Software in Computational Research: A Case Study

    Directory of Open Access Journals (Sweden)

    Sreekanth Pannala

    2008-04-01

    Full Text Available A case study of open-source (OS) development of the computational research software MFIX, used for multiphase computational fluid dynamics simulations, is presented here. The verification and validation steps required for constructing modern computational software and the advantages of OS development in those steps are discussed. The infrastructure used for enabling the OS development of MFIX is described. The impact of OS development on computational research and education in gas-solids flow, as well as the dissemination of information to other areas such as geophysical and volcanology research, is demonstrated. This study shows that the advantages of OS development were realized in the case of MFIX: verification by many users, which enhances software quality; the use of software as a means for accumulating and exchanging information; the facilitation of peer review of the results of computational research.

  7. Application of the selected physical methods in biological research

    Directory of Open Access Journals (Sweden)

    Jaromír Tlačbaba

    2013-01-01

    Full Text Available This paper deals with the application of acoustic emission (AE), which is one of the non-destructive methods currently in extensive use. The method is used for measuring internal defects of materials. AE has a high potential for further research and development to extend its application into the field of process engineering. To that end, acoustic emission monitoring is most fully elaborated under laboratory conditions with regard to external stimuli. The aim of the project is to apply acoustic emission to recording the activity of bees in different seasons. The mission is to apply a new perspective on the behavior of colonies by means of acoustic emission, which captures sound propagation in the material. Vibration is an integral part of communication in the community. Sensing colonies with the support of this method is used for understanding the colonies' biological responses to stimuli, clutches, colony development, etc. Simulated conditions, supported by the acoustic emission monitoring system, illustrate colony activity. The collected information will be used to give a comprehensive view of the life cycle and behavior of honey bees (Apis mellifera). Use of information about the activities of bees gives a comprehensive perspective on the use of acoustic emission in the field of biological research.

  8. Methods of 15N tracer research in biological systems

    International Nuclear Information System (INIS)

    Hirschberg, K.; Faust, H.

    1985-01-01

    The application of the stable isotope ¹⁵N is of increasing importance in different scientific disciplines, especially in medicine, agriculture, and the biosciences. The close correlation between the growing interest and improvements of analytical procedures resulted in remarkable advances in the ¹⁵N tracer technique. On the basis of the latest results of ¹⁵N tracer research in the life sciences and agriculture, methods of ¹⁵N tracer research in biological systems are compiled. The ¹⁵N methodology is considered under three headings: Chemical analysis, with a description of methods of sample preparation (including different separation and isolation methods for N-containing substances of biological and agricultural origin) and special procedures converting ammonia to molecular nitrogen. Isotopic analysis, with a review of the most important methods of isotopic analysis of nitrogen: mass spectrometry (including the GC-MS technique), emission spectrometry, NMR spectroscopy, and other analytical procedures. ¹⁵N-tracer techniques, with a consideration of the role of isotope dilution analysis as well as different labelling techniques and the mathematical interpretation of tracer data (modelling, N turnover experiments). In these chapters, sources of errors in chemical and isotopic analysis, the accuracy of the different methods, and their importance for tracer experiments are also discussed. Procedures for micro-scale ¹⁵N analysis and aspects of ¹⁵N analysis at the level of natural abundance are considered. Furthermore, some remarks on isotope effects in ¹⁵N tracer experiments are made. (author)
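
    The isotope dilution analysis mentioned above rests on a simple ¹⁵N mass balance. The Python sketch below illustrates the classic atom%-excess form of that calculation with invented numbers; real protocols involve corrections (for example, for natural abundance and incomplete mixing) that are omitted here.

      # Hedged illustration of the classic isotope-dilution calculation: a known
      # tracer dose is mixed into an unknown nitrogen pool, and the pool size
      # follows from a 15N mass balance on atom% excess.
      def pool_size_by_isotope_dilution(tracer_amount, tracer_excess, mixture_excess):
          """tracer_amount: amount of N added as tracer (e.g. mg N);
          tracer_excess / mixture_excess: atom% 15N excess of the tracer and of the
          pool after mixing (the unlabelled pool is taken as 0% excess).
          Mass balance: tracer_amount * tracer_excess
                        = (tracer_amount + pool) * mixture_excess."""
          return tracer_amount * (tracer_excess - mixture_excess) / mixture_excess

      # Toy example: 10 mg N at 98 atom% excess diluted to 2 atom% excess
      # implies an unlabelled pool of 10 * (98 - 2) / 2 = 480 mg N.
      print(pool_size_by_isotope_dilution(10.0, 98.0, 2.0))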

  9. Computational protein design-the next generation tool to expand synthetic biology applications.

    Science.gov (United States)

    Gainza-Cirauqui, Pablo; Correia, Bruno Emanuel

    2018-05-02

    One powerful approach to engineer synthetic biology pathways is the assembly of proteins sourced from one or more natural organisms. However, synthetic pathways often require custom functions or biophysical properties not displayed by natural proteins, limitations that could be overcome through modern protein engineering techniques. Structure-based computational protein design is a powerful tool to engineer new functional capabilities in proteins, and it is beginning to have a profound impact in synthetic biology. Here, we review efforts to increase the capabilities of synthetic biology using computational protein design. We focus primarily on computationally designed proteins not only validated in vitro, but also shown to modulate different activities in living cells. Efforts made to validate computational designs in cells can illustrate both the challenges and opportunities in the intersection of protein design and synthetic biology. We also highlight protein design approaches, which although not validated as conveyors of new cellular function in situ, may have rapid and innovative applications in synthetic biology. We foresee that in the near-future, computational protein design will vastly expand the functional capabilities of synthetic cells. Copyright © 2018. Published by Elsevier Ltd.

  10. Research Activity in Computational Physics utilizing High Performance Computing: Co-authorship Network Analysis

    Science.gov (United States)

    Ahn, Sul-Ah; Jung, Youngim

    2016-10-01

    The research activities of computational physicists utilizing high-performance computing are analyzed by bibliometric approaches. This study aims at providing computational physicists utilizing high-performance computing, as well as policy planners, with useful bibliometric results for an assessment of research activities. In order to achieve this purpose, we carried out a co-authorship network analysis of journal articles to assess the research activities of researchers in high-performance computational physics as a case study. For this study, we used journal articles of the Scopus database from Elsevier covering the time period of 2004-2013. We extracted the author rank in the physics field utilizing high-performance computing by the number of papers published during the ten years from 2004. Finally, we drew the co-authorship network for the 45 top authors and their coauthors, and described some features of the co-authorship network in relation to the author rank. Suggestions for further studies are discussed.
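
    A minimal version of the co-authorship network construction described above can be sketched as follows in Python with networkx; the records and field names are invented stand-ins for parsed Scopus entries, and the actual study additionally ranks authors by publication counts over 2004-2013.

      # Toy co-authorship network: authors are nodes, and an edge links every pair
      # of co-authors, weighted by the number of joint papers.
      import itertools
      import networkx as nx

      papers = [                      # invented bibliographic records
          {"title": "Paper 1", "authors": ["Kim", "Lee", "Park"]},
          {"title": "Paper 2", "authors": ["Kim", "Lee"]},
          {"title": "Paper 3", "authors": ["Park", "Cho"]},
      ]

      g = nx.Graph()
      for paper in papers:
          for a, b in itertools.combinations(sorted(paper["authors"]), 2):
              w = g.get_edge_data(a, b, default={"weight": 0})["weight"]
              g.add_edge(a, b, weight=w + 1)   # accumulate joint-paper counts

      # Rank authors by how many distinct collaborators they have (degree).
      print(sorted(g.degree, key=lambda kv: kv[1], reverse=True))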

  11. A Computational Lens on Design Research

    Science.gov (United States)

    Hoyles, Celia; Noss, Richard

    2015-01-01

    In this commentary, we briefly review the collective effort of design researchers to weave theory with empirical results, in order to gain a better understanding of the processes of learning. We seek to respond to this challenging agenda by centring on the evolution of one sub-field: namely that which involves investigations within a…

  12. Computer Science Research Review 1974-75

    Science.gov (United States)

    1975-08-01

    Faculty and Visitors: Mario Barbacci, Research Associate. B.S., Universidad Nacional de Ingenieria, Lima, Peru (1966); Engineer, Universidad Nacional de Ingenieria, Lima, Peru (1968); Ph.D., Carnegie-Mellon University (1974). Carnegie, 1969: Design Automation

  13. Computer science security research and human subjects: emerging considerations for research ethics boards.

    Science.gov (United States)

    Buchanan, Elizabeth; Aycock, John; Dexter, Scott; Dittrich, David; Hvizdak, Erin

    2011-06-01

    This paper explores the growing concerns with computer science research, and in particular, computer security research and its relationship with the committees that review human subjects research. It offers cases that review boards are likely to confront, and provides a context for appropriate consideration of such research, as issues of bots, clouds, and worms enter the discourse of human subjects review.

  14. [Activities of Research Institute for Advanced Computer Science]

    Science.gov (United States)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2001-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center, Moffett Field, California. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1. Automated Reasoning for Autonomous Systems: Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. 2. Human-Centered Computing: Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. 3. High Performance Computing and Networking: Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to analysis of large scientific datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.

  15. National Energy Research Scientific Computing Center (NERSC): Advancing the frontiers of computational science and technology

    Energy Technology Data Exchange (ETDEWEB)

    Hules, J. [ed.]

    1996-11-01

    National Energy Research Scientific Computing Center (NERSC) provides researchers with high-performance computing tools to tackle science's biggest and most challenging problems. Founded in 1974 by DOE/ER, the Controlled Thermonuclear Research Computer Center was the first unclassified supercomputer center and was the model for those that followed. Over the years the center's name was changed to the National Magnetic Fusion Energy Computer Center and then to NERSC; it was relocated to LBNL. NERSC, one of the largest unclassified scientific computing resources in the world, is the principal provider of general-purpose computing services to DOE/ER programs: Magnetic Fusion Energy, High Energy and Nuclear Physics, Basic Energy Sciences, Health and Environmental Research, and the Office of Computational and Technology Research. NERSC users are a diverse community located throughout the US and in several foreign countries. This brochure describes: the NERSC advantage, its computational resources and services, future technologies, scientific resources, and computational science of scale (interdisciplinary research over a decade or longer; examples: combustion in engines, waste management chemistry, global climate change modeling).

  16. Systems Biology-Based Platforms to Accelerate Research of Emerging Infectious Diseases.

    Science.gov (United States)

    Oh, Soo Jin; Choi, Young Ki; Shin, Ok Sarah

    2018-03-01

    Emerging infectious diseases (EIDs) pose a major threat to public health and security. Given the dynamic nature and significant impact of EIDs, the most effective way to prevent and protect against them is to develop vaccines in advance. Systems biology approaches provide an integrative way to understand the complex immune response to pathogens. They can lead to a greater understanding of EID pathogenesis and facilitate the evaluation of newly developed vaccine-induced immunity in a timely manner. In recent years, advances in high throughput technologies have enabled researchers to successfully apply systems biology methods to analyze immune responses to a variety of pathogens and vaccines. Despite recent advances, computational and biological challenges impede wider application of systems biology approaches. This review highlights recent advances in the fields of systems immunology and vaccinology, and presents ways that systems biology-based platforms can be applied to accelerate a deeper understanding of the molecular mechanisms of immunity against EIDs. © Copyright: Yonsei University College of Medicine 2018.

  17. Division of Biological and Medical Research annual report, 1980

    International Nuclear Information System (INIS)

    Rosenthal, M.W.

    1981-08-01

    The research during 1980 in the Division of Biological and Medical Research, Argonne National Laboratory, is summarized. Research related to nuclear energy includes the delineation, in the beagle, of the responses to continuous low level ⁶⁰Co gamma radiation and the development of cellular indicators of preclinical phases of leukemia; comparison of lifetime effects in mice of low level neutron and ⁶⁰Co gamma radiation; studies of the genetic effects of high LET radiations; and studies of the gastrointestinal absorption of the actinide elements. Research related to nonnuclear energy sources deals with characterization and toxicological evaluation of process streams and effluents of coal gasification; with electrical storage systems; and electric fields associated with energy transmission. Proteins in human urine and selected tissues are examined by two-dimensional electrophoresis to detect disease and pollutant related changes. Assessment of human risk associated with nuclear waste disposal and the collective dose commitment will result in more attention being paid to potential releases of radionuclides at relatively short times after disposal.

  18. Fluid dynamics parallel computer development at NASA Langley Research Center

    Science.gov (United States)

    Townsend, James C.; Zang, Thomas A.; Dwoyer, Douglas L.

    1987-01-01

    To accomplish more detailed simulations of highly complex flows, such as the transition to turbulence, fluid dynamics research requires computers much more powerful than any available today. Only parallel processing on multiple-processor computers offers hope for achieving the required effective speeds. Looking ahead to the use of these machines, the fluid dynamicist faces three issues: algorithm development for near-term parallel computers, architecture development for future computer power increases, and assessment of possible advantages of special purpose designs. Two projects at NASA Langley address these issues. Software development and algorithm exploration is being done on the FLEX/32 Parallel Processing Research Computer. New architecture features are being explored in the special purpose hardware design of the Navier-Stokes Computer. These projects are complementary and are producing promising results.

  19. Impact of Interdisciplinary Undergraduate Research in Mathematics and Biology on the Development of a New Course Integrating Five STEM Disciplines

    Science.gov (United States)

    Caudill, Lester; Hill, April; Hoke, Kathy; Lipan, Ovidiu

    2010-01-01

    Funded by innovative programs at the National Science Foundation and the Howard Hughes Medical Institute, University of Richmond faculty in biology, chemistry, mathematics, physics, and computer science teamed up to offer first- and second-year students the opportunity to contribute to vibrant, interdisciplinary research projects. The result was…

  20. Recent advances, and unresolved issues, in the application of computational modelling to the prediction of the biological effects of nanomaterials

    International Nuclear Information System (INIS)

    Winkler, David A.

    2016-01-01

    Nanomaterials research is one of the fastest growing contemporary research areas. The unprecedented properties of these materials have meant that they are being incorporated into products very quickly. Regulatory agencies are concerned they cannot assess the potential hazards of these materials adequately, as data on the biological properties of nanomaterials are still relatively limited and expensive to acquire. Computational modelling methods have much to offer in helping understand the mechanisms by which toxicity may occur, and in predicting the likelihood of adverse biological impacts of materials not yet tested experimentally. This paper reviews the progress these methods, particularly those QSAR-based, have made in understanding and predicting potentially adverse biological effects of nanomaterials, and also the limitations and pitfalls of these methods. - Highlights: • Nanomaterials regulators need good information to make good decisions. • Nanomaterials and their interactions with biology are very complex. • Computational methods use existing data to predict properties of new nanomaterials. • Statistical, data driven modelling methods have been successfully applied to this task. • Much more must be learnt before robust toolkits will be widely usable by regulators.

  1. Converting differential-equation models of biological systems to membrane computing.

    Science.gov (United States)

    Muniyandi, Ravie Chandren; Zin, Abdullah Mohd; Sanders, J W

    2013-12-01

    This paper presents a method to convert the deterministic, continuous representation of a biological system by ordinary differential equations into a non-deterministic, discrete membrane computation. The dynamics of the membrane computation is governed by rewrite rules operating at certain rates. This has the advantage of applying accurately to small systems, and of expressing rates of change that are determined locally, by region, but not necessarily globally. Such spatial information augments the standard differential-equation approach to provide a more realistic model. A biological case study of the ligand-receptor network of protein TGF-β is used to validate the effectiveness of the conversion method. It demonstrates the sense in which the behaviours and properties of the system are better preserved in the membrane computing model, suggesting that the proposed conversion method may prove useful for biological systems in particular. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
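
    A minimal sketch (not the authors' conversion algorithm) of the general idea behind such a conversion, assuming mass-action kinetics: each ODE rate term becomes a discrete rewrite rule whose stochastic propensity is derived from the deterministic rate constant, and the rules are then executed by a Gillespie-type simulation within a single membrane region. The species names and rate constants below are illustrative only.

        import random

        # Illustrative rewrite rules for a ligand-receptor binding step:
        # L + R -> C (rate k_on), C -> L + R (rate k_off).
        # Deterministic rate constants are reinterpreted as stochastic propensities.
        rules = [
            {"reactants": {"L": 1, "R": 1}, "products": {"C": 1}, "rate": 0.01},        # binding
            {"reactants": {"C": 1}, "products": {"L": 1, "R": 1}, "rate": 0.1},         # unbinding
        ]

        def propensity(rule, state):
            # Mass-action propensity: rate constant times the number of reactant copies.
            a = rule["rate"]
            for species in rule["reactants"]:
                a *= state.get(species, 0)
            return a

        def gillespie(state, rules, t_end=10.0):
            t = 0.0
            while t < t_end:
                props = [propensity(r, state) for r in rules]
                total = sum(props)
                if total == 0:
                    break
                t += random.expovariate(total)                  # time to next rule firing
                rule = random.choices(rules, weights=props)[0]  # which rule fires
                for s, n in rule["reactants"].items():
                    state[s] -= n
                for s, n in rule["products"].items():
                    state[s] = state.get(s, 0) + n
            return state

        print(gillespie({"L": 100, "R": 50, "C": 0}, rules))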

  2. Modeling biological problems in computer science: a case study in genome assembly.

    Science.gov (United States)

    Medvedev, Paul

    2018-01-30

    As computer scientists working in bioinformatics/computational biology, we often face the challenge of coming up with an algorithm to answer a biological question. This occurs in many areas, such as variant calling, alignment and assembly. In this tutorial, we use the example of the genome assembly problem to demonstrate how to go from a question in the biological realm to a solution in the computer science realm. We show the modeling process step-by-step, including all the intermediate failed attempts. Please note this is not an introduction to how genome assembly algorithms work and, if treated as such, would be incomplete and unnecessarily long-winded. © The Author(s) 2018. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
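
    The tutorial's modeling steps are not reproduced here, but the following hedged sketch illustrates one common formalization that such a modeling process can arrive at: build a de Bruijn graph from the k-mers of the reads and read off unbranched paths as contigs. The reads and the value of k are invented for illustration.

        from collections import defaultdict

        def de_bruijn(reads, k):
            """Build a de Bruijn graph: nodes are (k-1)-mers, edges are k-mers."""
            graph = defaultdict(list)
            for read in reads:
                for i in range(len(read) - k + 1):
                    kmer = read[i:i + k]
                    graph[kmer[:-1]].append(kmer[1:])
            return graph

        def simple_contigs(graph):
            """Follow edges greedily while the path is unbranched (toy contig extraction)."""
            contigs = []
            for start, outs in graph.items():
                for nxt in outs:
                    contig = start + nxt[-1]
                    while len(graph.get(nxt, [])) == 1:
                        nxt = graph[nxt][0]
                        contig += nxt[-1]
                    contigs.append(contig)
            return contigs

        reads = ["ACGTACGA", "GTACGATT", "CGATTGCA"]   # invented toy reads
        print(simple_contigs(de_bruijn(reads, k=4)))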

  3. Persistence and availability of Web services in computational biology.

    Science.gov (United States)

    Schultheiss, Sebastian J; Münch, Marc-Christian; Andreeva, Gergana D; Rätsch, Gunnar

    2011-01-01

    We have conducted a study on the long-term availability of bioinformatics Web services: an observation of 927 Web services published in the annual Nucleic Acids Research Web Server Issues between 2003 and 2009. We found that 72% of Web sites are still available at the published addresses; only 9% of services are completely unavailable. Older addresses often redirect to new pages. We checked the functionality of all available services: for 33%, we could not test functionality because there was no example data or a related problem; 13% were truly no longer working as expected; we could positively confirm functionality only for 45% of all services. Additionally, we conducted a survey among 872 Web Server Issue corresponding authors; 274 replied. 78% of all respondents indicate their services have been developed solely by students and researchers without a permanent position. Consequently, these services are in danger of falling into disrepair after the original developers move to another institution, and indeed, for 24% of services, there is no plan for maintenance, according to the respondents. We introduce a Web service quality scoring system that correlates with the number of citations: services with a high score are cited 1.8 times more often than low-scoring services. We have identified key characteristics that are predictive of a service's survival, providing reviewers, editors, and Web service developers with the means to assess or improve Web services. A Web service conforming to these criteria receives more citations and provides more reliable service for its users. The most effective way of ensuring continued access to a service is a persistent Web address, offered either by the publishing journal, or created on the authors' own initiative, for example at http://bioweb.me. The community would benefit the most from a policy requiring any source code needed to reproduce results to be deposited in a public repository.
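
    A study like this rests on periodically probing published service URLs. The snippet below is a minimal, hypothetical availability checker (not the authors' survey code) using only the Python standard library; the second URL is a placeholder.

        import urllib.request
        import urllib.error

        def check_service(url, timeout=10):
            """Return (url, status) where status is an HTTP code, 'redirected', or an error string."""
            try:
                req = urllib.request.Request(url, method="HEAD")
                with urllib.request.urlopen(req, timeout=timeout) as resp:
                    status = "redirected" if resp.geturl() != url else resp.status
                    return url, status
            except urllib.error.HTTPError as e:
                return url, e.code
            except (urllib.error.URLError, OSError) as e:
                return url, f"unreachable ({e})"

        services = ["http://bioweb.me", "http://example.org/legacy-service"]  # second URL is a placeholder
        for url, status in map(check_service, services):
            print(url, "->", status)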

  4. Applied Computational Fluid Dynamics at NASA Ames Research Center

    Science.gov (United States)

    Holst, Terry L.; Kwak, Dochan (Technical Monitor)

    1994-01-01

    The field of Computational Fluid Dynamics (CFD) has advanced to the point where it can now be used for many applications in fluid mechanics research and aerospace vehicle design. A few applications being explored at NASA Ames Research Center will be presented and discussed. The examples presented will range in speed from hypersonic to low speed incompressible flow applications. Most of the results will be from numerical solutions of the Navier-Stokes or Euler equations in three space dimensions for general geometry applications. Computational results will be used to highlight the presentation as appropriate. Advances in computational facilities including those associated with NASA's CAS (Computational Aerosciences) Project of the Federal HPCC (High Performance Computing and Communications) Program will be discussed. Finally, opportunities for future research will be presented and discussed. All material will be taken from non-sensitive, previously-published and widely-disseminated work.

  5. Computer processing techniques in digital radiography research

    International Nuclear Information System (INIS)

    Pickens, D.R.; Kugel, J.A.; Waddill, W.B.; Smith, G.D.; Martin, V.N.; Price, R.R.; James, A.E. Jr.

    1985-01-01

    In the Department of Radiology and Radiological Sciences, Vanderbilt University Medical Center, and the Center for Medical Imaging Research, Nashville, TN, there are several activities which are designed to increase the information available from film-screen acquisition as well as from direct digital acquisition of radiographic information. Two of the projects involve altering the display of images after acquisition, either to remove artifacts present as a result of the acquisition process or to change the manner in which the image is displayed to improve the perception of details in the image. These two projects use methods which can be applied to any type of digital image, but are being implemented with images digitized from conventional x-ray film. One of these research endeavors involves mathematical alteration of the image to correct for motion artifacts or registration errors between images that will be subtracted. Another applies well-known image processing methods to digital radiographic images to improve the image contrast and enhance subtle details in the image. A third project involves the use of dual energy imaging with a digital radiography system to reconstruct images which demonstrate either soft tissue details or the osseous structures. These projects are discussed in greater detail in the following sections of this communication.

  6. Mathematical computer simulation of the process of ultrasound interaction with biological medium

    International Nuclear Information System (INIS)

    Yakovleva, T.; Nassiri, D.; Ciantar, D.

    1996-01-01

    The aim of the paper is to study theoretically the interaction of ultrasound irradiation with a biological medium and the peculiarities of ultrasound scattering by inhomogeneities of biological tissue, which can be represented by fractal structures. This investigation has been used for the construction of a computer model of a three-dimensional ultrasonic imaging system, which makes it possible to define more accurately the pathological changes in such a tissue by means of its image analysis. Poster 180. (author)

  7. Advances, gaps, and future prospects in biological soil crust research

    Science.gov (United States)

    Weber, Bettina; Büdel, Burkhard; Belnap, Jayne

    2017-04-01

    Research progress has led to the understanding that biological soil crusts (biocrusts) are often complete miniature ecosystems comprising a variety of photosynthesizers (cyanobacteria, algae, lichens, bryophytes), decomposers like bacteria, fungi, and archaea, and heterotrophic organisms, like protozoa, nematodes, and microarthropods feeding on them. Biocrusts are one of the oldest terrestrial ecosystems, playing central roles in the structure and functioning of dryland ecosystems and presumably also influencing global biogeochemical cycles. On the other hand, biocrusts have been shown to be highly sensitive to global change, being easily destroyed by mechanical disturbance and severely threatened by minor changes in climate patterns. Despite the large increase in biocrust research, we still see major knowledge gaps which need to be tackled. Considering biodiversity studies, there are major regions of potential biocrust occurrence where hardly any studies have been conducted. Molecular identification techniques are increasingly employed, but genetically characterized entities need to be linked with morphologically identified organisms to identify their ecological roles. Although there is a large body of research on the role of biocrusts in water and nutrient budgets, we are still far from closing the overall cycles. Results suggest that not all mechanisms have been identified yet, leading to sometimes contradictory results between different studies. Knowledge on how to minimize impact to biocrusts during surface-disturbing activities has hardly been gained, and despite research efforts, instructions on effective biocrust restoration are still exemplary. In order to fill these research gaps, novel scientific approaches are needed. We expect that global research networks could be extremely helpful to answer scientific questions by tackling them within different regions, utilizing the same methodological techniques. Global networks could also be used for long

  8. Fiction as an Introduction to Computer Science Research

    Science.gov (United States)

    Goldsmith, Judy; Mattei, Nicholas

    2014-01-01

    The undergraduate computer science curriculum is generally focused on skills and tools; most students are not exposed to much research in the field, and do not learn how to navigate the research literature. We describe how fiction reviews (and specifically science fiction) are used as a gateway to research reviews. Students learn a little about…

  9. Life lines: An art history of biological research around 1800.

    Science.gov (United States)

    Bruhn, Matthias

    2011-12-01

    Around 1800, the scientific "illustrator" emerged as a new artistic profession in Europe. Artists were increasingly sought after in order to picture anatomical dissections and microscopic observations and to translate drawings into artworks for books and journals. By training and technical expertise, they introduced a particular kind of knowledge into scientific perception that also shaped the common image of nature. Illustrations of scientific publications, often undervalued as a biased interpretation of facts and subordinate to logic and description, thus convey an 'art history' of science in its own right, relevant both for the understanding of biological thought around 1800 as well as for the development of the arts and their historiography. The article is based on an analysis of botanical treatises produced for the Göttingen Society of Sciences in 1803, during an early phase of microscopic cell research, in order to determine the constitutive role of artistic knowledge and the media employed for the visualization and conceptualization of biological issues. Copyright © 2011 Elsevier Ltd. All rights reserved.

  10. Growth Analysis of Cancer Biology Research, 2000-2011

    Directory of Open Access Journals (Sweden)

    Keshava,

    2015-09-01

    Full Text Available Methods and Material: The PubMed database was used for retrieving data on 'cancer biology.' Articles published from 2000 to 2011 were downloaded, classified chronologically, and transferred to a spreadsheet application for analysis of the data as per the objectives of the study. Statistical Method: The nature of the growth of articles was investigated via exponential, linear, and logistic tests. Result: The year-wise analysis of article output shows that, after the years 2000 to 2005, there is a sudden increase in output during the years 2006 to 2007 and 2008 to 2011. The high productivity of articles during these years may be due to their significance in the cancer biology literature, having received prominence in research. Conclusion: There is an obvious need for better compilations of statistics on the numbers of publications from 2000 to 2011 in various disciplines on a worldwide scale, for informed critical assessments of the amount of new knowledge contributed by these publications, and for enhancements and refinements of present scientometric techniques (citation and publication counts), so that valid measures of knowledge growth may be obtained. Only then will scientometrics be able to provide accurate, useful descriptions and predictions of knowledge growth.
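
    As a rough illustration of the statistical method described (fitting exponential, linear, and logistic growth curves to yearly publication counts), the hedged sketch below uses scipy's curve_fit on made-up yearly counts; it is not the authors' analysis and the numbers are invented.

        import numpy as np
        from scipy.optimize import curve_fit

        years = np.arange(2000, 2012)
        counts = np.array([210, 230, 260, 300, 330, 380, 470, 560, 640, 730, 820, 900])  # invented counts

        t = years - years[0]
        linear = lambda t, a, b: a * t + b
        exponential = lambda t, a, b: a * np.exp(b * t)
        logistic = lambda t, K, r, t0: K / (1.0 + np.exp(-r * (t - t0)))

        for name, model, p0 in [("linear", linear, (50, 200)),
                                ("exponential", exponential, (200, 0.1)),
                                ("logistic", logistic, (1000, 0.5, 6))]:
            params, _ = curve_fit(model, t, counts, p0=p0, maxfev=10000)
            residuals = counts - model(t, *params)
            print(f"{name}: params={np.round(params, 3)}, SSE={np.sum(residuals**2):.0f}")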

  11. Developing a Research Agenda for Ubiquitous Computing in Schools

    Science.gov (United States)

    Zucker, Andrew

    2004-01-01

    Increasing numbers of states, districts, and schools provide every student with a computing device; for example, the middle schools in Maine maintain wireless Internet access and the students receive laptops. Research can provide policymakers with better evidence of the benefits and costs of 1:1 computing and establish which factors make 1:1…

  12. Secure encapsulation and publication of biological services in the cloud computing environment.

    Science.gov (United States)

    Zhang, Weizhe; Wang, Xuehui; Lu, Bo; Kim, Tai-hoon

    2013-01-01

    Secure encapsulation and publication for bioinformatics software products based on web services are presented, and the basic functions of biological information processing are realized in the cloud computing environment. In the encapsulation phase, the workflow and functions of the bioinformatics software are analyzed, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. Functions such as remote user job submission and job status query are implemented by using the GRAM components, and the services of the bioinformatics software are published to remote users. Finally, a basic prototype system of the biological cloud is achieved.

  13. Model and Computing Experiment for Research and Aerosols Usage Management

    Directory of Open Access Journals (Sweden)

    Daler K. Sharipov

    2012-09-01

    Full Text Available The article deals with a mathematical model for the research and management of aerosols released into the atmosphere, as well as a numerical algorithm implemented as hardware and software systems for conducting computing experiments.

  14. Human-Computer Interaction and Information Management Research Needs

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — In a visionary future, Human-Computer Interaction (HCI) and Information Management (IM) have the potential to enable humans to better manage their lives through the use...

  15. Research on the Use of Computer-Assisted Instruction.

    Science.gov (United States)

    Craft, C. O.

    1982-01-01

    Reviews recent research studies related to computer assisted instruction (CAI). The studies concerned program effectiveness, teaching of psychomotor skills, tool availability, and factors affecting the adoption of CAI. (CT)

  16. How the confocal laser scanning microscope entered biological research.

    Science.gov (United States)

    Amos, W B; White, J G

    2003-09-01

    A history of the early development of the confocal laser scanning microscope in the MRC Laboratory of Molecular Biology in Cambridge is presented. The rapid uptake of this technology is explained by the wide use of fluorescence in the 1980s. The key innovations were the scanning of the light beam over the specimen rather than vice-versa and a high magnification at the level of the detector, allowing the use of a macroscopic iris. These were followed by an achromatic all-reflective relay system, a non-confocal transmission detector and novel software for control and basic image processing. This design was commercialized successfully and has been produced and developed over 17 years, surviving challenges from alternative technologies, including solid-state scanning systems. Lessons are pointed out from the unusual nature of the original funding and research environment. Attention is drawn to the slow adoption of the instrument in diagnostic medicine, despite promising applications.

  17. Birth/birth-death processes and their computable transition probabilities with biological applications.

    Science.gov (United States)

    Ho, Lam Si Tung; Xu, Jason; Crawford, Forrest W; Minin, Vladimir N; Suchard, Marc A

    2018-03-01

    Birth-death processes track the size of a univariate population, but many biological systems involve interaction between populations, necessitating models for two or more populations simultaneously. A lack of efficient methods for evaluating finite-time transition probabilities of bivariate processes, however, has restricted statistical inference in these models. Researchers rely on computationally expensive methods such as matrix exponentiation or Monte Carlo approximation, restricting likelihood-based inference to small systems, or indirect methods such as approximate Bayesian computation. In this paper, we introduce the birth/birth-death process, a tractable bivariate extension of the birth-death process, where rates are allowed to be nonlinear. We develop an efficient algorithm to calculate its transition probabilities using a continued fraction representation of their Laplace transforms. Next, we identify several exemplary models arising in molecular epidemiology, macro-parasite evolution, and infectious disease modeling that fall within this class, and demonstrate advantages of our proposed method over existing approaches to inference in these models. Notably, the ubiquitous stochastic susceptible-infectious-removed (SIR) model falls within this class, and we emphasize that computable transition probabilities newly enable direct inference of parameters in the SIR model. We also propose a very fast method for approximating the transition probabilities under the SIR model via a novel branching process simplification, and compare it to the continued fraction representation method with application to the 17th century plague in Eyam. Although the two methods produce similar maximum a posteriori estimates, the branching process approximation fails to capture the correlation structure in the joint posterior distribution.
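
    The paper's continued-fraction method for transition probabilities is not reproduced here; instead, the sketch below shows the Monte Carlo baseline it improves upon, estimating a finite-time transition probability of a stochastic SIR model by repeated Gillespie simulation. The rates, population sizes, and target state are invented for illustration.

        import random

        def simulate_sir(S, I, beta, gamma, t_end, N):
            """Gillespie simulation of the stochastic SIR model; returns (S, I) at time t_end."""
            t = 0.0
            while t < t_end and I > 0:
                infect = beta * S * I / N
                recover = gamma * I
                total = infect + recover
                t += random.expovariate(total)
                if t >= t_end:
                    break
                if random.random() < infect / total:
                    S, I = S - 1, I + 1
                else:
                    I -= 1
            return S, I

        def transition_probability(start, target, t_end, beta, gamma, N, reps=20000):
            """Monte Carlo estimate of P(state(t_end) = target | state(0) = start)."""
            hits = sum(simulate_sir(*start, beta, gamma, t_end, N) == target for _ in range(reps))
            return hits / reps

        # Invented example: 95 susceptible and 5 infectious individuals out of N = 100.
        print(transition_probability((95, 5), (90, 7), t_end=1.0, beta=2.0, gamma=1.0, N=100))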

  18. Evolving a lingua franca and associated software infrastructure for computational systems biology: the Systems Biology Markup Language (SBML) project.

    Science.gov (United States)

    Hucka, M; Finney, A; Bornstein, B J; Keating, S M; Shapiro, B E; Matthews, J; Kovitz, B L; Schilstra, M J; Funahashi, A; Doyle, J C; Kitano, H

    2004-06-01

    Biologists are increasingly recognising that computational modelling is crucial for making sense of the vast quantities of complex experimental data that are now being collected. The systems biology field needs agreed-upon information standards if models are to be shared, evaluated and developed cooperatively. Over the last four years, our team has been developing the Systems Biology Markup Language (SBML) in collaboration with an international community of modellers and software developers. SBML has become a de facto standard format for representing formal, quantitative and qualitative models at the level of biochemical reactions and regulatory networks. In this article, we summarise the current and upcoming versions of SBML and our efforts at developing software infrastructure for supporting and broadening its use. We also provide a brief overview of the many SBML-compatible software tools available today.
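
    As a hedged illustration of what an SBML-encoded reaction network looks like programmatically, the sketch below builds a one-reaction model with the python-libsbml bindings (assuming the python-libsbml package is installed; identifiers are invented and error checking is omitted).

        import libsbml

        # Create an SBML Level 3 Version 1 document with one compartment,
        # two species, and a single irreversible reaction A -> B.
        doc = libsbml.SBMLDocument(3, 1)
        model = doc.createModel()
        model.setId("toy_model")

        comp = model.createCompartment()
        comp.setId("cell")
        comp.setSize(1.0)
        comp.setConstant(True)

        for species_id, amount in [("A", 10.0), ("B", 0.0)]:
            sp = model.createSpecies()
            sp.setId(species_id)
            sp.setCompartment("cell")
            sp.setInitialAmount(amount)
            sp.setBoundaryCondition(False)
            sp.setHasOnlySubstanceUnits(False)
            sp.setConstant(False)

        rxn = model.createReaction()
        rxn.setId("conversion")
        rxn.setReversible(False)
        reactant = rxn.createReactant()
        reactant.setSpecies("A")
        reactant.setConstant(True)
        product = rxn.createProduct()
        product.setSpecies("B")
        product.setConstant(True)

        print(libsbml.writeSBMLToString(doc))  # serialized SBML, exchangeable between tools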

  19. Development of a computational system for management of risks in radiosterilization processes of biological tissues

    International Nuclear Information System (INIS)

    Montoya, Cynara Viterbo

    2009-01-01

    Risk management can be understood as a systematic management approach which aims to identify, record and control the risks of a process. Applying risk management becomes a complex activity due to the variety of professionals involved. In order to execute risk management, the following are requirements of paramount importance: the experience, discernment and judgment of a multidisciplinary team, guided by means of quality tools, so as to provide standardization in the process of investigating the causes and effects of risks and dynamism in obtaining the objective desired, i.e. the reduction and control of the risk. This work aims to develop a computational risk management system (software) which makes it feasible to diagnose the risks of the processes of radiosterilization of biological tissues. The methodology adopted was action-research, according to which the researcher performs an active role in the establishment of the problems found, in the follow-up and in the evaluation of the actions taken owing to the problems. The scenario of this action-research was the Laboratory of Biological Tissues (LTB) in the Radiation Technology Center IPEN/CNEN-SP - Sao Paulo/Brazil. The software was developed in PHP and Flash with a MySQL database and is hosted on a server, available on the Internet (www.vcrisk.com.br), which the user can access from anywhere by means of the login/access password previously sent by email to the team responsible for the tissue to be analyzed. The software presents friendly navigability whereby the user is directed step by step through the process of investigating the risk up to the means of reducing it. The software requires the user to comply with the established deadlines and to report the effectiveness of the actions taken to reduce the risk. Applying this system provided the organization (LTB/CTR/IPEN) with dynamic, effective communication between the members of the multidisciplinary team: a) in decision-making; b) in lessons learned; c) in knowing the new risk

  20. Documenting and predicting topic changes in Computers in Biology and Medicine: A bibliometric keyword analysis from 1990 to 2017

    Directory of Open Access Journals (Sweden)

    Oliver Faust

    Full Text Available The Computers in Biology and Medicine (CBM) journal promotes the use of computing machinery in the fields of bioscience and medicine. Since the first volume in 1970, the importance of computers in these fields has grown dramatically; this is evident in the diversification of topics and an increase in the publication rate. In this study, we quantify both the change and the diversification of the topics covered in CBM. This is done by analysing the author-supplied keywords, since they were electronically captured in 1990. The analysis starts by selecting 40 keywords, related to Medical (M) (7), Data (D) (10), Feature (F) (17) and Artificial Intelligence (AI) (6) methods. Automated keyword clustering shows the statistical connection between the selected keywords. We found that the three most popular topics in CBM are: Support Vector Machine (SVM), Electroencephalography (EEG) and IMAGE PROCESSING. In a separate analysis step, we bagged the selected keywords into sequential one-year time slices and calculated the normalized appearance. The results were visualised with graphs that indicate the CBM topic changes. These graphs show that there was a transition from the Artificial Neural Network (ANN) to the SVM; in 2006 the SVM replaced the ANN as the most important AI algorithm. Our investigation helps the editorial board to manage and embrace topic change. Furthermore, our analysis is interesting for the general reader, as the results can help them to adjust their research directions. Keywords: Research trends, Topic analysis, Topic detection and tracking, Text mining, Computers in biology and medicine
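
    The "normalized appearance" analysis described above amounts to bagging keywords into one-year slices and dividing each keyword's count by the slice total. A minimal, hypothetical sketch follows (invented records, not the authors' data or code).

        from collections import Counter, defaultdict

        # Invented (year, keywords) records standing in for the author-supplied keyword lists.
        records = [
            (2005, ["ANN", "EEG"]), (2005, ["ANN"]), (2006, ["SVM", "EEG"]),
            (2006, ["SVM"]), (2007, ["SVM", "IMAGE PROCESSING"]),
        ]
        selected = {"ANN", "SVM", "EEG", "IMAGE PROCESSING"}

        per_year = defaultdict(Counter)
        for year, keywords in records:
            per_year[year].update(k for k in keywords if k in selected)

        # Normalized appearance: keyword count divided by all selected-keyword mentions that year.
        for year in sorted(per_year):
            total = sum(per_year[year].values())
            shares = {k: round(c / total, 2) for k, c in per_year[year].items()}
            print(year, shares)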

  1. Activities of the Research Institute for Advanced Computer Science

    Science.gov (United States)

    Oliger, Joseph

    1994-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. Research at RIACS is currently being done in the following areas: (1) parallel computing; (2) advanced methods for scientific computing; (3) high performance networks; and (4) learning systems. RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period January 1, 1994 through December 31, 1994 is in the Reports and Abstracts section of this report.

  2. Microgravity research in plant biological systems: Realizing the potential of molecular biology

    Science.gov (United States)

    Lewis, Norman G.; Ryan, Clarence A.

    1993-01-01

    The sole all-pervasive feature of the environment that has helped shape, through evolution, all life on Earth is gravity. The near weightlessness of the Space Station Freedom space environment allows gravitational effects to be essentially uncoupled, thus providing an unprecedented opportunity to manipulate, systematically dissect, study, and exploit the role of gravity in the growth and development of all life forms. New and exciting opportunities are now available to utilize molecular biological and biochemical approaches to study the effects of microgravity on living organisms. By careful experimentation, we can determine how gravity perception occurs, how the resulting signals are produced and transduced, and how or if tissue-specific differences in gene expression occur. Microgravity research can provide unique new approaches to further our basic understanding of development and metabolic processes of cells and organisms, and to further the application of this new knowledge for the betterment of humankind.

  3. STRUCTURAL BIOLOGY AND MOLECULAR MEDICINE RESEARCH PROGRAM (LSBMM)

    International Nuclear Information System (INIS)

    Eisenberg, David S.

    2008-01-01

    The UCLA-DOE Institute of Genomics and Proteomics is an organized research unit of the University of California, sponsored by the Department of Energy through the mechanism of a Cooperative Agreement. Today the Institute consists of 10 Principal Investigators and 7 Associate Members, developing and applying technologies to promote the biological and environmental missions of the Department of Energy, and 5 Core Technology Centers to sustain this work. The focus is on understanding genomes, pathways and molecular machines in organisms of interest to DOE, with special emphasis on developing enabling technologies. Since it was founded in 1947, the UCLA-DOE Institute has adapted its mission to the research needs of DOE and its progenitor agencies as these research needs have changed. The Institute started as the AEC Laboratory of Nuclear Medicine, directed by Stafford Warren, who later became the founding Dean of the UCLA School of Medicine. In this sense, the entire UCLA medical center grew out of the precursor of our Institute. In 1963, the mission of the Institute was expanded into environmental studies by Director Ray Lunt. I became the third director in 1993, and in close consultation with David Galas and John Wooley of DOE, shifted the mission of the Institute towards genomics and proteomics. Since 1993, the Principal Investigators and Core Technology Centers are entirely new, and the Institute has separated from its former division concerned with PET imaging. The UCLA-DOE Institute shares the space of Boyer Hall with the Molecular Biology Institute, and assumes responsibility for the operation of the main core facilities. Fig. 1 gives the organizational chart of the Institute. Some of the benefits to the public of research carried out at the UCLA-DOE Institute include the following: The development of publicly accessible, web-based databases, including the Database of Protein Interactions, and the ProLinks database of genomically inferred protein function linkages

  4. The secondary metabolite bioinformatics portal: Computational tools to facilitate synthetic biology of secondary metabolite production

    Directory of Open Access Journals (Sweden)

    Tilmann Weber

    2016-06-01

    Full Text Available Natural products are among the most important sources of lead molecules for drug discovery. With the development of affordable whole-genome sequencing technologies and other ‘omics tools, the field of natural products research is currently undergoing a shift in paradigms. While, for decades, mainly analytical and chemical methods gave access to this group of compounds, nowadays genomics-based methods offer complementary approaches to find, identify and characterize such molecules. This paradigm shift also resulted in a high demand for computational tools to assist researchers in their daily work. In this context, this review gives a summary of tools and databases that currently are available to mine, identify and characterize natural product biosynthesis pathways and their producers based on ‘omics data. A web portal called Secondary Metabolite Bioinformatics Portal (SMBP at http://www.secondarymetabolites.org is introduced to provide a one-stop catalog and links to these bioinformatics resources. In addition, an outlook is presented how the existing tools and those to be developed will influence synthetic biology approaches in the natural products field.

  5. Reproducible computational biology experiments with SED-ML--the Simulation Experiment Description Markup Language.

    Science.gov (United States)

    Waltemath, Dagmar; Adams, Richard; Bergmann, Frank T; Hucka, Michael; Kolpakov, Fedor; Miller, Andrew K; Moraru, Ion I; Nickerson, David; Sahle, Sven; Snoep, Jacky L; Le Novère, Nicolas

    2011-12-15

    The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project and it is defined in a detailed technical specification and additionally provides an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply on the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. Because SED-ML is agnostic about exact modeling language(s) used, experiments covering models from different fields of research

  6. Reproducible computational biology experiments with SED-ML - The Simulation Experiment Description Markup Language

    Science.gov (United States)

    2011-01-01

    Background The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. Results In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project and it is defined in a detailed technical specification and additionally provides an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply on the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. Conclusions With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. Because SED-ML is agnostic about exact modeling language(s) used, experiments covering models from

  7. Grand Challenges for Biological and Environmental Research: A Long-Term Vision

    Energy Technology Data Exchange (ETDEWEB)

    Arkin, A.; Baliga, N.; Braam, J.; Church, G.; Collins, J; ; Cottingham, R.; Ecker, J.; Gerstein, M.; Gilna, P.; Greenberg, J.; Handelsman, J.; Hubbard, S.; Joachimiak, A.; Liao, J.; Looger, L.; Meyerowitz, E.; Mjolness, E.; Petsko, G.; Sayler, G.; Simpson, M.; Stacey, G.; Sussman, M.; Tiedje, J.; Bader, D.; Cessi, P.; Collins, W.; Denning, S.; Dickinson, R.; Easterling, D.; Edmonds, J.; Feddema, J.; Field, C.; Fridlind, A.; Fung, I.; Held, I.; Jackson, R.; Janetos, A.; Large, W.; Leinen, M.; Leung, R.; Long, S.; Mace, G.; Masiello, C.; Meehl, G.; Ort, D.; Otto-Bliesner, B.; Penner, J.; Prather, M.; Randall, D.; Rasch, P.; Schneider, E.; Shugart, H.; Thornton, P.; Washington, W.; Wildung, R.; Wiscombe, W.; Zak, D.; Zhang, M.; Bielicki, J.; Buford, M.; Cleland, E.; Dale, V.; Duke, C.; Ehleringer, J.; Hecht, A.; Kammen, D.; Marland, G.; Pataki, D.; Riley, M. Robertson, P.; Hubbard, S.

    2010-12-01

    The interactions and feedbacks among plants, animals, microbes, humans, and the environment ultimately form the world in which we live. This world is now facing challenges from a growing and increasingly affluent human population whose numbers and lifestyles are driving ever greater energy demand and impacting climate. These and other contributing factors will make energy and climate sustainability extremely difficult to achieve over the 20-year time horizon that is the focus of this report. Despite these severe challenges, there is optimism that deeper understanding of our environment will enable us to mitigate detrimental effects, while also harnessing biological and climate systems to ensure a sustainable energy future. This effort is advanced by scientific inquiries in the fields of atmospheric chemistry and physics, biology, ecology, and subsurface science - all made possible by computing. The Office of Biological and Environmental Research (BER) within the Department of Energy's (DOE) Office of Science has a long history of bringing together researchers from different disciplines to address critical national needs in determining the biological and environmental impacts of energy production and use, characterizing the interplay of climate and energy, and collaborating with other agencies and DOE programs to improve the world's most powerful climate models. BER science focuses on three distinct areas: (1) What are the roles of Earth system components (atmosphere, land, oceans, sea ice, and the biosphere) in determining climate? (2) How is the information stored in a genome translated into microbial, plant, and ecosystem processes that influence biofuel production, climate feedbacks, and the natural cycling of carbon? (3) What are the biological, geochemical, and physical forces that govern the behavior of Earth's subsurface environment? Ultimately, the goal of BER science is to support experimentation and modeling that can reliably predict the

  8. Automatic processing of radioimmunological research data on a computer

    International Nuclear Information System (INIS)

    Korolyuk, I.P.; Gorodenko, A.N.; Gorodenko, S.I.

    1979-01-01

    A program ''CRITEST'' in the language PL/1 for the EC computer, intended for automatic processing of the results of radioimmunological research, has been elaborated. The program runs under the operating system of the EC computer and occupies a 60 kb memory partition. When compiling the program, Aitken's modified algorithm was used. The program was clinically tested in determining a number of hormones: CTH, T4, T3 and TSH. The automatic processing of the radioimmunological research data on the computer makes it possible to simplify the labour-consuming analysis and to raise its accuracy.

  9. Cell Science and Cell Biology Research at MSFC: Summary

    Science.gov (United States)

    2003-01-01

    The common theme of these research programs is that they investigate regulation of gene expression in cells, and ultimately gene expression is controlled by the macromolecular interactions between regulatory proteins and DNA. The NASA Critical Path Roadmap identifies Muscle Alterations and Atrophy and Radiation Effects as Very Serious Risks and Severe Risks, respectively, in long term space flights. The specific problem addressed by Dr. Young's research ("Skeletal Muscle Atrophy and Muscle Cell Signaling") is that skeletal muscle loss in space cannot be prevented by vigorous exercise. Aerobic skeletal muscles (i.e., red muscles) undergo the most extensive atrophy during long-term space flight. Of the many different potential avenues for preventing muscle atrophy, Dr. Young has chosen to study the beta-adrenergic receptor (betaAR) pathway. The reason for this choice is that a family of compounds called betaAR agonists will preferentially cause an increase in muscle mass of aerobic muscles (i.e., red muscle) in animals, potentially providing a specific pharmacological solution to muscle loss in microgravity. In addition, muscle atrophy is a widespread medical problem in neuromuscular diseases, spinal cord injury, lack of exercise, aging, and any disease requiring prolonged bedridden status. Skeletal muscle cells in cell culture are utilized as a model system to study this problem. Dr. Richmond's research ("Radiation & Cancer Biology of Mammary Cells in Culture") is directed toward developing a laboratory model for use in risk assessment of cancer caused by space radiation. This research is unique because a human model will be developed utilizing human mammary cells that are highly susceptible to tumor development. This approach is preferential over using animal cells because of problems in comparing radiation-induced cancers between humans and animals.

  10. The Implementation of Research-based Learning on Biology Seminar Course in Biology Education Study Program of FKIP UMRAH

    Science.gov (United States)

    Amelia, T.

    2018-04-01

    Biology Seminar is a course in the Biology Education Study Program of the Faculty of Teacher Training and Education, University of Maritim Raja Ali Haji (FKIP UMRAH), that requires students to have the ability to apply scientific attitudes, perform scientific writing and undertake scientific publication on a small scale. One of the learning strategies that can drive the achievement of the learning outcomes in this course is Research-Based Learning. Research-Based Learning principles are considered to be in accordance with the learning outcomes of the Biology Seminar course and, more generally, with the purpose of higher education. On this basis, this article, which is derived from qualitative research, aims at describing Research-Based Learning in the Biology Seminar course. Based on a case study, it was found that Research-Based Learning in Biology Seminar courses is applied through: designing learning activities around contemporary research issues; teaching research methods, techniques and skills explicitly within the program; drawing on personal research in designing and teaching courses; building small-scale research activities into undergraduate assignments; and infusing teaching with the values of researchers.

  11. Soil protists: a fertile frontier in soil biology research.

    Science.gov (United States)

    Geisen, Stefan; Mitchell, Edward A D; Adl, Sina; Bonkowski, Michael; Dunthorn, Micah; Ekelund, Flemming; Fernández, Leonardo D; Jousset, Alexandre; Krashevska, Valentyna; Singer, David; Spiegel, Frederick W; Walochnik, Julia; Lara, Enrique

    2018-05-01

    Protists include all eukaryotes except plants, fungi and animals. They are an essential, yet often forgotten, component of the soil microbiome. Method developments have now furthered our understanding of the real taxonomic and functional diversity of soil protists. They occupy key roles in microbial foodwebs as consumers of bacteria, fungi and other small eukaryotes. As parasites of plants, animals and even of larger protists, they regulate populations and shape communities. Pathogenic forms play a major role in public health issues as human parasites, or act as agricultural pests. Predatory soil protists release nutrients enhancing plant growth. Soil protists are of key importance for our understanding of eukaryotic evolution and microbial biogeography. Soil protists are also useful in applied research as bioindicators of soil quality, as models in ecotoxicology and as potential biofertilizers and biocontrol agents. In this review, we provide an overview of the enormous morphological, taxonomical and functional diversity of soil protists, and discuss current challenges and opportunities in soil protistology. Research in soil biology would clearly benefit from incorporating more protistology alongside the study of bacteria, fungi and animals.

  12. PERMITTIVITY RESEARCH OF BIOLOGICAL SOLUTIONS IN GIGAHERTZ FREQUENCY RANGE

    Directory of Open Access Journals (Sweden)

    Anton S. Demin

    2017-07-01

    Full Text Available Subject of Research. We present the results of permittivity research in the gigahertz frequency range for saline and glucose solutions used in medical practice. The experimental results are substantiated theoretically on the basis of the Debye-Cole model. Method. The research was carried out on the blood plasma of a healthy donor, water, normal saline, and glucose solutions with concentrations from 3 to 12 mmol/l. Experiments were performed by an active near-field method based on measuring the impedance of a plane air-liquid boundary with the open end of a coaxial waveguide in the frequency range from 1 to 12 GHz. Measurement results were processed with the use of a vector analyzer computer system from Rohde & Schwarz. Transmittance spectra were determined by means of a TENZOR-Bruker IR spectrometer. Main Results. Simulation results have shown good agreement between the experimental results and the model, as well as the validity of the choice of the main parameters of the Debye-Cole model in the studied frequency range for all media. It has been shown that the range of 3-6 GHz can be considered as the main one in the development of diagnostic sensors for the non-invasive analysis of the glucose concentration in human blood. Practical Relevance. Electrodynamic models of test fluids replacing human blood make it possible to simulate the basic characteristics of sensors for qualitative and quantitative estimation of the glucose concentration in human blood and can be used to create an experimental sample of a non-invasive glucometer.
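
    For readers unfamiliar with the Debye-Cole (Cole-Cole) description of complex permittivity mentioned above, a minimal numerical sketch follows; the relaxation parameters are textbook-style placeholders for water-like media, not the values fitted in the study.

        import numpy as np

        def cole_cole(freq_hz, eps_inf, eps_s, tau, alpha):
            """Complex permittivity eps*(w) = eps_inf + (eps_s - eps_inf) / (1 + (j*w*tau)**(1-alpha))."""
            omega = 2 * np.pi * freq_hz
            return eps_inf + (eps_s - eps_inf) / (1.0 + (1j * omega * tau) ** (1.0 - alpha))

        # Placeholder water-like parameters (room temperature, order-of-magnitude only).
        freqs = np.linspace(1e9, 12e9, 5)               # 1-12 GHz, matching the measurement band
        eps = cole_cole(freqs, eps_inf=5.0, eps_s=78.0, tau=8.3e-12, alpha=0.02)
        for f, e in zip(freqs, eps):
            print(f"{f/1e9:5.2f} GHz  eps' = {e.real:6.2f}  eps'' = {-e.imag:6.2f}")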

  13. Research on OpenStack of open source cloud computing in colleges and universities’ computer room

    Science.gov (United States)

    Wang, Lei; Zhang, Dandan

    2017-06-01

    In recent years, cloud computing technology has developed rapidly, especially open source cloud computing. Open source cloud computing has attracted a large number of user groups thanks to the advantages of open source software and low cost, and has now achieved large-scale promotion and application. In this paper, we first briefly introduce the main functions and architecture of the open source cloud computing tool OpenStack, and then discuss in depth the core problems of computer labs in colleges and universities. Combining this with our research, the paper presents the specific application and deployment of a university computer room with the OpenStack tool. The experimental results show that the OpenStack tool can efficiently and conveniently deploy a cloud for the university computer room, that its performance is stable, and that its functional value is good.

  14. Roles of radiation chemistry in development and research of radiation biology

    International Nuclear Information System (INIS)

    Min Rui

    2009-01-01

    Radiation chemistry acts as a bridge connecting radiation physics with radiation biology in both spatial and temporal terms. The theories, models, and methodologies originating in radiation chemistry play an important role in the research and development of radiation biology. The chemical changes induced by ionizing radiation are involved not only in the early events of the biological effects caused by ionizing radiation but also in functional radiation biology, such as DNA damage and repair, sensitivity modification, and the metabolism and function of active oxygen. As research in radiation biology develops further, systems radiation biology and the accurate qualitative and quantitative description of radiation biology effects will require more methods and better tools from radiation chemistry. (authors)

  15. Computational intelligence in multi-feature visual pattern recognition hand posture and face recognition using biologically inspired approaches

    CERN Document Server

    Pisharady, Pramod Kumar; Poh, Loh Ai

    2014-01-01

    This book presents a collection of computational intelligence algorithms that addresses issues in visual pattern recognition such as high computational complexity, abundance of pattern features, sensitivity to size and shape variations and poor performance against complex backgrounds. The book has 3 parts. Part 1 describes various research issues in the field with a survey of the related literature. Part 2 presents computational intelligence based algorithms for feature selection and classification. The algorithms are discriminative and fast. The main application area considered is hand posture recognition. The book also discusses utility of these algorithms in other visual as well as non-visual pattern recognition tasks including face recognition, general object recognition and cancer / tumor classification. Part 3 presents biologically inspired algorithms for feature extraction. The visual cortex model based features discussed have invariance with respect to appearance and size of the hand, and provide good...

  16. Derivation and computation of discrete-delay and continuous-delay SDEs in mathematical biology.

    Science.gov (United States)

    Allen, Edward J

    2014-06-01

    Stochastic versions of several discrete-delay and continuous-delay differential equations, useful in mathematical biology, are derived from basic principles carefully taking into account the demographic, environmental, or physiological randomness in the dynamic processes. In particular, stochastic delay differential equation (SDDE) models are derived and studied for Nicholson's blowflies equation, Hutchinson's equation, an SIS epidemic model with delay, bacteria/phage dynamics, and glucose/insulin levels. Computational methods for approximating the SDDE models are described. Comparisons between computational solutions of the SDDEs and independently formulated Monte Carlo calculations support the accuracy of the derivations and of the computational methods.
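
    As an illustration of how such SDDE models can be approximated numerically (the paper's own schemes are not reproduced here), below is a hedged Euler-Maruyama sketch for a stochastic Hutchinson equation dN = r N(t) (1 - N(t - tau)/K) dt + sigma N(t) dW, with invented parameter values.

        import math
        import random

        def hutchinson_sdde(r=0.8, K=100.0, tau=1.0, sigma=0.05, n0=10.0, t_end=30.0, dt=0.01):
            """Euler-Maruyama for dN = r*N(t)*(1 - N(t-tau)/K) dt + sigma*N(t) dW."""
            lag = int(round(tau / dt))                 # number of steps spanning the delay
            history = [n0] * (lag + 1)                 # constant history on [-tau, 0]
            path = [n0]
            for _ in range(int(round(t_end / dt))):
                n_now = path[-1]
                n_delayed = history[0]                 # value approximately tau time units ago
                drift = r * n_now * (1.0 - n_delayed / K)
                diffusion = sigma * n_now
                dW = random.gauss(0.0, math.sqrt(dt))
                n_next = max(n_now + drift * dt + diffusion * dW, 0.0)
                path.append(n_next)
                history.pop(0)
                history.append(n_next)
            return path

        path = hutchinson_sdde()
        print(f"final population ~ {path[-1]:.1f} (carrying capacity K = 100)")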

  17. A direct method for computing extreme value (Gumbel) parameters for gapped biological sequence alignments.

    Science.gov (United States)

    Quinn, Terrance; Sinkala, Zachariah

    2014-01-01

    We develop a general method for computing extreme value distribution (Gumbel, 1958) parameters for gapped alignments. Our approach uses mixture distribution theory to obtain associated BLOSUM matrices for gapped alignments, which in turn are used for determining significance of gapped alignment scores for pairs of biological sequences. We compare our results with parameters already obtained in the literature.
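
    The details of the mixture-distribution derivation are in the paper; as a simple hedged illustration of the end goal, the sketch below fits Gumbel (extreme value) parameters to a set of alignment scores and converts a score into a P-value. The scores are randomly generated stand-ins, and scipy's generic maximum-likelihood fit is used rather than the authors' direct method.

        import numpy as np
        from scipy.stats import gumbel_r

        rng = np.random.default_rng(0)
        # Stand-in "optimal local alignment scores" for shuffled sequence pairs.
        scores = gumbel_r.rvs(loc=25.0, scale=6.0, size=5000, random_state=rng)

        loc, scale = gumbel_r.fit(scores)            # maximum-likelihood Gumbel parameters
        observed = 60.0                              # hypothetical alignment score to assess
        p_value = gumbel_r.sf(observed, loc=loc, scale=scale)
        print(f"mu = {loc:.2f}, beta = {scale:.2f}, P(score >= {observed}) = {p_value:.2e}")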

  18. G-LoSA: An efficient computational tool for local structure-centric biological studies and drug design.

    Science.gov (United States)

    Lee, Hui Sun; Im, Wonpil

    2016-04-01

    Molecular recognition by protein mostly occurs in a local region on the protein surface. Thus, an efficient computational method for accurate characterization of protein local structural conservation is necessary to better understand biology and drug design. We present a novel local structure alignment tool, G-LoSA. G-LoSA aligns protein local structures in a sequence order independent way and provides a GA-score, a chemical feature-based and size-independent structure similarity score. Our benchmark validation shows the robust performance of G-LoSA to the local structures of diverse sizes and characteristics, demonstrating its universal applicability to local structure-centric comparative biology studies. In particular, G-LoSA is highly effective in detecting conserved local regions on the entire surface of a given protein. In addition, the applications of G-LoSA to identifying template ligands and predicting ligand and protein binding sites illustrate its strong potential for computer-aided drug design. We hope that G-LoSA can be a useful computational method for exploring interesting biological problems through large-scale comparison of protein local structures and facilitating drug discovery research and development. G-LoSA is freely available to academic users at http://im.compbio.ku.edu/GLoSA/. © 2016 The Protein Society.

  19. Towards a Population Dynamics Theory for Evolutionary Computing: Learning from Biological Population Dynamics in Nature

    Science.gov (United States)

    Ma, Zhanshan (Sam)

    In evolutionary computing (EC), population size is one of the critical parameters that a researcher has to deal with. Hence, it was no surprise that the pioneers of EC, such as De Jong (1975) and Holland (1975), had already studied the population sizing from the very beginning of EC. What is perhaps surprising is that more than three decades later, we still largely depend on the experience or ad-hoc trial-and-error approach to set the population size. For example, in a recent monograph, Eiben and Smith (2003) indicated: "In almost all EC applications, the population size is constant and does not change during the evolutionary search." Despite enormous research on this issue in recent years, we still lack a well accepted theory for population sizing. In this paper, I propose to develop a population dynamics theory for EC with the inspiration from the population dynamics theory of biological populations in nature. Essentially, the EC population is considered as a dynamic system over time (generations) and space (search space or fitness landscape), similar to the spatial and temporal dynamics of biological populations in nature. With this conceptual mapping, I propose to 'transplant' the biological population dynamics theory to EC via three steps: (i) experimentally test the feasibility—whether or not emulating natural population dynamics improves the EC performance; (ii) comparatively study the underlying mechanisms—why there are improvements, primarily via statistical modeling analysis; (iii) conduct theoretical analysis with theoretical models such as percolation theory and extended evolutionary game theory that are generally applicable to both EC and natural populations. This article is a summary of a series of studies we have performed to achieve the general goal [27][30]-[32]. In the following, I start with an extremely brief introduction on the theory and models of natural population dynamics (Sections 1 & 2). In Sections 4 to 6, I briefly discuss three

  20. Computer-aided design of biological circuits using TinkerCell.

    Science.gov (United States)

    Chandran, Deepak; Bergmann, Frank T; Sauro, Herbert M

    2010-01-01

    Synthetic biology is an engineering discipline that builds on modeling practices from systems biology and wet-lab techniques from genetic engineering. As synthetic biology advances, efficient procedures will be developed that will allow a synthetic biologist to design, analyze, and build biological networks. In this idealized pipeline, computer-aided design (CAD) is a necessary component. The role of a CAD application would be to allow efficient transition from a general design to a final product. TinkerCell is a design tool for serving this purpose in synthetic biology. In TinkerCell, users build biological networks using biological parts and modules. The network can be analyzed using one of several functions provided by TinkerCell or custom programs from third-party sources. Since best practices for modeling and constructing synthetic biology networks have not yet been established, TinkerCell is designed as a flexible and extensible application that can adjust itself to changes in the field. © 2010 Landes Bioscience
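
    As an illustrative sketch only (this is not TinkerCell's actual API), the following Python code shows the kind of parts-and-modules representation such a CAD tool works with: two hypothetical parts, a repressor R driven by a constitutive promoter and a reporter G repressed by R, are assembled into a small circuit and simulated by forward-Euler integration. All names, rate constants, and the Hill-repression form are assumptions made for illustration.

    # Hypothetical, minimal parts-and-modules representation of a two-gene circuit
    # (not TinkerCell's API): a constitutive repressor R represses a reporter G.
    parts = {
        "R": {"synthesis": 5.0, "degradation": 0.5, "repressed_by": None},
        "G": {"synthesis": 8.0, "degradation": 0.5, "repressed_by": "R", "K": 1.0, "n": 2},
    }

    def rate(name, state):
        """Production minus degradation for one species, with optional Hill repression."""
        p = parts[name]
        production = p["synthesis"]
        if p["repressed_by"] is not None:
            r = state[p["repressed_by"]]
            production *= 1.0 / (1.0 + (r / p["K"]) ** p["n"])  # Hill-type repression
        return production - p["degradation"] * state[name]

    def simulate(t_end=20.0, dt=0.01):
        """Simple forward-Euler integration of the assembled circuit."""
        state = {name: 0.0 for name in parts}
        for _ in range(int(t_end / dt)):
            derivs = {name: rate(name, state) for name in parts}
            for name in parts:
                state[name] += dt * derivs[name]
        return state

    if __name__ == "__main__":
        print(simulate())  # R settles near 10; G is strongly repressed below its unrepressed level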

  1. An Analysis of 27 Years of Research into Computer Education Published in Australian Educational Computing

    Science.gov (United States)

    Zagami, Jason

    2015-01-01

    Analysis of three decades of publications in Australian Educational Computing (AEC) provides insight into the historical trends in Australian educational computing, highlighting an emphasis on pedagogy, comparatively few articles on educational technologies, and strong research topic alignment with similar international journals. Analysis confirms…

  2. Using biological control research in the classroom to promote scientific inquiry and literacy

    Science.gov (United States)

    Many scientists who research biological control also teach at universities or more informally through cooperative outreach. The purpose of this paper is to review biological control activities for the classroom in four refereed journals, The American Biology Teacher, Journal of Biological Education...

  3. Impact of Interdisciplinary Undergraduate Research in Mathematics and Biology on the Development of a New Course Integrating Five STEM Disciplines

    OpenAIRE

    Caudill, Lester; Hill, April; Hoke, Kathy; Lipan, Ovidiu

    2010-01-01

    Funded by innovative programs at the National Science Foundation and the Howard Hughes Medical Institute, University of Richmond faculty in biology, chemistry, mathematics, physics, and computer science teamed up to offer first- and second-year students the opportunity to contribute to vibrant, interdisciplinary research projects. The result was not only good science but also good science that motivated and informed course development. Here, we describe four recent undergraduate research proj...

  4. Measuring Impact of EPAs Computational Toxicology Research (BOSC)

    Science.gov (United States)

    Computational Toxicology (CompTox) research at the EPA was initiated in 2005. Since 2005, CompTox research efforts have made tremendous advances in developing new approaches to evaluate thousands of chemicals for potential health effects. The purpose of this case study is to trac...

  5. Stochastic processes, multiscale modeling, and numerical methods for computational cellular biology

    CERN Document Server

    2017-01-01

    This book focuses on the modeling and mathematical analysis of stochastic dynamical systems along with their simulations. The collected chapters will review fundamental and current topics and approaches to dynamical systems in cellular biology. This text aims to develop improved mathematical and computational methods with which to study biological processes. At the scale of a single cell, stochasticity becomes important due to low copy numbers of biological molecules, such as mRNA and proteins, that take part in biochemical reactions driving cellular processes. When trying to describe such biological processes, the traditional deterministic models are often inadequate, precisely because of these low copy numbers. This book presents stochastic models, which are necessary to account for small particle numbers and extrinsic noise sources. The complexity of these models depends upon whether the biochemical reactions are diffusion-limited or reaction-limited. In the former case, one needs to adopt the framework of s...
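
    For readers unfamiliar with the stochastic framework the book builds on, the sketch below (not taken from the book) implements Gillespie's stochastic simulation algorithm for a minimal birth-death process, e.g. production and degradation of a single mRNA species; the rate constants and species are illustrative assumptions.

    import math
    import random

    def gillespie_birth_death(k_prod=2.0, k_deg=0.1, x0=0, t_end=100.0, seed=1):
        """Gillespie SSA for one species: production at rate k_prod, degradation at rate k_deg * x."""
        random.seed(seed)
        t, x = 0.0, x0
        times, counts = [t], [x]
        while t < t_end:
            a_prod = k_prod            # propensity of production
            a_deg = k_deg * x          # propensity of degradation
            a_total = a_prod + a_deg
            if a_total == 0.0:
                break
            t += random.expovariate(a_total)        # waiting time to the next reaction
            if random.random() * a_total < a_prod:  # choose which reaction fires
                x += 1
            else:
                x -= 1
            times.append(t)
            counts.append(x)
        return times, counts

    if __name__ == "__main__":
        times, counts = gillespie_birth_death()
        print("final copy number:", counts[-1])  # fluctuates around k_prod / k_deg = 20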

  6. Using Biological-Control Research in the Classroom to Promote Scientific Inquiry & Literacy

    Science.gov (United States)

    Richardson, Matthew L.; Richardson, Scott L.; Hall, David G.

    2012-01-01

    Scientists researching biological control should engage in education because translating research programs into classroom activities is a pathway to increase scientific literacy among students. Classroom activities focused on biological control target all levels of biological organization and can be cross-disciplinary by drawing from subject areas…

  7. Computational chemistry in pharmaceutical research: at the crossroads.

    Science.gov (United States)

    Bajorath, Jürgen

    2012-01-01

    Computational approaches are an integral part of pharmaceutical research. However, there are many unsolved key questions that limit scientific progress in the still-evolving computational field and its impact on drug discovery. Importantly, a number of these questions are not new but date back many years. Hence, it might be difficult to conclusively answer them in the foreseeable future. Moreover, the computational field as a whole is characterized by a high degree of heterogeneity and so, unfortunately, is the quality of its scientific output. In light of this situation, it is proposed that changes in scientific standards and culture should be seriously considered now in order to lay a foundation for future progress in computational research.

  8. Implications of Plasmodium vivax Biology for Control, Elimination, and Research

    Science.gov (United States)

    Olliaro, Piero L.; Barnwell, John W.; Barry, Alyssa; Mendis, Kamini; Mueller, Ivo; Reeder, John C.; Shanks, G. Dennis; Snounou, Georges; Wongsrichanalai, Chansuda

    2016-01-01

    This paper summarizes our current understanding of the biology of Plasmodium vivax, how it differs from Plasmodium falciparum, and how these differences explain the need for P. vivax-tailored interventions. The article further pinpoints knowledge gaps where investments in research are needed to help identify and develop such specific interventions. The principal obstacles to reducing and eventually eliminating P. vivax reside in 1) its higher vectorial capacity compared with P. falciparum due to its ability to develop at lower temperature and over a shorter sporogonic cycle in the vector, allowing transmission in temperate zones and making it less sensitive to vector control measures that are otherwise effective on P. falciparum; 2) the presence of dormant liver forms (hypnozoites), sustaining multiple relapsing episodes from a single infectious bite that cannot be diagnosed and are not susceptible to any available antimalarial except primaquine, with routine deployment restricted by toxicity; 3) low parasite densities, which are difficult to detect with current diagnostics leading to missed diagnoses and delayed treatments (and protracted transmission), coupled with 4) transmission stages (gametocytes) occurring early in acute infections, before infection is diagnosed. PMID:27799636

  9. Dynamic models in research and management of biological invasions.

    Science.gov (United States)

    Buchadas, Ana; Vaz, Ana Sofia; Honrado, João P; Alagador, Diogo; Bastos, Rita; Cabral, João A; Santos, Mário; Vicente, Joana R

    2017-07-01

    Invasive species are increasing in number, extent and impact worldwide. Effective invasion management has thus become a core socio-ecological challenge. To tackle this challenge, integrating spatial-temporal dynamics of invasion processes with modelling approaches is a promising approach. The inclusion of dynamic processes in such modelling frameworks (i.e. dynamic or hybrid models, here defined as models that integrate both dynamic and static approaches) adds an explicit temporal dimension to the study and management of invasions, enabling the prediction of invasions and optimisation of multi-scale management and governance. However, the extent to which dynamic approaches have been used for that purpose is under-investigated. Based on a literature review, we examined the extent to which dynamic modelling has been used to address invasions worldwide. We then evaluated how the use of dynamic modelling has evolved through time in the scope of invasive species management. The results suggest that modelling, in particular dynamic modelling, has been increasingly applied to biological invasions, especially to support management decisions at local scales. Also, the combination of dynamic and static modelling approaches (hybrid models with a spatially explicit output) can be especially effective, not only to support management at early invasion stages (from prevention to early detection), but also to improve the monitoring of invasion processes and impact assessment. Further development and testing of such hybrid models may well be regarded as a priority for future research aiming to improve the management of invasions across scales. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. BRIC-100VC Biological Research in Canisters (BRIC)-100VC

    Science.gov (United States)

    Richards, Stephanie E.; Levine, Howard G. (Compiler); Romero, Vergel

    2016-01-01

    The Biological Research in Canisters (BRIC) is an anodized-aluminum cylinder used to provide passive stowage for investigations of the effects of space flight on small specimens. The BRIC 100 mm petri dish vacuum containment unit (BRIC-100VC) has supported Dugesia japonica (flatworm) within spring water under normal atmospheric conditions for 29 days in space and Hemerocallis lilioasphodelus L. (daylily) somatic embryo development within a 5% CO2 gaseous environment for 4.5 months in space. BRIC-100VC is a completely sealed, anodized-aluminum cylinder (Fig. 1) providing containment and structural support of the experimental specimens. The top and bottom lids of the canister include rapid disconnect valves for filling the canister with selected gases. These specialized valves allow for specific atmospheric containment within the canister, providing a gaseous environment defined by the investigator. Additionally, the top lid has been designed with a toggle latch and O-ring assembly allowing for prompt sealing and removal of the lid. The outside dimensions of the BRIC-100VC canisters are 16.0 cm (height) x 11.4 cm (outside diameter). The lower portion of the canister has been equipped with sufficient storage space for passive temperature and relative humidity data loggers. The BRIC-100VC canister has been optimized to accommodate standard 100 mm laboratory petri dishes or 50 mL conical tubes. Depending on storage orientation, up to 6 or 9 canisters have been flown within an International Space Station (ISS) stowage locker.

  11. Implications of Plasmodium vivax Biology for Control, Elimination, and Research.

    Science.gov (United States)

    Olliaro, Piero L; Barnwell, John W; Barry, Alyssa; Mendis, Kamini; Mueller, Ivo; Reeder, John C; Shanks, G Dennis; Snounou, Georges; Wongsrichanalai, Chansuda

    2016-12-28

    This paper summarizes our current understanding of the biology of Plasmodium vivax, how it differs from Plasmodium falciparum, and how these differences explain the need for P. vivax-tailored interventions. The article further pinpoints knowledge gaps where investments in research are needed to help identify and develop such specific interventions. The principal obstacles to reducing and eventually eliminating P. vivax reside in 1) its higher vectorial capacity compared with P. falciparum due to its ability to develop at lower temperature and over a shorter sporogonic cycle in the vector, allowing transmission in temperate zones and making it less sensitive to vector control measures that are otherwise effective on P. falciparum; 2) the presence of dormant liver forms (hypnozoites), sustaining multiple relapsing episodes from a single infectious bite that cannot be diagnosed and are not susceptible to any available antimalarial except primaquine, with routine deployment restricted by toxicity; 3) low parasite densities, which are difficult to detect with current diagnostics leading to missed diagnoses and delayed treatments (and protracted transmission), coupled with 4) transmission stages (gametocytes) occurring early in acute infections, before infection is diagnosed. © The American Society of Tropical Medicine and Hygiene.

  12. From Levy to Brownian: a computational model based on biological fluctuation.

    Directory of Open Access Journals (Sweden)

    Surya G Nurzaman

    Full Text Available BACKGROUND: Theoretical studies predict that Lévy walks maximize the chance of encountering randomly distributed targets at low density, whereas Brownian walks are favorable inside a patch of targets at high density. Recently, experimental data have reported that some animals indeed show Lévy and Brownian walk movement patterns when foraging for food in areas of low and high density, respectively. This paper presents a simple computational model utilizing Gaussian noise that can realize such behavior. METHODOLOGY/PRINCIPAL FINDINGS: We extend the Lévy walk model of one of the simplest creatures, Escherichia coli, based on the biological fluctuation framework. We build a simulation of a simple, generic animal to observe whether Lévy or Brownian walks are performed appropriately depending on the target density, and investigate the emergent behavior in a commonly faced patchy environment where the density alternates. CONCLUSIONS/SIGNIFICANCE: Based on the model, the animal behavior of choosing Lévy or Brownian walk movement patterns according to target density can be generated without changing the essence of the stochastic property of the Escherichia coli physiological mechanism as explained in related research. The emergent behavior and its benefits in a patchy environment are also discussed. The model provides a framework for further investigation of the role of internal noise in realizing adaptive and efficient foraging behavior.

  13. From Lévy to Brownian: a computational model based on biological fluctuation.

    Science.gov (United States)

    Nurzaman, Surya G; Matsumoto, Yoshio; Nakamura, Yutaka; Shirai, Kazumichi; Koizumi, Satoshi; Ishiguro, Hiroshi

    2011-02-03

    Theoretical studies predict that Lévy walks maximize the chance of encountering randomly distributed targets at low density, whereas Brownian walks are favorable inside a patch of targets at high density. Recently, experimental data have reported that some animals indeed show Lévy and Brownian walk movement patterns when foraging for food in areas of low and high density, respectively. This paper presents a simple computational model utilizing Gaussian noise that can realize such behavior. We extend the Lévy walk model of one of the simplest creatures, Escherichia coli, based on the biological fluctuation framework. We build a simulation of a simple, generic animal to observe whether Lévy or Brownian walks are performed appropriately depending on the target density, and investigate the emergent behavior in a commonly faced patchy environment where the density alternates. Based on the model, the animal behavior of choosing Lévy or Brownian walk movement patterns according to target density can be generated without changing the essence of the stochastic property of the Escherichia coli physiological mechanism as explained in related research. The emergent behavior and its benefits in a patchy environment are also discussed. The model provides a framework for further investigation of the role of internal noise in realizing adaptive and efficient foraging behavior.
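
    The following sketch is only a schematic illustration of the idea described above, not the authors' E. coli-based fluctuation model: a 2D walker draws heavy-tailed (Lévy-like) step lengths when the local target density is low and Gaussian (Brownian-like) step lengths when it is high. The density threshold, power-law exponent, and scales are illustrative assumptions.

    import math
    import random

    def step_length(local_density, mu=2.0, scale=1.0):
        """Heavy-tailed (Levy-like, exponent mu) step when density is low; Gaussian otherwise."""
        if local_density < 0.5:
            # Pareto-distributed step length, truncated for numerical safety
            return min(scale * random.paretovariate(mu - 1.0), 100.0 * scale)
        return abs(random.gauss(0.0, scale))

    def walk(n_steps=1000, density_fn=lambda x, y: 0.0, seed=0):
        """2D walk whose step-length statistics depend on the local target density."""
        random.seed(seed)
        x = y = 0.0
        path = [(x, y)]
        for _ in range(n_steps):
            theta = random.uniform(0.0, 2.0 * math.pi)  # isotropic turning angle
            r = step_length(density_fn(x, y))
            x, y = x + r * math.cos(theta), y + r * math.sin(theta)
            path.append((x, y))
        return path

    if __name__ == "__main__":
        sparse = walk(density_fn=lambda x, y: 0.0)  # Levy-like search in a sparse environment
        dense = walk(density_fn=lambda x, y: 1.0)   # Brownian-like search inside a dense patch
        print(len(sparse), len(dense))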

  14. The nature and use of prediction skills in a biological computer simulation

    Science.gov (United States)

    Lavoie, Derrick R.; Good, Ron

    The primary goal of this study was to examine the science process skill of prediction using qualitative research methodology. The think-aloud interview, modeled after Ericsson and Simon (1984), led to the identification of 63 program exploration and prediction behaviors. The performance of seven formal and seven concrete operational high-school biology students was videotaped during a three-phase learning sequence on water pollution. Subjects explored the effects of five independent variables on two dependent variables over time using a computer-simulation program. Predictions were made concerning the effect of the independent variables upon the dependent variables through time. Subjects were identified according to initial knowledge of the subject matter and success at solving three selected prediction problems. Successful predictors generally had high initial knowledge of the subject matter and were formal operational. Unsuccessful predictors generally had low initial knowledge and were concrete operational. High initial knowledge seemed to be more important to predictive success than stage of Piagetian cognitive development. Successful prediction behaviors involved systematic manipulation of the independent variables, note taking, identification and use of appropriate independent-dependent variable relationships, high interest and motivation, and, in general, higher-level thinking skills. Behaviors characteristic of unsuccessful predictors were nonsystematic manipulation of independent variables, lack of motivation and persistence, misconceptions, and the identification and use of inappropriate independent-dependent variable relationships.

  15. Community-driven development for computational biology at Sprints, Hackathons and Codefests.

    Science.gov (United States)

    Möller, Steffen; Afgan, Enis; Banck, Michael; Bonnal, Raoul J P; Booth, Timothy; Chilton, John; Cock, Peter J A; Gumbel, Markus; Harris, Nomi; Holland, Richard; Kalaš, Matúš; Kaján, László; Kibukawa, Eri; Powel, David R; Prins, Pjotr; Quinn, Jacqueline; Sallou, Olivier; Strozzi, Francesco; Seemann, Torsten; Sloggett, Clare; Soiland-Reyes, Stian; Spooner, William; Steinbiss, Sascha; Tille, Andreas; Travis, Anthony J; Guimera, Roman; Katayama, Toshiaki; Chapman, Brad A

    2014-01-01

    Computational biology comprises a wide range of technologies and approaches. Multiple technologies can be combined to create more powerful workflows if the individuals contributing the data or providing tools for its interpretation can find mutual understanding and consensus. Much conversation and joint investigation are required in order to identify and implement the best approaches. Traditionally, scientific conferences feature talks presenting novel technologies or insights, followed up by informal discussions during coffee breaks. In multi-institution collaborations, in order to reach agreement on implementation details or to transfer deeper insights in a technology and practical skills, a representative of one group typically visits the other. However, this does not scale well when the number of technologies or research groups is large. Conferences have responded to this issue by introducing Birds-of-a-Feather (BoF) sessions, which offer an opportunity for individuals with common interests to intensify their interaction. However, parallel BoF sessions often make it hard for participants to join multiple BoFs and find common ground between the different technologies, and BoFs are generally too short to allow time for participants to program together. This report summarises our experience with computational biology Codefests, Hackathons and Sprints, which are interactive developer meetings. They are structured to reduce the limitations of traditional scientific meetings described above by strengthening the interaction among peers and letting the participants determine the schedule and topics. These meetings are commonly run as loosely scheduled "unconferences" (self-organized identification of participants and topics for meetings) over at least two days, with early introductory talks to welcome and organize contributors, followed by intensive collaborative coding sessions. We summarise some prominent achievements of those meetings and describe differences in how

  16. Quantum Biology at the Cellular Level - elements of the research program

    OpenAIRE

    Bordonaro, Michael; Ogryzko, Vasily

    2013-01-01

    Quantum Biology is emerging as a new field at the intersection between fundamental physics and biology, promising novel insights into the nature and origin of biological order. We discuss several elements of QBCL (Quantum Biology at Cellular Level), a research program designed to extend the reach of quantum concepts to higher than molecular levels of biological organization. Key words. decoherence, macroscopic superpositions, basis-dependence, formal superposition, non-classical correlations,...

  17. Ethical Guidelines for Computer Security Researchers: "Be Reasonable"

    Science.gov (United States)

    Sassaman, Len

    For most of its existence, the field of computer science has been lucky enough to avoid ethical dilemmas by virtue of its relatively benign nature. The subdisciplines of programming methodology research, microprocessor design, and so forth have little room for the greater questions of human harm. Other, more recently developed sub-disciplines, such as data mining, social network analysis, behavioral profiling, and general computer security, however, open the door to abuse of users by practitioners and researchers. It is therefore the duty of the men and women who chart the course of these fields to set rules for themselves regarding what sorts of actions on their part are to be considered acceptable and what should be avoided or handled with caution out of ethical concerns. This paper deals solely with the issues faced by computer security researchers, be they vulnerability analysts, privacy system designers, malware experts, or reverse engineers.

  18. A Survey of Comics Research in Computer Science

    Directory of Open Access Journals (Sweden)

    Olivier Augereau

    2018-06-01

    Full Text Available Graphic novels such as comic books and mangas are well known all over the world. The digital transition has started to change the way people read comics: more and more on smartphones and tablets, and less and less on paper. In recent years, a wide variety of research about comics has been proposed and might change the way comics are created, distributed and read in the future. Early work focuses on low-level document image analysis. Comic books are complex; they contain text, drawings, balloons, panels, onomatopoeia, etc. Different fields of computer science, such as multimedia, artificial intelligence and human–computer interaction, have covered research about user interaction and content generation, each with different sets of values. We review previous research about comics in computer science to state what has been done and give some insights about the main outlooks.

  19. The progress of molecular biology in radiation research

    International Nuclear Information System (INIS)

    Wei Kang

    1989-01-01

    The recent progress in the application of molecular biology techniques to the study of radiation biology is reviewed. The three sections are as follows: (1) the study of DNA damage at the molecular level; (2) the molecular mechanisms of radiation cell genetics, including chromosome aberration and cell mutation; and (3) the study of DNA repair genes with DNA-mediated gene transfer techniques.

  20. Large scale computing in the Energy Research Programs

    International Nuclear Information System (INIS)

    1991-05-01

    The Energy Research Supercomputer Users Group (ERSUG) comprises all investigators using resources of the Department of Energy Office of Energy Research supercomputers. At the December 1989 meeting held at Florida State University (FSU), the ERSUG executive committee determined that the continuing rapid advances in computational sciences and computer technology demanded a reassessment of the role computational science should play in meeting DOE's commitments. Initial studies were to be performed for four subdivisions: (1) Basic Energy Sciences (BES) and Applied Mathematical Sciences (AMS), (2) Fusion Energy, (3) High Energy and Nuclear Physics, and (4) Health and Environmental Research. The first two subgroups produced formal subreports that provided a basis for several sections of this report. Additional information provided in the AMS/BES is included as Appendix C in an abridged form that eliminates most duplication. Additionally, each member of the executive committee was asked to contribute area-specific assessments; these assessments are included in the next section. In the following sections, brief assessments are given for specific areas, a conceptual model is proposed in which the entire computational effort for energy research is best viewed as one giant nation-wide computer, and then specific recommendations are made for the appropriate evolution of the system

  1. Biology panel: coming to a clinic near you. Translational research in radiation biology

    International Nuclear Information System (INIS)

    Travis, Elizabeth L.; Thames, Howard D.

    1996-01-01

    The explosion of knowledge in molecular biology coupled with the rapid and continuing development of molecular techniques allow a new level of research in radiation biology aimed at understanding the processes that govern radiation damage and response in both tumors and normal tissues. The challenge to radiation biologists and radiation oncologists is to use this knowledge to improve the therapeutic ratio in the management of human tumors by rapidly translating these new findings into clinical practice. This panel will focus on both sides of the therapeutic ratio coin, the manipulation of tumor control by manipulating the processes that control cell cycle regulation and apoptosis, and the reduction of normal tissue morbidity by applying the emerging information on the genetic basis of radiosensitivity. Apoptosis is a form of cell death believed to represent a minor component of the clinical effects of radiation. However, if apoptosis is regulated by anti-apoptotic mechanisms, then it may be possible to produce a pro-apoptotic phenotype in the tumor cell population by modulating the balance between pro- and anti-apoptotic mechanisms by pharmacological intervention. Thus signaling-based apoptosis therapy, designed to overcome the relative resistance to radiation-induced apoptosis, may improve the therapeutic ratio in the management of human tumors. The explosion of information concerning cell cycle regulation in both normal and tumor cells has provided the opportunity for insights into the mechanism of action of chemotherapeutic agents that can act as radiosensitizers. The second talk will explore the hypothesis that the dysregulation of cell cycle checkpoints in some cancers can be exploited to improve the therapeutic index of radiation sensitizers, specifically the fluoropyrimidines which appear to act at the G1/S transition. Finally, efforts to increase tumor control will be translated into clinical practice only if such treatments do not increase the complication

  2. Computational Science Research in Support of Petascale Electromagnetic Modeling

    International Nuclear Information System (INIS)

    Lee, L.-Q.

    2008-01-01

    Computational science research components were vital parts of the SciDAC-1 accelerator project and are continuing to play a critical role in newly-funded SciDAC-2 accelerator project, the Community Petascale Project for Accelerator Science and Simulation (ComPASS). Recent advances and achievements in the area of computational science research in support of petascale electromagnetic modeling for accelerator design analysis are presented, which include shape determination of superconducting RF cavities, mesh-based multilevel preconditioner in solving highly-indefinite linear systems, moving window using h- or p- refinement for time-domain short-range wakefield calculations, and improved scalable application I/O

  3. Large Scale Computing and Storage Requirements for Nuclear Physics Research

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, Richard A.; Wasserman, Harvey J.

    2012-03-02

    The National Energy Research Scientific Computing Center (NERSC) is the primary computing center for the DOE Office of Science, serving approximately 4,000 users and hosting some 550 projects that involve nearly 700 codes for a wide variety of scientific disciplines. In addition to large-scale computing resources, NERSC provides critical staff support and expertise to help scientists make the most efficient use of these resources to advance the scientific mission of the Office of Science. In May 2011, NERSC, DOE’s Office of Advanced Scientific Computing Research (ASCR) and DOE’s Office of Nuclear Physics (NP) held a workshop to characterize HPC requirements for NP research over the next three to five years. The effort is part of NERSC’s continuing involvement in anticipating future user needs and deploying necessary resources to meet these demands. The workshop revealed several key requirements, in addition to achieving its goal of characterizing NP computing. The key requirements include: 1. Larger allocations of computational resources at NERSC; 2. Visualization and analytics support; and 3. Support at NERSC for the unique needs of experimental nuclear physicists. This report expands upon these key points and adds others. The results are based upon representative samples, called “case studies,” of the needs of science teams within NP. The case studies were prepared by NP workshop participants and contain a summary of science goals, methods of solution, current and future computing requirements, and special software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, “multi-core” environment that is expected to dominate HPC architectures over the next few years. The report also includes a section with NERSC responses to the workshop findings. NERSC has many initiatives already underway that address key workshop findings and all of the action items are aligned with NERSC strategic plans.

  4. BrisSynBio: a BBSRC/EPSRC-funded Synthetic Biology Research Centre.

    Science.gov (United States)

    Sedgley, Kathleen R; Race, Paul R; Woolfson, Derek N

    2016-06-15

    BrisSynBio is the Bristol-based Biotechnology and Biological Sciences Research Council (BBSRC)/Engineering and Physical Sciences Research Council (EPSRC)-funded Synthetic Biology Research Centre. It is one of six such Centres in the U.K. BrisSynBio's emphasis is on rational and predictive biomolecular modelling, design and engineering in the context of synthetic biology. It trains the next generation of synthetic biologists in these approaches, to facilitate translation of fundamental synthetic biology research to industry and the clinic, and to do this within an innovative and responsible research framework. © 2016 The Author(s).

  5. Chemical and biological warfare. Should defenses be researched and deployed?

    Science.gov (United States)

    Orient, J M

    1989-08-04

    The threat of chemical and biological weapons of mass destruction has intensified because of improved delivery systems and advances in chemistry, genetics, and other sciences. Possible US responses to this threat include deterrence, defenses, and/or disarmament, including a reaffirmation of the Biological and Toxin Weapons Convention of 1972, which is now in jeopardy. This article discusses the history of chemical and biological warfare, existing and potential weapons, the proliferation of weapons and delivery systems, ways to prevent the use of these weapons, and ways to protect populations from their effects.

  6. Neuromorphic Computing – From Materials Research to Systems Architecture Roundtable

    Energy Technology Data Exchange (ETDEWEB)

    Schuller, Ivan K. [Univ. of California, San Diego, CA (United States); Stevens, Rick [Argonne National Lab. (ANL), Argonne, IL (United States); Univ. of Chicago, IL (United States); Pino, Robinson [Dept. of Energy (DOE) Office of Science, Washington, DC (United States); Pechan, Michael [Dept. of Energy (DOE) Office of Science, Washington, DC (United States)

    2015-10-29

    Computation in its many forms is the engine that fuels our modern civilization. Modern computation—based on the von Neumann architecture—has allowed, until now, the development of continuous improvements, as predicted by Moore’s law. However, computation using current architectures and materials will inevitably—within the next 10 years—reach a limit because of fundamental scientific reasons. DOE convened a roundtable of experts in neuromorphic computing systems, materials science, and computer science in Washington on October 29-30, 2015 to address the following basic questions: Can brain-like (“neuromorphic”) computing devices based on new material concepts and systems be developed to dramatically outperform conventional CMOS based technology? If so, what are the basic research challenges for materials science and computing? The overarching answer that emerged was: The development of novel functional materials and devices incorporated into unique architectures will allow a revolutionary technological leap toward the implementation of a fully “neuromorphic” computer. To address this challenge, the following issues were considered: (1) the main differences between neuromorphic and conventional computing as related to signaling models, timing/clock, non-volatile memory, architecture, fault tolerance, integrated memory and compute, noise tolerance, analog vs. digital, and in situ learning; (2) new neuromorphic architectures needed to produce lower energy consumption, potential novel nanostructured materials, and enhanced computation; (3) device and materials properties needed to implement functions such as hysteresis, stability, and fault tolerance; and (4) comparisons of different implementations—spin torque, memristors, resistive switching, phase change, and optical schemes—for enhanced breakthroughs in performance, cost, fault tolerance, and/or manufacturability.

  7. Defining Biological Networks for Noise Buffering and Signaling Sensitivity Using Approximate Bayesian Computation

    Directory of Open Access Journals (Sweden)

    Shuqiang Wang

    2014-01-01

    Full Text Available Reliable information processing in cells requires high sensitivity to changes in the input signal but low sensitivity to random fluctuations in the transmitted signal. There are often many alternative biological circuits qualifying for this biological function. Distinguishing these biological models and finding the most suitable one are essential, as such model ranking, by experimental evidence, will help to judge the support of the working hypotheses forming each model. Here, we employ the approximate Bayesian computation (ABC) method based on sequential Monte Carlo (SMC) to search for biological circuits that can maintain signaling sensitivity while minimizing noise propagation, focusing on cases where the noise is characterized by rapid fluctuations. By systematically analyzing three-component circuits, we rank these biological circuits and identify three basic biological motifs that buffer noise while maintaining sensitivity to long-term changes in input signals. We discuss in detail a particular implementation in the control of nutrient homeostasis in yeast. The principal component analysis of the posterior provides insight into the nature of the reaction between nodes.
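
    To make the model-ranking idea concrete, the sketch below shows plain ABC rejection sampling (the paper itself uses the more efficient ABC-SMC): candidate circuit models and parameters are drawn from a prior, simulated, and accepted when the simulated summary statistic falls within a tolerance of the observation; models are then ranked by their share of accepted draws. The two toy circuit models, priors, and tolerance are illustrative assumptions, not the circuits studied in the paper.

    import random

    def simulate(model, params):
        """Toy stand-in for simulating a candidate circuit; returns a summary statistic
        (e.g. output variability under a noisy input). Both models are illustrative."""
        gain, noise = params
        if model == "negative_feedback":
            return noise / (1.0 + gain) + random.gauss(0.0, 0.01)  # feedback attenuates noise
        return noise + random.gauss(0.0, 0.01)                     # open-loop circuit

    def abc_rejection(observed, models, n_draws=10000, eps=0.02, seed=0):
        """Plain ABC rejection: accept (model, parameters) draws whose simulated summary
        statistic lies within eps of the observation; rank models by accepted share."""
        random.seed(seed)
        accepted = {m: 0 for m in models}
        for _ in range(n_draws):
            m = random.choice(models)
            params = (random.uniform(0.0, 10.0), random.uniform(0.0, 1.0))  # uniform priors
            if abs(simulate(m, params) - observed) < eps:
                accepted[m] += 1
        total = sum(accepted.values()) or 1
        return {m: accepted[m] / total for m in models}

    if __name__ == "__main__":
        # Data showing strong noise buffering should favour the feedback circuit.
        print(abc_rejection(observed=0.05, models=["negative_feedback", "open_loop"]))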

  8. The transhumanism of Ray Kurzweil. Is biological ontology reducible to computation?

    Directory of Open Access Journals (Sweden)

    Javier Monserrat

    2016-02-01

    Full Text Available Computer programs, primarily in machine vision engineering and the programming of somatic sensors, have already made it possible to build highly sophisticated androids or cyborgs, and will do so ever more effectively in the future. These machines will collaborate with humans and prompt new moral reflection on respecting the ontological dignity of the new humanoid machines. In addition, both humans and the new androids will be connected to huge external computer networks that will raise efficiency in the domain of the body and nature to almost incredible levels. However, our current scientific knowledge about, on the one hand, the hardware and software that will support both the humanoid machines and the external computer networks, built with existing engineering (and also the foreseeable medium- and even long-term engineering), and, on the other hand, the neural-biological structures from which animal and human behavior and the psychic system arise, allows us to establish that there is no scientific basis for claiming an ontological identity between computational machines and man. Accordingly, different ontologies (computational machines and biological entities) will produce different functional systems. There may be simulation, but never ontological identity. These ideas are essential for assessing the transhumanism of Ray Kurzweil.

  9. A Research Roadmap for Computation-Based Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  10. A Research Roadmap for Computation-Based Human Reliability Analysis

    International Nuclear Information System (INIS)

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey; Smith, Curtis; Groth, Katrina

    2015-01-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  11. Periodicity computation of generalized mathematical biology problems involving delay differential equations.

    Science.gov (United States)

    Jasim Mohammed, M; Ibrahim, Rabha W; Ahmad, M Z

    2017-03-01

    In this paper, we consider a low-initial-population model. Our aim is to study the periodicity computation of this model by using neutral differential equations, which are recognized in various studies, including in biology. We generalize the neutral Rayleigh equation to the third order by exploiting the formalism of fractional calculus, in particular the Riemann-Liouville differential operator. We establish the existence and uniqueness of a periodic computational outcome. The technique depends on the continuation theorem of coincidence degree theory. In addition, an example is presented to demonstrate the finding.
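
    For reference, the Riemann-Liouville differential operator invoked in the abstract has the standard textbook definition given below in LaTeX (this definition is general background and is not reproduced from the paper). For an order \alpha with n - 1 < \alpha \le n:

    % Riemann-Liouville fractional derivative of order \alpha on the interval (a, t)
    D^{\alpha}_{a} f(t) \;=\; \frac{1}{\Gamma(n-\alpha)} \,
        \frac{d^{n}}{dt^{n}} \int_{a}^{t} (t-s)^{\,n-\alpha-1}\, f(s)\, ds .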

  12. G‐LoSA: An efficient computational tool for local structure‐centric biological studies and drug design

    Science.gov (United States)

    2016-01-01

    Abstract Molecular recognition by proteins mostly occurs in a local region on the protein surface. Thus, an efficient computational method for accurate characterization of protein local structural conservation is necessary to better understand biology and aid drug design. We present a novel local structure alignment tool, G-LoSA. G-LoSA aligns protein local structures in a sequence-order-independent way and provides a GA-score, a chemical feature-based and size-independent structure similarity score. Our benchmark validation shows the robust performance of G-LoSA on local structures of diverse sizes and characteristics, demonstrating its universal applicability to local structure-centric comparative biology studies. In particular, G-LoSA is highly effective in detecting conserved local regions on the entire surface of a given protein. In addition, the applications of G-LoSA to identifying template ligands and predicting ligand and protein binding sites illustrate its strong potential for computer-aided drug design. We hope that G-LoSA can be a useful computational method for exploring interesting biological problems through large-scale comparison of protein local structures and facilitating drug discovery research and development. G-LoSA is freely available to academic users at http://im.compbio.ku.edu/GLoSA/. PMID:26813336

  13. Tumor Biology and Immunology | Center for Cancer Research

    Science.gov (United States)

    Tumor Biology and Immunology The Comparative Brain Tumor Consortium is collaborating with National Center for Advanced Translational Sciences to complete whole exome sequencing on canine meningioma samples. Results will be published and made publicly available.

  14. [Research progress of mammalian synthetic biology in biomedical field].

    Science.gov (United States)

    Yang, Linfeng; Yin, Jianli; Wang, Meiyan; Ye, Haifeng

    2017-03-25

    Although still in its infancy, synthetic biology has achieved remarkable development and progress during the past decade. Synthetic biology applies engineering principles to design and construct gene circuits uploaded into living cells or organisms to perform novel or improved functions, and it has been widely used in many fields. In this review, we describe the recent advances of mammalian synthetic biology for the treatment of diseases. We introduce common tools and design principles of synthetic gene circuits, and then we demonstrate open-loop gene circuits induced by different trigger molecules used in disease diagnosis and closed-loop gene circuits used for biomedical applications. Finally, we discuss the perspectives and potential challenges of synthetic biology for clinical applications.

  15. Applications of neutron scattering in molecular biological research

    International Nuclear Information System (INIS)

    Nierhaus, K.H.

    1984-01-01

    The study of the molecular structure of biological materials by neutron scattering is described. As example the results of the study of the components of a ribosome of Escherichia coli are presented. (HSI) [de

  16. Biometry: the principles and practice of statistics in biological research

    National Research Council Canada - National Science Library

    Sokal, R.R; Rohlf, F.J

    1969-01-01

    In this introductory textbook, with its companion volume of tables, the authors provide a balanced presentation of statistical methodology for the descriptive, experimental, and analytical study of biological phenomena...

  17. Secure Encapsulation and Publication of Biological Services in the Cloud Computing Environment

    Science.gov (United States)

    Zhang, Weizhe; Wang, Xuehui; Lu, Bo; Kim, Tai-hoon

    2013-01-01

    Secure encapsulation and publication for bioinformatics software products based on web service are presented, and the basic function of biological information is realized in the cloud computing environment. In the encapsulation phase, the workflow and function of bioinformatics software are conducted, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. The functions such as remote user job submission and job status query are implemented by using the GRAM components. The services of bioinformatics software are published to remote users. Finally the basic prototype system of the biological cloud is achieved. PMID:24078906

  18. Secure Encapsulation and Publication of Biological Services in the Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Weizhe Zhang

    2013-01-01

    Full Text Available Secure encapsulation and publication for bioinformatics software products based on web service are presented, and the basic function of biological information is realized in the cloud computing environment. In the encapsulation phase, the workflow and function of bioinformatics software are conducted, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. The functions such as remote user job submission and job status query are implemented by using the GRAM components. The services of bioinformatics software are published to remote users. Finally the basic prototype system of the biological cloud is achieved.

  19. Computer Simulation and Data Analysis in Molecular Biology and Biophysics An Introduction Using R

    CERN Document Server

    Bloomfield, Victor

    2009-01-01

    This book provides an introduction, suitable for advanced undergraduates and beginning graduate students, to two important aspects of molecular biology and biophysics: computer simulation and data analysis. It introduces tools to enable readers to learn and use fundamental methods for constructing quantitative models of biological mechanisms, both deterministic and with some elements of randomness, including complex reaction equilibria and kinetics, population models, and regulation of metabolism and development; to understand how concepts of probability can help in explaining important features of DNA sequences; and to apply a useful set of statistical methods to analysis of experimental data from spectroscopic, genomic, and proteomic sources. These quantitative tools are implemented using the free, open source software program R. R provides an excellent environment for general numerical and statistical computing and graphics, with capabilities similar to Matlab®. Since R is increasingly used in bioinformat...

  20. Biological research work within the Association of the Government-Sponsored Research Institutions (AGF)

    International Nuclear Information System (INIS)

    1991-01-01

    Six of the thirteen government-sponsored research institutions in the Federal Republic of Germany carry out research work for the protection of the population against the harmful effects of ionizing radiation. Their activities in this field concentrate on the following four points of main interest: analysis of radiation-induced processes resulting in biological radiation injury; description and analysis of complex radiation effects on man; medical applications of ionizing radiation for diagnosis and therapy; concepts and methods for radiological protection. The work reported reviews the main problems encountered in the above-mentioned subject fields and presents examples of significant results, with illustrations. The original research papers and their authors are listed separately under the four points of main interest. (orig./MG) [de

  1. ADAM: analysis of discrete models of biological systems using computer algebra.

    Science.gov (United States)

    Hinkelmann, Franziska; Brandon, Madison; Guang, Bonny; McNeill, Rustin; Blekherman, Grigoriy; Veliz-Cuba, Alan; Laubenbacher, Reinhard

    2011-07-20

    Many biological systems are modeled qualitatively with discrete models, such as probabilistic Boolean networks, logical models, Petri nets, and agent-based models, to gain a better understanding of them. The computational complexity to analyze the complete dynamics of these models grows exponentially in the number of variables, which impedes working with complex models. There exist software tools to analyze discrete models, but they either lack the algorithmic functionality to analyze complex models deterministically or they are inaccessible to many users as they require understanding the underlying algorithm and implementation, do not have a graphical user interface, or are hard to install. Efficient analysis methods that are accessible to modelers and easy to use are needed. We propose a method for efficiently identifying attractors and introduce the web-based tool Analysis of Dynamic Algebraic Models (ADAM), which provides this and other analysis methods for discrete models. ADAM converts several discrete model types automatically into polynomial dynamical systems and analyzes their dynamics using tools from computer algebra. Specifically, we propose a method to identify attractors of a discrete model that is equivalent to solving a system of polynomial equations, a long-studied problem in computer algebra. Based on extensive experimentation with both discrete models arising in systems biology and randomly generated networks, we found that the algebraic algorithms presented in this manuscript are fast for systems with the structure maintained by most biological systems, namely sparseness and robustness. For a large set of published complex discrete models, ADAM identified the attractors in less than one second. Discrete modeling techniques are a useful tool for analyzing complex biological systems and there is a need in the biological community for accessible efficient analysis tools. ADAM provides analysis methods based on mathematical algorithms as a web
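
    As a minimal sketch of the underlying idea (not ADAM's implementation), the Python code below encodes a hypothetical three-node Boolean network as polynomial update rules over GF(2) and finds its steady states by solving f(x) = x, which for a toy network can be done by exhaustive enumeration. The example network is an assumption made for illustration.

    from itertools import product

    # Boolean update rules of a toy 3-node network written as polynomials over GF(2):
    # AND -> x*y, OR -> x + y + x*y, all arithmetic taken modulo 2.
    def f(x):
        x1, x2, x3 = x
        return (
            (x2 * x3) % 2,             # x1' = x2 AND x3
            (x1 + x3 + x1 * x3) % 2,   # x2' = x1 OR x3
            x1 % 2,                    # x3' = x1
        )

    def fixed_points():
        """Steady states are solutions of the polynomial system f(x) = x over GF(2);
        for a three-node toy network we simply enumerate all 2^3 states."""
        return [x for x in product((0, 1), repeat=3) if f(x) == x]

    if __name__ == "__main__":
        print("steady states:", fixed_points())  # (0, 0, 0) and (1, 1, 1) for this toy network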

  2. Computers, Laptops and Tools. ACER Research Monograph No. 56.

    Science.gov (United States)

    Ainley, Mary; Bourke, Valerie; Chatfield, Robert; Hillman, Kylie; Watkins, Ian

    In 1997, Balwyn High School (Australia) instituted a class of 28 Year 7 students to use laptop computers across the curriculum. This report details findings from an action research project that monitored important aspects of what happened when this program was introduced. A range of measures was developed to assess the influence of the use of…

  3. Results of a Research Evaluating Quality of Computer Science Education

    Science.gov (United States)

    Záhorec, Ján; Hašková, Alena; Munk, Michal

    2012-01-01

    The paper presents the results of an international research on a comparative assessment of the current status of computer science education at the secondary level (ISCED 3A) in Slovakia, the Czech Republic, and Belgium. Evaluation was carried out based on 14 specific factors gauging the students' point of view. The authors present qualitative…

  4. National Energy Research Scientific Computing Center 2007 Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Hules, John A.; Bashor, Jon; Wang, Ucilia; Yarris, Lynn; Preuss, Paul

    2008-10-23

    This report presents highlights of the research conducted on NERSC computers in a variety of scientific disciplines during the year 2007. It also reports on changes and upgrades to NERSC's systems and services as well as activities of NERSC staff.

  5. Areas of research in radiation chemistry fundamental to radiation biology

    International Nuclear Information System (INIS)

    Powers, E.L.

    1980-01-01

    Among all the environmental hazards to which man is exposed, ionizing radiation is the most thoroughly investigated and the most responsibly monitored and controlled. Nevertheless, because of the importance of radiation in modern society from both the hazard as well as the utilitarian standpoints, much more information concerning the biological effects induced and their modification and reversal is required. Together with radiation physics, an understanding of radiation chemistry is necessary for full appreciation of biological effects of high and low energy radiations, and for the development of prophylactic, therapeutic and potentiating methods and techniques in biological organisms. The necessity of understanding the chemistry of any system, biological or not, that is to be manipulated and controlled, is so obvious as to make trivial a statement to that effect. If any natural phenomenon is to be put to our use, surely the elements of it must be studied and appreciated fully. In the preliminary statements of the various panels of this general group, the need for additional information on the basic radiation chemistry concerned in radiation-induced biological effects pervades throughout

  6. A computer control system for a research reactor

    International Nuclear Information System (INIS)

    Crawford, K.C.; Sandquist, G.M.

    1987-01-01

    Most reactor applications until now have not required computer control of core output. Commercial reactors are generally operated at a constant power output to provide baseline power. However, if commercial reactor cores are to become load-following over a wide range, then centralized digital computer control is required to make the entire facility respond as a single unit to continual changes in power demand. Navy and research reactors are much smaller and simpler and are operated at constant power levels as required, without concern for the number of operators required to operate the facility. For navy reactors, centralized digital computer control may provide space savings and reduced personnel requirements. Computer control offers research reactors versatility to efficiently change a system to develop new ideas. The operation of any reactor facility would be enhanced by a controller that does not panic and is continually monitoring all facility parameters. Eventually, very sophisticated computer control systems may be developed which will sense operational problems, diagnose the problem, and depending on the severity of the problem, immediately activate safety systems or consult with operators before taking action

  7. Making Research Fly in Schools: "Drosophila" as a Powerful Modern Tool for Teaching Biology

    Science.gov (United States)

    Harbottle, Jennifer; Strangward, Patrick; Alnuamaani, Catherine; Lawes, Surita; Patel, Sanjai; Prokop, Andreas

    2016-01-01

    The "droso4schools" project aims to introduce the fruit fly "Drosophila" as a powerful modern teaching tool to convey curriculum-relevant specifications in biology lessons. Flies are easy and cheap to breed and have been at the forefront of biology research for a century, providing unique conceptual understanding of biology and…

  8. MOLNs: A cloud platform for interactive, reproducible, and scalable spatial stochastic computational experiments in systems biology using PyURDME.

    Science.gov (United States)

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2016-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments.
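
    As a minimal, self-contained illustration of the kind of computation this record describes (and explicitly not the PyURDME or MOLNs API, whose calls are not reproduced here), the sketch below runs a Gillespie-style stochastic simulation of diffusing, decaying molecules on a 1-D lattice of voxels; the voxel count, rates, and initial condition are arbitrary assumptions.

```python
# Minimal sketch: spatial stochastic (reaction-diffusion style) simulation
# via the Gillespie algorithm on a 1-D lattice. Illustrative only; this is
# NOT the PyURDME/MOLNs API. All parameters are arbitrary assumptions.
import numpy as np

rng = np.random.default_rng(0)

n_voxels = 20          # 1-D lattice size (assumed)
d_hop = 1.0            # diffusion hop rate per molecule per direction (assumed)
k_decay = 0.05         # first-order decay rate per molecule (assumed)
state = np.zeros(n_voxels, dtype=int)
state[0] = 100         # all molecules start in the leftmost voxel
t, t_end = 0.0, 50.0

while t < t_end and state.sum() > 0:
    # Propensities: hop right, hop left (reflecting boundaries), and decay.
    hop_right = d_hop * state[:-1]
    hop_left = d_hop * state[1:]
    decay = k_decay * state
    props = np.concatenate([hop_right, hop_left, decay])
    total = props.sum()
    t += rng.exponential(1.0 / total)          # time to next event
    event = rng.choice(props.size, p=props / total)
    if event < n_voxels - 1:                   # hop right from voxel `event`
        state[event] -= 1
        state[event + 1] += 1
    elif event < 2 * (n_voxels - 1):           # hop left from voxel i+1 to voxel i
        i = event - (n_voxels - 1)
        state[i + 1] -= 1
        state[i] += 1
    else:                                      # decay in the remaining voxel index
        state[event - 2 * (n_voxels - 1)] -= 1

print("final time:", round(t, 2), "molecules per voxel:", state)
```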

  9. Statistical Methodologies to Integrate Experimental and Computational Research

    Science.gov (United States)

    Parker, P. A.; Johnson, R. T.; Montgomery, D. C.

    2008-01-01

    Development of advanced algorithms for simulating engine flow paths requires the integration of fundamental experiments with the validation of enhanced mathematical models. In this paper, we provide an overview of statistical methods to strategically and efficiently conduct experiments and computational model refinement. Moreover, the integration of experimental and computational research efforts is emphasized. With a statistical engineering perspective, scientific and engineering expertise is combined with statistical sciences to gain deeper insights into experimental phenomena and code development performance, supporting the overall research objectives. The particular statistical methods discussed are design of experiments, response surface methodology, and uncertainty analysis and planning. Their application is illustrated with a coaxial free jet experiment and a turbulence model refinement investigation. Our goal is to provide an overview, focusing on concepts rather than practice, to demonstrate the benefits of using statistical methods in research and development, thereby encouraging their broader and more systematic application.
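
    As a small worked illustration of the response surface methodology named in this record (a sketch on synthetic two-factor data, not the coaxial free jet experiment itself), the snippet below fits a second-order response surface by ordinary least squares and locates its stationary point.

```python
# Sketch: fit a second-order response surface y = b0 + b1*x1 + b2*x2
# + b11*x1^2 + b22*x2^2 + b12*x1*x2 to synthetic data (assumed, for
# illustration only), then locate the stationary point of the fitted surface.
import numpy as np

rng = np.random.default_rng(1)

# Synthetic factorial-style design with noise (illustrative assumption).
x1, x2 = np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5))
x1, x2 = x1.ravel(), x2.ravel()
y = 10 - 2 * (x1 - 0.3) ** 2 - 3 * (x2 + 0.2) ** 2 + rng.normal(0, 0.1, x1.size)

# Design matrix for the full quadratic model.
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1, b2, b11, b22, b12 = beta

# Stationary point: solve grad = 0 for the fitted quadratic surface.
H = np.array([[2 * b11, b12], [b12, 2 * b22]])
xs = np.linalg.solve(H, -np.array([b1, b2]))
print("fitted coefficients:", np.round(beta, 3))
print("stationary point (x1, x2):", np.round(xs, 3))
```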

  10. Review of research on advanced computational science in FY2016

    International Nuclear Information System (INIS)

    2017-12-01

    Research on advanced computational science for nuclear applications, based on “Plan to Achieve Medium- to Long-term Objectives of the Japan Atomic Energy Agency (Medium- to Long-term Plan)”, has been performed at the Center for Computational Science and e-Systems (CCSE), Japan Atomic Energy Agency. CCSE established a committee consisting of outside experts and authorities, which evaluates the research and provides advice in support of the research and development. This report summarizes the following: (1) Results of the R and D performed at CCSE in FY 2016 (April 1st, 2016 - March 31st, 2017), (2) Results of the evaluation of the R and D by the committee in FY 2016. (author)

  11. Review of research on advanced computational science in FY2015

    International Nuclear Information System (INIS)

    2017-01-01

    Research on advanced computational science for nuclear applications, based on 'Plan to Achieve Medium- to Long-term Objectives of the Japan Atomic Energy Agency (Medium- to Long-term Plan)', has been performed at the Center for Computational Science and e-Systems (CCSE), Japan Atomic Energy Agency. CCSE established a committee consisting of outside experts and authorities, which evaluates the research and provides advice in support of the research and development. This report summarizes the following: (1) Results of the R and D performed at CCSE in FY 2015 (April 1st, 2015 - March 31st, 2016), (2) Results of the evaluation of the R and D by the committee in FY 2015 (April 1st, 2015 - March 31st, 2016). (author)

  12. Computing for magnetic fusion energy research: An updated vision

    International Nuclear Information System (INIS)

    Henline, P.; Giarrusso, J.; Davis, S.; Casper, T.

    1993-01-01

    This Fusion Computing Council perspective is written to present the primary concerns of the fusion computing community at the time of publication of the report, necessarily as a summary of the information contained in the individual sections. These concerns reflect FCC discussions during final review of contributions from the various working groups and portray our latest information. This report itself should be considered as dynamic, requiring periodic updating in an attempt to track the rapid evolution of the computer industry relevant to requirements for magnetic fusion research. The most significant common concern among the Fusion Computing Council working groups is networking capability. All groups see an increasing need for network services due to the use of workstations, distributed computing environments, increased use of graphic services, X-window usage, remote experimental collaborations, remote data access for specific projects and other collaborations. Other areas of concern include support for workstations, enhanced infrastructure to support collaborations, the User Service Centers, NERSC and future massively parallel computers, and FCC sponsored workshops

  13. The role of ontologies in biological and biomedical research: a functional perspective

    KAUST Repository

    Hoehndorf, Robert

    2015-04-10

    Ontologies are widely used in biological and biomedical research. Their success lies in their combination of four main features present in almost all ontologies: provision of standard identifiers for classes and relations that represent the phenomena within a domain; provision of a vocabulary for a domain; provision of metadata that describes the intended meaning of the classes and relations in ontologies; and the provision of machine-readable axioms and definitions that enable computational access to some aspects of the meaning of classes and relations. While each of these features enables applications that facilitate data integration, data access and analysis, a great potential lies in the possibility of combining these four features to support integrative analysis and interpretation of multimodal data. Here, we provide a functional perspective on ontologies in biology and biomedicine, focusing on what ontologies can do and describing how they can be used in support of integrative research. We also outline perspectives for using ontologies in data-driven science, in particular their application in structured data mining and machine learning applications.
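
    As a toy illustration of the point that standard identifiers plus machine-readable axioms enable computational access (the identifiers and labels below are invented, not taken from any real ontology such as GO), the snippet stores a few subclass axioms and answers a subsumption query by transitive closure.

```python
# Toy sketch: machine-readable subclass axioms keyed by standard-style
# identifiers, with a transitive-closure query for subsumption. The IDs
# and labels are invented for illustration, not from any real ontology.
from collections import defaultdict

labels = {
    "EX:0001": "biological process",
    "EX:0002": "metabolic process",
    "EX:0003": "lipid metabolic process",
    "EX:0004": "membrane lipid metabolic process",
}
# Axioms of the form (subclass, superclass).
subclass_of = defaultdict(set)
for child, parent in [("EX:0002", "EX:0001"),
                      ("EX:0003", "EX:0002"),
                      ("EX:0004", "EX:0003")]:
    subclass_of[child].add(parent)

def ancestors(term: str) -> set:
    """All superclasses reachable through the subclass axioms."""
    seen, stack = set(), [term]
    while stack:
        for parent in subclass_of[stack.pop()]:
            if parent not in seen:
                seen.add(parent)
                stack.append(parent)
    return seen

query = "EX:0004"
print(f"{labels[query]} is_a:",
      ", ".join(labels[t] for t in sorted(ancestors(query))))
```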

  14. The role of ontologies in biological and biomedical research: a functional perspective

    KAUST Repository

    Hoehndorf, Robert; Schofield, P. N.; Gkoutos, G. V.

    2015-01-01

    Ontologies are widely used in biological and biomedical research. Their success lies in their combination of four main features present in almost all ontologies: provision of standard identifiers for classes and relations that represent the phenomena within a domain; provision of a vocabulary for a domain; provision of metadata that describes the intended meaning of the classes and relations in ontologies; and the provision of machine-readable axioms and definitions that enable computational access to some aspects of the meaning of classes and relations. While each of these features enables applications that facilitate data integration, data access and analysis, a great potential lies in the possibility of combining these four features to support integrative analysis and interpretation of multimodal data. Here, we provide a functional perspective on ontologies in biology and biomedicine, focusing on what ontologies can do and describing how they can be used in support of integrative research. We also outline perspectives for using ontologies in data-driven science, in particular their application in structured data mining and machine learning applications.

  15. Grid computing : enabling a vision for collaborative research

    International Nuclear Information System (INIS)

    von Laszewski, G.

    2002-01-01

    In this paper the authors provide a motivation for Grid computing based on a vision to enable a collaborative research environment. The authors' vision goes beyond the connection of hardware resources. They argue that with an infrastructure such as the Grid, new modalities for collaborative research are enabled. They provide an overview showing why Grid research is difficult, and they present a number of management-related issues that must be addressed to make Grids a reality. They list projects that provide solutions to subsets of these issues

  16. Radiation research contracts: Biological effects of small radiation doses

    Energy Technology Data Exchange (ETDEWEB)

    Hug, O [International Atomic Energy Agency, Division of Health, Safety and Waste Disposal, Vienna (Austria)

    1959-04-15

    To establish the maximum permissible radiation doses for occupational and other kinds of radiation exposure, it is necessary to know those biological effects which can be produced by very small radiation doses. This particular field of radiation biology has not yet been sufficiently explored. This holds true for possible delayed damage after occupational radiation exposure over a period of many years as well as for acute reactions of the organism to single low level exposures. We know that irradiation of less than 25 Roentgen units (r) is unlikely to produce symptoms of radiation sickness. We have, however, found indications that even smaller doses may produce certain instantaneous reactions which must not be neglected

  17. Shared-resource computing for small research labs.

    Science.gov (United States)

    Ackerman, M J

    1982-04-01

    A real time laboratory computer network is described. This network is composed of four real-time laboratory minicomputers located in each of four division laboratories and a larger minicomputer in a centrally located computer room. Off the shelf hardware and software were used with no customization. The network is configured for resource sharing using DECnet communications software and the RSX-11-M multi-user real-time operating system. The cost effectiveness of the shared resource network and multiple real-time processing using priority scheduling is discussed. Examples of utilization within a medical research department are given.

  18. Research foci of computing research in South Africa as reflected by publications in the South African computer journal

    CSIR Research Space (South Africa)

    Kotzé, P

    2009-01-01

    Full Text Available An analysis of the research articles published in SACJ over the journal's first 40 volumes, using the ACM Computing Classification Scheme as a basis. In their analysis the authors divided the publications into three cycles of more or less six years in order to identify...

  19. Personal recollections of radiation biology research at Hanford

    International Nuclear Information System (INIS)

    Thompson, R.C.

    1995-01-01

    This paper traces the evolution of the Hanford biology programme over a period of nearly five decades. The programme began in the 1940s with a focus on understanding the potential health effects of radionuclides such as 131I associated with fallout from the atomic bomb. These studies were extended in the 1950s to experiments on the toxicity and metabolism of plutonium and fission products such as 90Sr and 137Cs. In the 1960s, a major long term project was initiated on the inhalation toxicology and carcinogenic effects of plutonium oxide and plutonium nitrate in dogs and rodents. The project remained a major effort within the overall Hanford biology programme throughout the 1970s and 1980s, during which time a broad range of new projects on energy-related pollutants, radon health effects, and basic radiation biology were initiated. Despite the many evolutionary changes that have occurred in the Hanford biology programme, the fundamental mission of understanding the effects of radiation on human health has endured for nearly five decades. (author)

  20. Advances in Biological Water-saving Research: Challenge and Perspectives

    Institute of Scientific and Technical Information of China (English)

    Lun Shan; Xiping Deng; Suiqi Zhang

    2006-01-01

    Increasing the efficiency of water use by crops continues to escalate as a topic of concern because drought is a restrictive environmental factor for crop productivity worldwide. Greater yield per unit rainfall is one of the most important challenges in water-saving agriculture. Besides water-saving by irrigation engineering and conservation tillage, a good understanding of factors limiting and/or regulating yield now provides us with an opportunity to identify and then precisely select for physiological and breeding traits that increase the efficiency of water use and drought tolerance under water-limited conditions; biological water-saving is one means of achieving this goal. A definition of biological water-saving measures is proposed which embraces improvements in water-use efficiency (WUE) and drought tolerance, by genetic improvement and physiological regulation. The preponderance of biological water-saving measures is discussed and strategies identified for working within natural resource constraints. The technology and future perspectives of biological water saving could provide not only new water-saving techniques but also a scientific base for the application of water-saving irrigation and conservation tillage.

  1. Redox Biology Course Evaluation Form | Center for Cancer Research

    Science.gov (United States)

    To improve the Redox Biology (RB) course in future years, we would appreciate your feedback by completing this course evaluation. Please score the course elements as poor, fair, average, good or excellent. Please type any comments that you have in response to the questions at the bottom of the form. Remember to include your name as you wish it to appear on the certificate.

  2. Redox Biology Final Examination 2016 | Center for Cancer Research

    Science.gov (United States)

    Numerous registrants have requested a certificate upon completion of the Redox Biology (RB) course. In order to obtain a certificate, you must answer 8 of the 12 questions below correctly. In the final examination, 1 question is derived from each of the 1-hour lectures. It is highly recommended that you have a copy of each PowerPoint presentation prior to taking the final examination.

  3. The Prospects For Research In Biological Psychiatry In Nigeria ...

    African Journals Online (AJOL)

    Biological psychiatry deals with abnormalities of brain and genetic functioning and how they interact with environmental factors to underlie the genesis, manifestation, and response to treatment of mental disorders. These issues have not featured significantly in the Nigerian psychiatric scene. Hence, we are witnessing a ...

  4. Introduction to basic molecular biologic techniques for molecular imaging researches

    International Nuclear Information System (INIS)

    Kang, Joo Hyun

    2004-01-01

    Molecular imaging is a rapidly growing field due to the advances in molecular biology and imaging technologies. With the introduction of imaging reporter genes into the cell, diverse cellular processes can be monitored, quantified and imaged non-invasively in vivo. These processes include gene expression, protein-protein interactions, signal transduction pathways, and the monitoring of cells such as cancer cells, immune cells, and stem cells. In the near future, molecular imaging analysis will allow us to observe the incipience and progression of disease. This will make it easier to diagnose intractable diseases such as cancer, neuro-degenerative disease, and immunological disorders at an early stage. Additionally, molecular imaging methods will be a valuable tool for the real-time evaluation of cells in molecular biology and in basic biological studies. As newer and more powerful molecular imaging tools become available, it will be necessary for clinicians, molecular biologists and biochemists to cooperate in the planning, interpretation, and application of these techniques to their fullest potential. In order for such a multidisciplinary team to be effective, it is essential that a common understanding of basic biochemical and molecular biologic techniques is achieved. Basic molecular techniques for molecular imaging methods are presented in this paper

  5. Intelligent Buildings and pervasive computing - research perspectives and discussions

    DEFF Research Database (Denmark)

    Grønbæk, Kaj; Krogh, Peter Gall; Kyng, Morten

    2001-01-01

    Intelligent Buildings have been the subject of research and commercial interest for more than two decades. The different perspectives range from monitoring and controlling energy consumption, over interactive rooms supporting work in offices and leisure in the home, to buildings providing information to by-passers in plazas and urban environments. This paper puts forward the hypothesis that the coming decade will witness a dramatic increase in both quality and quantity of intelligent buildings due to the emerging field of pervasive computing: the next generation computing environments where computers are everywhere, for everyone, at all times, and where IT becomes a still more integrated part of our environments, with processors, sensors, and actuators connected via high-speed networks and combined with new visualization devices ranging from projections directly in the eye to large panorama displays.

  6. Research progress on quantum informatics and quantum computation

    Science.gov (United States)

    Zhao, Yusheng

    2018-03-01

    Quantum informatics is an emerging interdisciplinary subject that developed from the combination of quantum mechanics, information science, and computer science in the 1980s. The birth and development of quantum information science has far-reaching significance for science and technology. At present, the application of quantum information technology has become a major focus of research effort. The preparation, storage, purification, regulation, transmission, and coding and decoding of quantum states have become focal points for scientists and engineers, with a profound impact on the national economy, people's livelihood, and defense technology. This paper first summarizes the background of quantum information science and quantum computers and the current state of domestic and international research, and then introduces the basic knowledge and basic concepts of quantum computing. Finally, several quantum algorithms are introduced in detail, including the quantum Fourier transform, the Deutsch-Jozsa algorithm, Shor's algorithm, and quantum phase estimation.
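
    As a worked example of the first algorithm listed in this record, the sketch below builds the n-qubit quantum Fourier transform as an explicit unitary matrix in NumPy and checks its defining properties; this is a pedagogical matrix construction, not a circuit implementation for quantum hardware.

```python
# Sketch: the quantum Fourier transform (QFT) on n qubits as an explicit
# 2^n x 2^n unitary, F[j, k] = omega^(j*k) / sqrt(N) with omega = exp(2*pi*i/N).
# Pedagogical only; real quantum algorithms apply this as a circuit, not a matrix.
import numpy as np

def qft_matrix(n_qubits: int) -> np.ndarray:
    N = 2 ** n_qubits
    j, k = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
    return np.exp(2j * np.pi * j * k / N) / np.sqrt(N)

F = qft_matrix(3)

# Unitarity check: F @ F^dagger should be the identity.
assert np.allclose(F @ F.conj().T, np.eye(F.shape[0]))

# Acting on the computational basis state |5> gives a uniform-magnitude
# superposition with linearly increasing phases.
basis_state = np.zeros(8, dtype=complex)
basis_state[5] = 1.0
out = F @ basis_state
print("output amplitudes (magnitudes):", np.round(np.abs(out), 3))
```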

  7. Tuneable resolution as a systems biology approach for multi-scale, multi-compartment computational models.

    Science.gov (United States)

    Kirschner, Denise E; Hunt, C Anthony; Marino, Simeone; Fallahi-Sichani, Mohammad; Linderman, Jennifer J

    2014-01-01

    The use of multi-scale mathematical and computational models to study complex biological processes is becoming increasingly productive. Multi-scale models span a range of spatial and/or temporal scales and can encompass multi-compartment (e.g., multi-organ) models. Modeling advances are enabling virtual experiments to explore and answer questions that are problematic to address in the wet-lab. Wet-lab experimental technologies now allow scientists to observe, measure, record, and analyze experiments focusing on different system aspects at a variety of biological scales. We need the technical ability to mirror that same flexibility in virtual experiments using multi-scale models. Here we present a new approach, tuneable resolution, which can begin providing that flexibility. Tuneable resolution involves fine- or coarse-graining existing multi-scale models at the user's discretion, allowing adjustment of the level of resolution specific to a question, an experiment, or a scale of interest. Tuneable resolution expands options for revising and validating mechanistic multi-scale models, can extend the longevity of multi-scale models, and may increase computational efficiency. The tuneable resolution approach can be applied to many model types, including differential equation, agent-based, and hybrid models. We demonstrate our tuneable resolution ideas with examples relevant to infectious disease modeling, illustrating key principles at work. © 2014 The Authors. WIREs Systems Biology and Medicine published by Wiley Periodicals, Inc.
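
    The following toy sketch illustrates the fine- versus coarse-graining idea described above with assumed rates (it is not one of the authors' infectious disease models): the same birth-death process is represented at a coarse resolution as a deterministic ODE and at a fine resolution as an exact stochastic simulation, so the level of detail can be chosen per question.

```python
# Toy sketch of "tuneable resolution": the same birth-death process
# (production at rate k_in, first-order loss at rate k_out) simulated
# coarsely as an ODE and finely as a stochastic (Gillespie) process.
# Rates and time horizon are illustrative assumptions.
import numpy as np

k_in, k_out, t_end = 20.0, 0.5, 20.0
rng = np.random.default_rng(2)

def coarse_ode(x0=0.0, dt=0.01):
    """Deterministic, coarse-grained resolution: dx/dt = k_in - k_out * x."""
    x = x0
    for _ in range(int(t_end / dt)):
        x += dt * (k_in - k_out * x)
    return x

def fine_ssa(x0=0):
    """Stochastic, fine-grained resolution: exact Gillespie simulation."""
    x, t = x0, 0.0
    while t < t_end:
        props = np.array([k_in, k_out * x])
        total = props.sum()
        t += rng.exponential(1.0 / total)
        if rng.random() < props[0] / total:
            x += 1
        else:
            x -= 1
    return x

print("coarse (ODE) steady value   :", round(coarse_ode(), 2))
print("fine (SSA) sample end values:", [fine_ssa() for _ in range(5)])
print("analytical steady state     :", k_in / k_out)
```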

  8. Using Biology Education Research and Qualitative Inquiry to Inform Genomic Nursing Education.

    Science.gov (United States)

    Ward, Linda D

    Decades of research in biology education show that learning genetics is difficult and reveal specific sources of learning difficulty. Little is known about how nursing students learn in this domain, although they likely encounter similar difficulties as nonnursing students. Using qualitative approaches, this study investigated challenges to learning genetics among nursing students. Findings indicate that nursing students face learning difficulties already identified among biology students, suggesting that nurse educators might benefit from biology education research.

  9. Creating a pipeline of talent for informatics: STEM initiative for high school students in computer science, biology, and biomedical informatics

    Directory of Open Access Journals (Sweden)

    Joyeeta Dutta-Moscato

    2014-01-01

    Full Text Available This editorial provides insights into how informatics can attract highly trained students by involving them in science, technology, engineering, and math (STEM) training at the high school level and continuing to provide mentorship and research opportunities through the formative years of their education. Our central premise is that the trajectory necessary to be expert in the emergent fields in front of them requires acceleration at an early time point. Both pathology (and biomedical) informatics are new disciplines which would benefit from involvement by students at an early stage of their education. In 2009, Michael T Lotze MD, Kirsten Livesey (then a medical student, now a medical resident at University of Pittsburgh Medical Center (UPMC)), Richard Hersheberger, PhD (Currently, Dean at Roswell Park), and Megan Seippel, MS (the administrator) launched the University of Pittsburgh Cancer Institute (UPCI) Summer Academy to bring high school students for an 8 week summer academy focused on Cancer Biology. Initially, pathology and biomedical informatics were involved only in the classroom component of the UPCI Summer Academy. In 2011, due to popular interest, an informatics track called Computer Science, Biology and Biomedical Informatics (CoSBBI) was launched. CoSBBI currently acts as a feeder program for the undergraduate degree program in bioinformatics at the University of Pittsburgh, which is a joint degree offered by the Departments of Biology and Computer Science. We believe training in bioinformatics is the best foundation for students interested in future careers in pathology informatics or biomedical informatics. We describe our approach to the recruitment, training and research mentoring of high school students to create a pipeline of exceptionally well-trained applicants for both the disciplines of pathology informatics and biomedical informatics. We emphasize here how mentoring of high school students in pathology informatics and biomedical

  10. Creating a pipeline of talent for informatics: STEM initiative for high school students in computer science, biology, and biomedical informatics.

    Science.gov (United States)

    Dutta-Moscato, Joyeeta; Gopalakrishnan, Vanathi; Lotze, Michael T; Becich, Michael J

    2014-01-01

    This editorial provides insights into how informatics can attract highly trained students by involving them in science, technology, engineering, and math (STEM) training at the high school level and continuing to provide mentorship and research opportunities through the formative years of their education. Our central premise is that the trajectory necessary to be expert in the emergent fields in front of them requires acceleration at an early time point. Both pathology (and biomedical) informatics are new disciplines which would benefit from involvement by students at an early stage of their education. In 2009, Michael T Lotze MD, Kirsten Livesey (then a medical student, now a medical resident at University of Pittsburgh Medical Center (UPMC)), Richard Hersheberger, PhD (Currently, Dean at Roswell Park), and Megan Seippel, MS (the administrator) launched the University of Pittsburgh Cancer Institute (UPCI) Summer Academy to bring high school students for an 8 week summer academy focused on Cancer Biology. Initially, pathology and biomedical informatics were involved only in the classroom component of the UPCI Summer Academy. In 2011, due to popular interest, an informatics track called Computer Science, Biology and Biomedical Informatics (CoSBBI) was launched. CoSBBI currently acts as a feeder program for the undergraduate degree program in bioinformatics at the University of Pittsburgh, which is a joint degree offered by the Departments of Biology and Computer Science. We believe training in bioinformatics is the best foundation for students interested in future careers in pathology informatics or biomedical informatics. We describe our approach to the recruitment, training and research mentoring of high school students to create a pipeline of exceptionally well-trained applicants for both the disciplines of pathology informatics and biomedical informatics. We emphasize here how mentoring of high school students in pathology informatics and biomedical informatics

  11. Synthetic glycopeptides and glycoproteins with applications in biological research

    Directory of Open Access Journals (Sweden)

    Ulrika Westerlind

    2012-05-01

    Full Text Available Over the past few years, synthetic methods for the preparation of complex glycopeptides have been drastically improved. The need for homogenous glycopeptides and glycoproteins with defined chemical structures to study diverse biological phenomena further enhances the development of methodologies. Selected recent advances in synthesis and applications, in which glycopeptides or glycoproteins serve as tools for biological studies, are reviewed. The importance of specific antibodies directed to the glycan part, as well as the peptide backbone, has been realized during the development of synthetic glycopeptide-based anti-tumor vaccines. The fine-tuning of native chemical ligation (NCL), expressed protein ligation (EPL), and chemoenzymatic glycosylation techniques have all together enabled the synthesis of functional glycoproteins. The synthesis of structurally defined, complex glycopeptides or glyco-clusters presented on natural peptide backbones, or mimics thereof, offer further possibilities to study protein-binding events.

  12. Grete Kellenberger-Gujer: Molecular biology research pioneer.

    Science.gov (United States)

    Citi, Sandra; Berg, Douglas E

    2016-01-01

    Grete Kellenberger-Gujer was a Swiss molecular biologist who pioneered fundamental studies of bacteriophage in the mid-20th century at the University of Geneva. Her life and career stories are reviewed here, focusing on her fundamental contributions to our early understanding of phage biology via her insightful analyses of phenomena such as the lysogenic state of a temperate phage (λ), genetic recombination, radiation's in vivo consequences, and DNA restriction-modification; on her creative personality and interactions with peers; and how her academic advancement was affected by gender, societal conditions and cultural attitudes of the time. Her story is important scientifically, putting into perspective features of the scientific community from just before the molecular biology era started through its early years, and also sociologically, in illustrating the numerous "glass ceilings" that, especially then, often hampered the advancement of creative women.

  13. 75 FR 6401 - Medical Devices Regulated by the Center for Biologics Evaluation and Research; Availability of...

    Science.gov (United States)

    2010-02-09

    ... Biologics Evaluation and Research (HFM-17), Food and Drug Administration, suite 200N, 1401 Rockville Pike... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2009-M-0513] Medical Devices Regulated by the Center for Biologics Evaluation and Research; Availability of Summaries...

  14. Interdisciplinary Biomathematics: Engaging Undergraduates in Research on the Fringe of Mathematical Biology

    Science.gov (United States)

    Fowler, Kathleen; Luttman, Aaron; Mondal, Sumona

    2013-01-01

    The US National Science Foundation's (NSF's) Undergraduate Biology and Mathematics (UBM) program significantly increased undergraduate research in the biomathematical sciences. We discuss three UBM-funded student research projects at Clarkson University that lie at the intersection of not just mathematics and biology, but also other fields. The…

  15. 76 FR 71045 - Center for Biologics Evaluation and Research Report of Scientific and Medical Literature and...

    Science.gov (United States)

    2011-11-16

    ...] Center for Biologics Evaluation and Research Report of Scientific and Medical Literature and Information... period for the notice on its report of scientific and medical literature and information concerning the... ``Center for Biologics Evaluation and Research Report of Scientific and Medical Literature and Information...

  16. Redox Biology Course Registration Form | Center for Cancer Research

    Science.gov (United States)

    The Redox Biology class is open to all NIH/NCI fellows and staff and will be held September 27 - November 8, 2016. The last day to register is: September 21, 2016. The first 100 registrants will be accepted for the class. Those who plan to participate by Video TeleConference should also register so that you can receive the speaker handouts in advance.

  17. Redox Biology Course Evaluation Form | Center for Cancer Research

    Science.gov (United States)

    To improve the Redox Biology (RB) course in future years, we would appreciate your feedback by completing this course evaluation. Please score the course elements as poor, fair, average, good or excellent. Please type any comments that you have in response to the questions at the bottom of the form. Remember to include your name as you wish it to appear on the certificate. Thank you for your feedback.

  18. Computer codes for problems of isotope and radiation research

    International Nuclear Information System (INIS)

    Remer, M.

    1986-12-01

    A survey is given of computer codes for problems in isotope and radiation research. Altogether 44 codes are described as titles with abstracts. 17 of them are in the INIS scope and are processed individually. The subjects are indicated in the chapter headings: 1) analysis of tracer experiments, 2) spectrum calculations, 3) calculations of ion and electron trajectories, 4) evaluation of gamma irradiation plants, and 5) general software

  19. Computing at the leading edge: Research in the energy sciences

    Energy Technology Data Exchange (ETDEWEB)

    Mirin, A.A.; Van Dyke, P.T. [eds.

    1994-02-01

    The purpose of this publication is to highlight selected scientific challenges that have been undertaken by the DOE Energy Research community. The high quality of the research reflected in these contributions underscores the growing importance both of the Grand Challenge scientific efforts sponsored by DOE and of the related supporting technologies that the National Energy Research Supercomputer Center (NERSC) and other facilities are able to provide. The continued improvement of the computing resources available to DOE scientists is prerequisite to ensuring their future progress in solving the Grand Challenges. Titles of articles included in this publication include: the numerical tokamak project; static and animated molecular views of a tumorigenic chemical bound to DNA; toward a high-performance climate systems model; modeling molecular processes in the environment; lattice Boltzmann models for flow in porous media; parallel algorithms for modeling superconductors; parallel computing at the Superconducting Super Collider Laboratory; the advanced combustion modeling environment; adaptive methodologies for computational fluid dynamics; lattice simulations of quantum chromodynamics; simulating high-intensity charged-particle beams for the design of high-power accelerators; electronic structure and phase stability of random alloys.

  20. Computing at the leading edge: Research in the energy sciences

    International Nuclear Information System (INIS)

    Mirin, A.A.; Van Dyke, P.T.

    1994-01-01

    The purpose of this publication is to highlight selected scientific challenges that have been undertaken by the DOE Energy Research community. The high quality of the research reflected in these contributions underscores the growing importance both of the Grand Challenge scientific efforts sponsored by DOE and of the related supporting technologies that the National Energy Research Supercomputer Center (NERSC) and other facilities are able to provide. The continued improvement of the computing resources available to DOE scientists is prerequisite to ensuring their future progress in solving the Grand Challenges. Titles of articles included in this publication include: the numerical tokamak project; static and animated molecular views of a tumorigenic chemical bound to DNA; toward a high-performance climate systems model; modeling molecular processes in the environment; lattice Boltzmann models for flow in porous media; parallel algorithms for modeling superconductors; parallel computing at the Superconducting Super Collider Laboratory; the advanced combustion modeling environment; adaptive methodologies for computational fluid dynamics; lattice simulations of quantum chromodynamics; simulating high-intensity charged-particle beams for the design of high-power accelerators; electronic structure and phase stability of random alloys

  1. Neural computation and particle accelerators research, technology and applications

    CERN Document Server

    D'Arras, Horace

    2010-01-01

    This book discusses neural computation, a network or circuit of biological neurons, and, relatedly, particle accelerators, scientific instruments which accelerate charged particles such as protons, electrons and deuterons. Accelerators have a very broad range of applications in many industrial fields, from high energy physics to medical isotope production. Nuclear technology is one of the fields discussed in this book. The development that has been reached by particle accelerators in energy and particle intensity has opened the possibility of a wide number of new applications in nuclear technology. This book reviews the applications in the nuclear energy field and the design features of high power neutron sources are explained. Surface treatments of niobium flat samples and superconducting radio frequency cavities by a new technique called gas cluster ion beam are also studied in detail, as well as the process of electropolishing. Furthermore, magnetic devices such as solenoids, dipoles and undulators, which ...

  2. Computer Science Research Institute 2004 annual report of activities.

    Energy Technology Data Exchange (ETDEWEB)

    DeLap, Barbara J.; Womble, David Eugene; Ceballos, Deanna Rose

    2006-03-01

    This report summarizes the activities of the Computer Science Research Institute (CSRI) at Sandia National Laboratories during the period January 1, 2004 to December 31, 2004. During this period the CSRI hosted 166 visitors representing 81 universities, companies and laboratories. Of these 65 were summer students or faculty. The CSRI partially sponsored 2 workshops and also organized and was the primary host for 4 workshops. These 4 CSRI sponsored workshops had 140 participants--74 from universities, companies and laboratories, and 66 from Sandia. Finally, the CSRI sponsored 14 long-term collaborative research projects and 5 Sabbaticals.

  3. Computer Science Research Institute 2003 annual report of activities.

    Energy Technology Data Exchange (ETDEWEB)

    DeLap, Barbara J.; Womble, David Eugene; Ceballos, Deanna Rose

    2006-03-01

    This report summarizes the activities of the Computer Science Research Institute (CSRI) at Sandia National Laboratories during the period January 1, 2003 to December 31, 2003. During this period the CSRI hosted 164 visitors representing 78 universities, companies and laboratories. Of these 78 were summer students or faculty members. The CSRI partially sponsored 5 workshops and also organized and was the primary host for 3 workshops. These 3 CSRI sponsored workshops had 178 participants--137 from universities, companies and laboratories, and 41 from Sandia. Finally, the CSRI sponsored 18 long-term collaborative research projects and 5 Sabbaticals.

  4. Computer Science Research Institute 2005 annual report of activities.

    Energy Technology Data Exchange (ETDEWEB)

    Watts, Bernadette M.; Collis, Samuel Scott; Ceballos, Deanna Rose; Womble, David Eugene

    2008-04-01

    This report summarizes the activities of the Computer Science Research Institute (CSRI) at Sandia National Laboratories during the period January 1, 2005 to December 31, 2005. During this period, the CSRI hosted 182 visitors representing 83 universities, companies and laboratories. Of these, 60 were summer students or faculty. The CSRI partially sponsored 2 workshops and also organized and was the primary host for 3 workshops. These 3 CSRI sponsored workshops had 105 participants, 78 from universities, companies and laboratories, and 27 from Sandia. Finally, the CSRI sponsored 12 long-term collaborative research projects and 3 Sabbaticals.

  5. ElectroEncephaloGraphics: Making waves in computer graphics research.

    Science.gov (United States)

    Mustafa, Maryam; Magnor, Marcus

    2014-01-01

    Electroencephalography (EEG) is a novel modality for investigating perceptual graphics problems. Until recently, EEG has predominantly been used for clinical diagnosis, in psychology, and by the brain-computer-interface community. Researchers are extending it to help understand the perception of visual output from graphics applications and to create approaches based on direct neural feedback. Researchers have applied EEG to graphics to determine perceived image and video quality by detecting typical rendering artifacts, to evaluate visualization effectiveness by calculating the cognitive load, and to automatically optimize rendering parameters for images and videos on the basis of implicit neural feedback.

  6. Division of Biological and Medical Research annual report, 1979

    International Nuclear Information System (INIS)

    Rosenthal, M.W.

    1979-01-01

    Separate abstracts were prepared for 14 of the 20 sections included in this progress report. The other 6 sections include: introductory statements by the division director; descriptions of the animal, computer, electron microscope, and radiation support facilities; a listing of the educational activities, divisional seminars, and oral presentations by staff members; and divisional staff publications. An author index to the report is included

  7. Perspectives on Sharing Models and Related Resources in Computational Biomechanics Research.

    Science.gov (United States)

    Erdemir, Ahmet; Hunter, Peter J; Holzapfel, Gerhard A; Loew, Leslie M; Middleton, John; Jacobs, Christopher R; Nithiarasu, Perumal; Löhner, Rainlad; Wei, Guowei; Winkelstein, Beth A; Barocas, Victor H; Guilak, Farshid; Ku, Joy P; Hicks, Jennifer L; Delp, Scott L; Sacks, Michael; Weiss, Jeffrey A; Ateshian, Gerard A; Maas, Steve A; McCulloch, Andrew D; Peng, Grace C Y

    2018-02-01

    The role of computational modeling for biomechanics research and related clinical care will be increasingly prominent. The biomechanics community has been developing computational models routinely for exploration of the mechanics and mechanobiology of diverse biological structures. As a result, a large array of models, data, and discipline-specific simulation software has emerged to support endeavors in computational biomechanics. Sharing computational models and related data and simulation software has first become a utilitarian interest, and now, it is a necessity. Exchange of models, in support of knowledge exchange provided by scholarly publishing, has important implications. Specifically, model sharing can facilitate assessment of reproducibility in computational biomechanics and can provide an opportunity for repurposing and reuse, and a venue for medical training. The community's desire to investigate biological and biomechanical phenomena crossing multiple systems, scales, and physical domains, also motivates sharing of modeling resources as blending of models developed by domain experts will be a required step for comprehensive simulation studies as well as the enhancement of their rigor and reproducibility. The goal of this paper is to understand current perspectives in the biomechanics community for the sharing of computational models and related resources. Opinions on opportunities, challenges, and pathways to model sharing, particularly as part of the scholarly publishing workflow, were sought. A group of journal editors and a handful of investigators active in computational biomechanics were approached to collect short opinion pieces as a part of a larger effort of the IEEE EMBS Computational Biology and the Physiome Technical Committee to address model reproducibility through publications. A synthesis of these opinion pieces indicates that the community recognizes the necessity and usefulness of model sharing. There is a strong will to facilitate

  8. Phenomenography and Grounded Theory as Research Methods in Computing Education Research Field

    Science.gov (United States)

    Kinnunen, Paivi; Simon, Beth

    2012-01-01

    This paper discusses two qualitative research methods, phenomenography and grounded theory. We introduce both methods' data collection and analysis processes and the type of results you may get at the end by using examples from computing education research. We highlight some of the similarities and differences between the aim, data collection and…

  9. Proceedings of the 8. Mediterranean Conference on Medical and Biological Engineering and Computing (Medicon `98)

    Energy Technology Data Exchange (ETDEWEB)

    Christofides, Stelios; Pattichis, Constantinos; Schizas, Christos; Keravnou-Papailiou, Elpida; Kaplanis, Prodromos; Spyros, Spyrou; Christodoulides, George; Theodoulou, Yiannis [eds.]

    1999-12-31

    Medicon '98 is the eighth in the series of regional meetings of the International Federation of Medical and Biological Engineering (IFMBE) in the Mediterranean. The goal of Medicon '98 is to provide updated information on the state of the art on medical and biological engineering and computing. Medicon '98 was held in Lemesos, Cyprus, between 14-17 June, 1998. The full papers of the proceedings were published on CD and consisted of 190 invited and submitted papers. A book of abstracts was also published in paper form and was available to all the participants. Twenty-seven papers fall within the scope of INIS and are dealing with Nuclear Medicine, Computerized Tomography, Radiology, Radiotherapy, Magnetic Resonance Imaging and Personnel Dosimetry (eds).

  10. Proceedings of the 8. Mediterranean Conference on Medical and Biological Engineering and Computing (Medicon '98)

    International Nuclear Information System (INIS)

    Christofides, Stelios; Pattichis, Constantinos; Schizas, Christos; Keravnou-Papailiou, Elpida; Kaplanis, Prodromos; Spyros, Spyrou; Christodoulides, George; Theodoulou, Yiannis

    1998-01-01

    Medicon '98 is the eighth in the series of regional meetings of the International Federation of Medical and Biological Engineering (IFMBE) in the Mediterranean. The goal of Medicon '98 is to provide updated information on the state of the art on medical and biological engineering and computing. Medicon '98 was held in Lemesos, Cyprus, between 14-17 June, 1998. The full papers of the proceedings were published on CD and consisted of 190 invited and submitted papers. A book of abstracts was also published in paper form and was available to all the participants. Twenty-seven papers fall within the scope of INIS and are dealing with Nuclear Medicine, Computerized Tomography, Radiology, Radiotherapy, Magnetic Resonance Imaging and Personnel Dosimetry (eds)

  11. Can Nuclear Installations and Research Centres Adopt Cloud Computing Platform?

    International Nuclear Information System (INIS)

    Pichan, A.; Lazarescu, M.; Soh, S.T.

    2015-01-01

    Cloud Computing is arguably one of the recent and highly significant advances in information technology today. It produces transformative changes in the history of computing and presents many promising technological and economic opportunities. The pay-per-use model, the computing power, abundance of storage, skilled resources, fault tolerance and the economy of scale it offers, provide significant advantages to enterprises adopting a cloud platform for their business needs. However, customers, especially those dealing with national security, high-end scientific research institutions, and critical national infrastructure service providers (like power and water), remain very reluctant to move their business systems to the cloud. One of the main concerns is the question of information security in the cloud and the threat of the unknown. Cloud Service Providers (CSP) indirectly encourage this perception by not letting their customers see what is behind their virtual curtain. Jurisdiction (information assets being stored elsewhere), data duplication, multi-tenancy, virtualisation and the decentralized nature of data processing are the default characteristics of cloud computing. Therefore the traditional approach of enforcing and implementing security controls remains a big challenge and largely depends upon the service provider. The other biggest challenge and open issue is the ability to perform digital forensic investigations in the cloud in case of security breaches. Traditional approaches to evidence collection and recovery are no longer practical as they rely on unrestricted access to the relevant systems and user data, something that is not available in the cloud model. This continues to fuel high insecurity for the cloud customers. In this paper we analyze the cyber security and digital forensics challenges, issues and opportunities for nuclear facilities to adopt cloud computing. We also discuss the due diligence process and applicable industry best practices which shall be

  12. Computational systems biology and dose-response modeling in relation to new directions in toxicity testing.

    Science.gov (United States)

    Zhang, Qiang; Bhattacharya, Sudin; Andersen, Melvin E; Conolly, Rory B

    2010-02-01

    The new paradigm envisioned for toxicity testing in the 21st century advocates shifting from the current animal-based testing process to a combination of in vitro cell-based studies, high-throughput techniques, and in silico modeling. A strategic component of the vision is the adoption of the systems biology approach to acquire, analyze, and interpret toxicity pathway data. As key toxicity pathways are identified and their wiring details elucidated using traditional and high-throughput techniques, there is a pressing need to understand their qualitative and quantitative behaviors in response to perturbation by both physiological signals and exogenous stressors. The complexity of these molecular networks makes the task of understanding cellular responses merely by human intuition challenging, if not impossible. This process can be aided by mathematical modeling and computer simulation of the networks and their dynamic behaviors. A number of theoretical frameworks were developed in the last century for understanding dynamical systems in science and engineering disciplines. These frameworks, which include metabolic control analysis, biochemical systems theory, nonlinear dynamics, and control theory, can greatly facilitate the process of organizing, analyzing, and understanding toxicity pathways. Such analysis will require a comprehensive examination of the dynamic properties of "network motifs"--the basic building blocks of molecular circuits. Network motifs like feedback and feedforward loops appear repeatedly in various molecular circuits across cell types and enable vital cellular functions like homeostasis, all-or-none response, memory, and biological rhythm. These functional motifs and associated qualitative and quantitative properties are the predominant source of nonlinearities observed in cellular dose response data. Complex response behaviors can arise from toxicity pathways built upon combinations of network motifs. While the field of computational cell
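
    As a concrete illustration of the network-motif behaviour this record discusses (a minimal sketch with assumed parameters, not an example from the paper), the snippet below integrates a simple negative feedback loop and shows how it buffers the steady-state output against changes in the input signal, one source of the nonlinear dose-response behaviour mentioned above.

```python
# Sketch: a minimal negative feedback motif. An input S drives production
# of X; X induces its own remover Y (feedback), which keeps steady-state X
# relatively insensitive to changes in S. Parameters are illustrative.
import numpy as np

def steady_state_x(S, k_prod=1.0, k_rem=1.0, k_y=1.0, d_y=0.2,
                   dt=0.001, t_end=200.0):
    x, y = 0.0, 0.0
    for _ in range(int(t_end / dt)):
        dx = k_prod * S - k_rem * y * x      # removal of X scales with Y
        dy = k_y * x - d_y * y               # X induces Y (negative feedback)
        x += dt * dx
        y += dt * dy
    return x

for S in (1.0, 2.0, 4.0):
    print(f"input S = {S:>3}: steady-state X ~ {steady_state_x(S):.3f}")
# A 4-fold change in input yields only a 2-fold change in X here
# (X scales with the square root of S), illustrating the buffering
# (homeostatic) effect of the feedback loop.
```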

  13. On the Modelling of Biological Patterns with Mechanochemical Models: Insights from Analysis and Computation

    KAUST Repository

    Moreo, P.; Gaffney, E. A.; Garcí a-Aznar, J. M.; Doblaré , M.

    2009-01-01

    The diversity of biological form is generated by a relatively small number of underlying mechanisms. Consequently, mathematical and computational modelling can, and does, provide insight into how cellular level interactions ultimately give rise

  14. Progress in nucleic acid research and molecular biology

    International Nuclear Information System (INIS)

    Cohn, W.E.; Moldave, K.

    1988-01-01

    Complementary Use of Chemical Modification and Site-Directed Mutagenesis to Probe Structure-Activity Relationships in Enzymes. Mechanisms of the Antiviral Action of Interferons. Modulation of Cellular Genes by Oncogenes. DNA Damage Produced by Ionizing Radiation in Mammalian Cells: Identities, Mechanisms of Formation, and Reparability. Human Ferritin Gene Expression. Molecular Biology of the Insulin Receptor. Cap-Binding Proteins of Eukaryotic Messenger RNA: Functions in Initiation and Control of Translation. Physical Monitoring of Meiotic and Mitotic Recombination in Yeast. Early Signals Underlying the Induction of the c-fos and c-myc Genes in Quiescent Fibroblasts: Studies with Bombesin and Other Growth Factors. Each chapter includes references

  15. SED-ED, a workflow editor for computational biology experiments written in SED-ML.

    Science.gov (United States)

    Adams, Richard R

    2012-04-15

    The simulation experiment description markup language (SED-ML) is a new community data standard to encode computational biology experiments in a computer-readable XML format. Its widespread adoption will require the development of software support to work with SED-ML files. Here, we describe a software tool, SED-ED, to view, edit, validate and annotate SED-ML documents while shielding end-users from the underlying XML representation. SED-ED supports modellers who wish to create, understand and further develop a simulation description provided in SED-ML format. SED-ED is available as a standalone Java application, as an Eclipse plug-in and as an SBSI (www.sbsi.ed.ac.uk) plug-in, all under an MIT open-source license. Source code is at https://sed-ed-sedmleditor.googlecode.com/svn. The application itself is available from https://sourceforge.net/projects/jlibsedml/files/SED-ED/.

  16. Computational local stiffness analysis of biological cell: High aspect ratio single wall carbon nanotube tip

    Energy Technology Data Exchange (ETDEWEB)

    TermehYousefi, Amin, E-mail: at.tyousefi@gmail.com [Department of Human Intelligence Systems, Graduate School of Life Science and Systems Engineering, Kyushu Institute of Technology (Kyutech) (Japan); Bagheri, Samira; Shahnazar, Sheida [Nanotechnology & Catalysis Research Centre (NANOCAT), IPS Building, University Malaya, 50603 Kuala Lumpur (Malaysia); Rahman, Md. Habibur [Department of Computer Science and Engineering, University of Asia Pacific, Green Road, Dhaka-1215 (Bangladesh); Kadri, Nahrizul Adib [Department of Biomedical Engineering, Faculty of Engineering, University Malaya, 50603 Kuala Lumpur (Malaysia)

    2016-02-01

    Carbon nanotubes (CNTs) are potentially ideal tips for atomic force microscopy (AFM) due to their robust mechanical properties, nanoscale diameter and also their ability to be functionalized by chemical and biological components at the tip ends. This contribution develops the idea of using CNTs as an AFM tip in computational analysis of biological cells. The proposed software was ABAQUS 6.13 CAE/CEL provided by Dassault Systems, which is a powerful finite element (FE) tool to perform the numerical analysis and visualize the interactions between the proposed tip and the membrane of the cell. Finite element analysis was employed for each section, and the displacement of the nodes located in the contact area was monitored by using an output database (ODB). A Mooney–Rivlin hyperelastic model of the cell allows the simulation to obtain a new method for estimating the stiffness and spring constant of the cell. The stress-strain curve indicates the yield stress point, which is defined as a vertical stress and plane stress. The spring constant of the cell and the local stiffness were measured, as well as the applied force of the CNT-AFM tip on the contact area of the cell. This reliable integration of the CNT-AFM tip process provides a new class of high performance nanoprobes for single biological cell analysis.
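
    As a small worked example of the stiffness estimation this record describes (computed here from synthetic force-indentation data with assumed values, not from the paper's ABAQUS results), the snippet below fits a linear force-indentation slope and removes the tip's own compliance with the series-spring relation.

```python
# Sketch: estimating a cell's local stiffness (spring constant) from
# force-indentation data, then correcting for tip compliance with the
# series-spring relation 1/k_measured = 1/k_tip + 1/k_cell.
# All numbers are illustrative assumptions, not values from the paper.
import numpy as np

# Synthetic force-indentation data (nN vs nm) around an assumed slope.
indentation_nm = np.linspace(0, 100, 21)
true_k_measured = 0.02                         # nN/nm (assumed)
rng = np.random.default_rng(3)
force_nN = true_k_measured * indentation_nm + rng.normal(0, 0.02, indentation_nm.size)

# Linear fit: the measured (combined) spring constant is the slope.
k_measured = np.polyfit(indentation_nm, force_nN, 1)[0]

# Remove the tip's own compliance (assumed tip spring constant).
k_tip = 0.2                                    # nN/nm (assumed)
k_cell = 1.0 / (1.0 / k_measured - 1.0 / k_tip)

print(f"measured spring constant: {k_measured:.4f} nN/nm")
print(f"estimated cell stiffness: {k_cell:.4f} nN/nm")
```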

  17. Atomic Force Microscopy Application in Biological Research: A Review Study

    Directory of Open Access Journals (Sweden)

    Surena Vahabi

    2013-06-01

    Full Text Available Atomic force microscopy (AFM) is a three-dimensional topographic technique with a high atomic resolution to measure surface roughness. AFM is a kind of scanning probe microscope, and its near-field technique is based on the interaction between a sharp tip and the atoms of the sample surface. There are several methods and many ways to modify the tip of the AFM to investigate surface properties, including measuring friction, adhesion forces and viscoelastic properties as well as determining the Young modulus and imaging magnetic or electrostatic properties. The AFM technique can analyze any kind of sample, such as polymers, adsorbed molecules, films or fibers, and powders, in air, whether in a controlled atmosphere or in a liquid medium. In the past decade, the AFM has emerged as a powerful tool to obtain the nanostructural details and biomechanical properties of biological samples, including biomolecules and cells. The AFM applications, techniques, and, in particular, its ability to measure forces are still not familiar to most clinicians. This paper reviews the literature on the main principles of the AFM modality and highlights the advantages of this technique in biology, medicine, and, especially, dentistry. This literature review was performed through e-resources, including Science Direct, PubMed, Blackwell Synergy, Embase, Elsevier, and Google Scholar, for references published between 1985 and 2010.

  18. Social justice and research using human biological material: A ...

    African Journals Online (AJOL)


  19. Progress report of a research program in computational physics

    International Nuclear Information System (INIS)

    Guralnik, G.S.

    1990-01-01

    Task D's research is focused on the understanding of elementary particle physics through the techniques of quantum field theory. We make intensive use of computers to aid our research. During the last year we have made significant progress in understanding the weak interactions through the use of Monte Carlo methods as applied to the equations of quenched lattice QCD. We have launched a program to understand full (not quenched) lattice QCD on relatively large lattices using massively parallel computers. Because Monte Carlo methods might not be able to give a good solution to field theories with the computer power likely to be available to us for the foreseeable future, we have launched an entirely different numerical approach to these problems. This "Source Galerkin" method is based on an algebraic approach to the field-theoretic equations of motion and is (somewhat) related to variational and finite element techniques applied to a source rather than a coordinate space. The results for relatively simple problems are sensationally good. In particular, fermions can be treated in a way which allows them to retain their status as independent dynamical entities in the theory. 8 refs
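    As a hedged aside, the toy sketch below illustrates the Metropolis Monte Carlo idea underlying lattice simulations of this kind; it updates a one-dimensional free scalar field rather than quenched lattice QCD, and every parameter is illustrative.

        # Toy Metropolis Monte Carlo for a 1-D lattice scalar field (not QCD).
        import numpy as np

        rng = np.random.default_rng(0)
        N, mass2, sweeps, step = 64, 0.5, 200, 0.5
        phi = np.zeros(N)

        def local_action(i):
            # Action terms involving site i (periodic boundary conditions)
            left, right = phi[(i - 1) % N], phi[(i + 1) % N]
            return (0.5 * ((phi[i] - left) ** 2 + (right - phi[i]) ** 2)
                    + 0.5 * mass2 * phi[i] ** 2)

        for _ in range(sweeps):
            for i in range(N):
                old, s_old = phi[i], local_action(i)
                phi[i] = old + step * rng.uniform(-1.0, 1.0)        # trial move
                if rng.random() >= np.exp(-(local_action(i) - s_old)):
                    phi[i] = old                                    # reject
        print("sampled <phi^2>:", np.mean(phi ** 2))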

  20. Centralized digital computer control of a research nuclear reactor

    International Nuclear Information System (INIS)

    Crawford, K.C.

    1987-01-01

    A hardware and software design for the centralized control of a research nuclear reactor by a digital computer is presented, along with an investigation of automatic feedback control. Current reactor-control philosophies, including redundancy, inherent safety in failure, and conservative yet operational scram initiation, were used as the bases of the design. These control philosophies were applied to the power-monitoring system, the fuel-temperature monitoring system, the area-radiation monitoring system, and the overall system interaction. Unlike the single-function analog computers currently used to control research and commercial reactors, this system will be driven by a multifunction digital computer. Specifically, the system will perform control-rod movements to conform with operator requests, automatically log the required physical parameters during reactor operation, perform the required system tests, and monitor facility safety and security. Reactor power control is based on signals received from ion chambers located near the reactor core. Absorber-rod movements are made to control the rate of power increase or decrease during power changes and to control the power level during steady-state operation. Additionally, the system incorporates a rudimentary level of artificial intelligence
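    The closed-loop idea sketched below is only an editorial illustration of rate-limited rod control of the sort the abstract alludes to; it is not the paper's control law, and the plant model and constants are invented.

        # Illustrative control of reactor power via absorber-rod position,
        # with a crude first-order plant standing in for the real core.
        setpoint_kw = 100.0     # desired power (assumed)
        power_kw = 80.0         # current power (assumed)
        rod_position = 0.0      # arbitrary units; larger = rod further withdrawn
        gain = 0.02             # assumed controller gain
        max_step = 0.5          # assumed rod-speed limit per control interval

        for _ in range(50):
            error = setpoint_kw - power_kw
            move = max(-max_step, min(max_step, gain * error))   # rate-limited move
            rod_position += move
            # Toy plant: power relaxes toward a value set by the rod position.
            power_kw += 0.2 * (80.0 + 20.0 * rod_position - power_kw)

        print(f"final power: {power_kw:.1f} kW, rod position: {rod_position:.2f}")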

  1. [New materia medica project: synthetic biology based bioactive metabolites research in medicinal plant].

    Science.gov (United States)

    Wang, Yong

    2017-03-25

    In the last decade, synthetic biology research has gradually transitioned from monocellular parts or devices toward more complex multicellular systems. The emerging field of plant synthetic biology is regarded as the "next chapter" of synthetic biology. With complex and diverse plant metabolism as its entry point, plant synthetic biology research not only helps us understand how real life works, but also teaches us how to design and construct more complex artificial life. Innovation in bioactive compounds and their large-scale production are also expected to reach a breakthrough through redesigned plant metabolism. In this review, we discuss the research progress in plant synthetic biology and propose the new materia medica project to raise the level of traditional Chinese herbal medicine research.

  2. Distributed and grid computing projects with research focus in human health.

    Science.gov (United States)

    Diomidous, Marianna; Zikos, Dimitrios

    2012-01-01

    Distributed systems and grid computing systems are used to connect several computers in order to obtain a higher level of performance when solving a problem. During the last decade, projects have used the World Wide Web to aggregate individuals' CPU power for research purposes. This paper presents the existing active large-scale distributed and grid computing projects with a research focus in human health. Eleven active projects with more than 2,000 Processing Units (PUs) each were found and are presented. The research focus for most of them is molecular biology, specifically understanding or predicting protein structure through simulation, comparing proteins, genomic analysis for disease-provoking genes, and drug design. Though not always explicitly stated, common targets include research to find cures for HIV, dengue, Duchenne dystrophy, Parkinson's disease, various types of cancer, and influenza. Other target diseases include malaria, anthrax, and Alzheimer's disease. The need for national initiatives and European collaboration on larger-scale projects is stressed, to raise citizens' awareness and participation and to create a culture of internet volunteering altruism.

  3. The research of computer multimedia assistant in college English listening

    Science.gov (United States)

    Zhang, Qian

    2012-04-01

    With the development of network information technology, education faces more and more serious challenges. The application of computer multimedia breaks with traditional foreign language teaching and brings new challenges and opportunities to education. Through multimedia, the teaching process is enriched with animation, images, voice, and text, which can improve learners' initiative and motivation and greatly increase learning efficiency. Traditional foreign language teaching relies on text-based learning; with this method, theoretical performance is good but practical application is weak. Despite the long use of computer multimedia in foreign language teaching, many teachers remain prejudiced against it, so the method does not achieve its full effect. For these reasons, this research is significant for improving the quality of foreign language teaching.

  4. Dermal tumorigen PAH and complex mixtures for biological research

    International Nuclear Information System (INIS)

    Griest, W.H.; Guerin, M.R.; Ho, C.

    1985-01-01

    Thirteen commercially available, commonly reported four- to five-ring dermal tumorigen PAHs were determined in a set of complex mixtures consisting of crude and upgraded coal liquids, petroleum crude oils, and their distillate fractions. Semi-preparative scale, normal-phase high-performance liquid chromatographic fractionation followed by capillary column gas chromatography or gas chromatography-mass spectrometry was used for the measurements. Deuterated or carbon-14 labeled PAHs served as internal standards or allowed recovery corrections. Approaches for the preparation and measurement of radiolabeled PAHs were examined to provide chemical probes for biological study. Synthetic routes for production of carbon-14 labeled dihydrobenzo[a]pyrene and carbon-14 or tritium labeled 10-azabenzo[a]pyrene are being studied to provide tracers for fundamental studies in tracheal transplant and skin penetration systems. (DT)
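    For readers unfamiliar with the internal-standard approach mentioned above, the sketch below shows the standard peak-area calculation in a hedged, generic form; the numbers and the relative response factor are invented, not taken from this work.

        # Generic internal-standard quantification (illustrative values only).
        def quantify_with_internal_standard(area_analyte, area_is, conc_is, rrf):
            """Analyte concentration from peak areas, IS concentration and RRF."""
            return (area_analyte / area_is) * conc_is / rrf

        # Example: a PAH analyte quantified against a deuterated internal standard
        conc = quantify_with_internal_standard(area_analyte=1.8e5, area_is=2.4e5,
                                               conc_is=5.0, rrf=0.92)  # ug/mL
        print(f"estimated concentration: {conc:.2f} ug/mL")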

  5. Epigenetics in radiation biology: a new research frontier

    International Nuclear Information System (INIS)

    Agarwal, Sural

    2014-01-01

    The number of people exposed to ionizing radiation (IR) via occupational, diagnostic, or treatment-related modalities is progressively rising. It is now accepted that the negative consequences of radiation exposure are not isolated to exposed cells or individuals. Exposure to IR can induce genome instability in the germ line, and is further associated with transgenerational genomic instability in the offspring of exposed males. The exact molecular mechanisms of transgenerational genome instability have yet to be elucidated, although there is support for it being an epigenetically induced phenomenon. This review is centered on the long-term biological effects associated with IR exposure, mainly focusing on epigenetic mechanisms, and also considers whether dental radiology (IOPA, OPG, CT, MRI, CBCT) can lead to carcinogenesis. (author)

  6. Mixed-Methods Design in Biology Education Research: Approach and Uses

    Science.gov (United States)

    Warfa, Abdi-Rizak M.

    2016-01-01

    Educational research often requires mixing different research methodologies to strengthen findings, better contextualize or explain results, or minimize the weaknesses of a single method. This article provides practical guidelines on how to conduct such research in biology education, with a focus on mixed-methods research (MMR) that uses both…

  7. Northeast Cooperative Research Study Fleet (SF) Program Biological Sampling Data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Northeast Cooperative Research Study Fleet (SF) Program partners with a subset of commercial fishermen to collect high quality, high resolution, haul by haul...

  8. Social justice and research using human biological material: A ...

    African Journals Online (AJOL)

    generated from research to which they contributed; therefore, in effect ... Mahomed et al. employ the terms 'human tissue' and 'tissue donors'. ... in favour of shifting away from altruism; secondly, I caution against framing the debate in terms of ...

  9. Using research to teach an "introduction to biological thinking".

    Science.gov (United States)

    Bell, Ellis

    2011-01-01

    A course design for first-year science students is described, where the focus is on the skills necessary to do science. The course uses original research projects, designed by the students, to teach a variety of skills including reading the scientific literature, hypothesis development and testing, experimental design, data analysis and interpretation, and quantitative skills and presentation of the research in a variety of formats. Copyright © 2011 Wiley Periodicals, Inc.

  10. Geochemical, hydrological, and biological cycling of energy residual. Research plan

    International Nuclear Information System (INIS)

    Wobber, F.J.

    1983-03-01

    This plan describes proposed research goals and specific research areas designed to provide a base of fundamental scientific information, so that the geochemical, hydrological, and biophysical mechanisms that contribute to the transport and long-term fate of energy residuals in natural systems can be understood. Energy development and production have resulted in a need for advanced scientific information on the geochemical transformations, transport rates, and potential for bioaccumulation of contaminants in subsurface environments

  11. The RCSB Protein Data Bank: views of structural biology for basic and applied research and education.

    Science.gov (United States)

    Rose, Peter W; Prlić, Andreas; Bi, Chunxiao; Bluhm, Wolfgang F; Christie, Cole H; Dutta, Shuchismita; Green, Rachel Kramer; Goodsell, David S; Westbrook, John D; Woo, Jesse; Young, Jasmine; Zardecki, Christine; Berman, Helen M; Bourne, Philip E; Burley, Stephen K

    2015-01-01

    The RCSB Protein Data Bank (RCSB PDB, http://www.rcsb.org) provides access to 3D structures of biological macromolecules and is one of the leading resources in biology and biomedicine worldwide. Our efforts over the past 2 years focused on enabling a deeper understanding of structural biology and providing new structural views of biology that support both basic and applied research and education. Herein, we describe recently introduced data annotations including integration with external biological resources, such as gene and drug databases, new visualization tools and improved support for the mobile web. We also describe access to data files, web services and open access software components to enable software developers to more effectively mine the PDB archive and related annotations. Our efforts are aimed at expanding the role of 3D structure in understanding biology and medicine. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
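    A minimal sketch of the kind of programmatic access the article describes is given below; the endpoint and download URL follow current RCSB conventions (data.rcsb.org and files.rcsb.org) rather than the services as they existed in 2015, and should be checked against the live documentation.

        # Fetch entry metadata and coordinates for one PDB entry (example: 1CRN).
        import json
        import urllib.request

        pdb_id = "1CRN"

        # Entry-level metadata from the RCSB data API
        url = f"https://data.rcsb.org/rest/v1/core/entry/{pdb_id}"
        with urllib.request.urlopen(url) as response:
            entry = json.load(response)
        print(entry["struct"]["title"])

        # Coordinate file download from the archive file server
        urllib.request.urlretrieve(f"https://files.rcsb.org/download/{pdb_id}.pdb",
                                   f"{pdb_id}.pdb")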

  12. Computational Assessment of Pharmacokinetics and Biological Effects of Some Anabolic and Androgen Steroids.

    Science.gov (United States)

    Roman, Marin; Roman, Diana Larisa; Ostafe, Vasile; Ciorsac, Alecu; Isvoran, Adriana

    2018-02-05

    The aim of this study is to use computational approaches to predict the ADME-Tox profiles, pharmacokinetics, molecular targets, biological activity spectra, and side/toxic effects of 31 anabolic and androgenic steroids in humans. The following computational tools are used: (i) FAFDrugs4, SwissADME and admetSAR for obtaining the ADME-Tox profiles and for predicting pharmacokinetics; (ii) SwissTargetPrediction and PASS online for predicting the molecular targets and biological activities; (iii) PASS online, Toxtree, admetSAR and Endocrine Disruptome for envisaging the specific toxicities; (iv) SwissDock to assess the interactions of the investigated steroids with cytochromes involved in drug metabolism. The investigated steroids usually show high gastrointestinal absorption and good oral bioavailability, may inhibit some of the human cytochromes involved in the metabolism of xenobiotics (CYP2C9 being the most affected), and show a good capacity for skin penetration. Numerous side effects of the investigated steroids in humans are predicted: genotoxic carcinogenicity, hepatotoxicity, cardiovascular, hematotoxic and genitourinary effects, dermal irritation, endocrine disruption, and reproductive dysfunction. These results are important because occupational exposure to anabolic and androgenic steroids may occur in workplaces, and because there is also deliberate human exposure to steroids for their performance-enhancing and anti-aging properties.
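    As a hedged, local stand-in for the web-based profiling tools named above, the sketch below runs a simple Lipinski rule-of-five screen with RDKit; the SMILES string is testosterone, used here only as an example anabolic/androgenic scaffold, and this is not the study's actual pipeline.

        # Rule-of-five screen with RDKit as a rough oral-bioavailability proxy.
        from rdkit import Chem
        from rdkit.Chem import Crippen, Descriptors, Lipinski

        smiles = "CC12CCC3C(C1CCC2O)CCC4=CC(=O)CCC34C"  # testosterone (example)
        mol = Chem.MolFromSmiles(smiles)

        violations = sum([
            Descriptors.MolWt(mol) > 500,
            Crippen.MolLogP(mol) > 5,
            Lipinski.NumHDonors(mol) > 5,
            Lipinski.NumHAcceptors(mol) > 10,
        ])
        print(f"Lipinski violations: {violations} (0-1 is usually acceptable)")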

  13. Experimental and Computational Characterization of Biological Liquid Crystals: A Review of Single-Molecule Bioassays

    Directory of Open Access Journals (Sweden)

    Sungsoo Na

    2009-09-01

    Full Text Available Quantitative understanding of the mechanical behavior of biological liquid crystals such as proteins is essential for gaining insight into their biological functions, since some proteins perform notable mechanical functions. Recently, single-molecule experiments have allowed not only the quantitative characterization of the mechanical behavior of proteins, such as protein unfolding mechanics, but also the exploration of the free-energy landscape for protein folding. In this work, we review the current state of the art in single-molecule bioassays that enable quantitative studies of protein unfolding mechanics and/or various molecular interactions. Specifically, single-molecule pulling experiments based on atomic force microscopy (AFM) are surveyed, together with computational simulations of single-molecule pulling experiments. We also review the AFM cantilever-based bioassay, which provides insight into various molecular interactions. Our review highlights the AFM-based single-molecule bioassay for quantitative characterization of biological liquid crystals such as proteins.
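    To make the force-extension analysis mentioned above concrete, the sketch below evaluates the Marko-Siggia worm-like chain interpolation formula that is commonly fitted to AFM pulling data; the persistence and contour lengths are typical but assumed values, not results from the reviewed studies.

        # Worm-like chain (Marko-Siggia) force-extension relation, in pN and nm.
        kBT = 4.114            # thermal energy at ~298 K (pN*nm)
        persistence_nm = 0.4   # assumed persistence length of an unfolded chain
        contour_nm = 120.0     # assumed contour length of the unfolded protein

        def wlc_force(extension_nm):
            """Force predicted by F = (kBT/p) * (1/(4(1-x/L)^2) - 1/4 + x/L)."""
            x = extension_nm / contour_nm
            return (kBT / persistence_nm) * (0.25 / (1.0 - x) ** 2 - 0.25 + x)

        for ext in (30.0, 60.0, 90.0, 110.0):
            print(f"extension {ext:5.1f} nm -> force {wlc_force(ext):7.2f} pN")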

  14. Discovering local patterns of co - evolution: computational aspects and biological examples

    Directory of Open Access Journals (Sweden)

    Tuller Tamir

    2010-01-01

    Full Text Available Abstract Background Co-evolution is the process in which two (or more) sets of orthologs exhibit a similar or correlative pattern of evolution. Co-evolution is a powerful way to learn about the functional interdependencies between sets of genes and cellular functions and to predict physical interactions. More generally, it can be used to answer fundamental questions about the evolution of biological systems. Orthologs that exhibit a strong signal of co-evolution in a certain part of the evolutionary tree may show a mild signal of co-evolution in other branches of the tree. The major reasons for this phenomenon are noise in the biological input, genes that gain or lose functions, and the fact that some measures of co-evolution relate to rare events such as positive selection. Previous publications in the field dealt with the problem of finding sets of genes that co-evolved along an entire underlying phylogenetic tree, without considering the fact that co-evolution is often local. Results In this work, we describe a new set of biological problems related to finding patterns of local co-evolution. We discuss their computational complexity and design algorithms for solving them. These algorithms outperform other bi-clustering methods, as they are designed specifically for solving the set of problems mentioned above. We use our approach to trace the co-evolution of fungal, eukaryotic, and mammalian genes at high resolution across the different parts of the corresponding phylogenetic trees. Specifically, we discover regions in the fungi tree that are enriched with positive evolution. We show that metabolic genes exhibit a remarkable level of co-evolution and different patterns of co-evolution in various biological datasets. In addition, we find that protein complexes related to gene expression exhibit non-homogeneous levels of co-evolution across different parts of the fungal evolutionary line. In the case of mammalian evolution
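    The basic signal behind such analyses can be sketched as a correlation of per-branch evolutionary rates between two ortholog families, computed both globally and over a local subset of branches; the example below is an editorial illustration with invented rates, not the authors' algorithm.

        # Global vs. local co-evolution signal from per-branch rates (invented data).
        import numpy as np

        rates_gene_a = np.array([0.10, 0.12, 0.45, 0.50, 0.48, 0.11, 0.09, 0.13])
        rates_gene_b = np.array([0.20, 0.18, 0.90, 1.05, 0.95, 0.60, 0.10, 0.55])

        def coevolution_score(a, b, branches=None):
            """Pearson correlation of branch rates, optionally on a branch subset."""
            if branches is not None:
                a, b = a[branches], b[branches]
            return np.corrcoef(a, b)[0, 1]

        global_r = coevolution_score(rates_gene_a, rates_gene_b)
        local_r = coevolution_score(rates_gene_a, rates_gene_b, branches=slice(2, 5))
        print(f"global r = {global_r:.2f}, local r (branches 3-5) = {local_r:.2f}")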

  15. Correlation between crystallographic computing and artificial intelligence research

    Energy Technology Data Exchange (ETDEWEB)

    Feigenbaum, E A [Stanford Univ., CA; Engelmore, R S; Johnson, C K

    1977-01-01

    Artificial intelligence research, as a part of computer science, has produced a variety of programs of experimental and applications interest: programs for scientific inference, chemical synthesis, planning, robot control, extraction of meaning from English sentences, speech understanding, interpretation of visual images, and so on. The symbolic manipulation techniques used in artificial intelligence provide a framework for analyzing and coding the knowledge base of a problem independently of an algorithmic implementation. A possible application of artificial intelligence methodology to protein crystallography is described. 2 figures, 2 tables.

  16. More Ideas for Monitoring Biological Experiments with the BBC Computer: Absorption Spectra, Yeast Growth, Enzyme Reactions and Animal Behaviour.

    Science.gov (United States)

    Openshaw, Peter

    1988-01-01

    Presented are five ideas for A-level biology experiments using a laboratory computer interface. Topics investigated include photosynthesis, yeast growth, animal movements, pulse rates, and oxygen consumption and production by organisms. Includes instructions specific to the BBC computer system. (CW)

  17. The trajectory of dispersal research in conservation biology. Systematic review.

    Directory of Open Access Journals (Sweden)

    Don A Driscoll

    Full Text Available Dispersal knowledge is essential for conservation management, and demand is growing. But are we accumulating dispersal knowledge at a pace that can meet the demand? To answer this question we tested for changes in dispersal data collection and use over time. Our systematic review of 655 conservation-related publications compared five topics: climate change, habitat restoration, population viability analysis, land planning (systematic conservation planning) and invasive species. We analysed temporal changes in the: (i) questions asked by dispersal-related research; (ii) methods used to study dispersal; (iii) the quality of dispersal data; (iv) extent that dispersal knowledge is lacking, and; (v) likely consequences of limited dispersal knowledge. Research questions have changed little over time; the same problems examined in the 1990s are still being addressed. The most common methods used to study dispersal were occupancy data, expert opinion and modelling, which often provided indirect, low quality information about dispersal. Although use of genetics for estimating dispersal has increased, new ecological and genetic methods for measuring dispersal are not yet widely adopted. Almost half of the papers identified knowledge gaps related to dispersal. Limited dispersal knowledge often made it impossible to discover ecological processes or compromised conservation outcomes. The quality of dispersal data used in climate change research has increased since the 1990s. In comparison, restoration ecology inadequately addresses large-scale process, whilst the gap between knowledge accumulation and growth in applications may be increasing in land planning. To overcome apparent stagnation in collection and use of dispersal knowledge, researchers need to: (i) improve the quality of available data using new approaches; (ii) understand the complementarities of different methods and; (iii) define the value of different kinds of dispersal information for supporting

  18. The trajectory of dispersal research in conservation biology. Systematic review.

    Science.gov (United States)

    Driscoll, Don A; Banks, Sam C; Barton, Philip S; Ikin, Karen; Lentini, Pia; Lindenmayer, David B; Smith, Annabel L; Berry, Laurence E; Burns, Emma L; Edworthy, Amanda; Evans, Maldwyn J; Gibson, Rebecca; Heinsohn, Rob; Howland, Brett; Kay, Geoff; Munro, Nicola; Scheele, Ben C; Stirnemann, Ingrid; Stojanovic, Dejan; Sweaney, Nici; Villaseñor, Nélida R; Westgate, Martin J

    2014-01-01

    Dispersal knowledge is essential for conservation management, and demand is growing. But are we accumulating dispersal knowledge at a pace that can meet the demand? To answer this question we tested for changes in dispersal data collection and use over time. Our systematic review of 655 conservation-related publications compared five topics: climate change, habitat restoration, population viability analysis, land planning (systematic conservation planning) and invasive species. We analysed temporal changes in the: (i) questions asked by dispersal-related research; (ii) methods used to study dispersal; (iii) the quality of dispersal data; (iv) extent that dispersal knowledge is lacking, and; (v) likely consequences of limited dispersal knowledge. Research questions have changed little over time; the same problems examined in the 1990s are still being addressed. The most common methods used to study dispersal were occupancy data, expert opinion and modelling, which often provided indirect, low quality information about dispersal. Although use of genetics for estimating dispersal has increased, new ecological and genetic methods for measuring dispersal are not yet widely adopted. Almost half of the papers identified knowledge gaps related to dispersal. Limited dispersal knowledge often made it impossible to discover ecological processes or compromised conservation outcomes. The quality of dispersal data used in climate change research has increased since the 1990s. In comparison, restoration ecology inadequately addresses large-scale process, whilst the gap between knowledge accumulation and growth in applications may be increasing in land planning. To overcome apparent stagnation in collection and use of dispersal knowledge, researchers need to: (i) improve the quality of available data using new approaches; (ii) understand the complementarities of different methods and; (iii) define the value of different kinds of dispersal information for supporting management

  19. 2012 Gordon Research Conference on Cellular and Molecular Fungal Biology, Final Progress Report

    Energy Technology Data Exchange (ETDEWEB)

    Berman, Judith [Univ. of Minnesota, Minneapolis, MN (United States)

    2012-06-22

    The Gordon Research Conference on Cellular and Molecular Fungal Biology was held at Holderness School, Holderness, New Hampshire, June 17-22, 2012. The 2012 Gordon Conference on Cellular and Molecular Fungal Biology (CMFB) presented the latest, cutting-edge research in the exciting and growing field of molecular and cellular aspects of fungal biology. Topics ranged from yeast to filamentous fungi, from model systems to economically important organisms, and from saprophytes and commensals to pathogens of plants and animals. The CMFB conference featured a wide range of topics including systems biology, cell biology and morphogenesis, organismal interactions, genome organisation and regulation, pathogenesis, energy metabolism, biomass production and population genomics. The conference was well attended, with 136 participants. Gordon Research Conferences does not permit publication of meeting proceedings.

  20. 2013 Gordon Research Conference on metals in biology and seminar on bioinorganic chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Rosenzweig, Amy C. [Northwestern Univ., Evanston, IL (United States)

    2013-01-25

    Typical topics for lectures and posters include: biochemical and biophysical characterization of new metal containing proteins, enzymes, nucleic acids, factors, and chelators from all forms of life; synthesis, detailed characterization, and reaction chemistry of biomimetic compounds; novel crystal and solution structures of biological molecules and synthetic metal-chelates; discussions of the roles that metals play in medicine, maintenance of the environment, and biogeochemical processes; metal homeostasis; application of theory and computations to the structure and mechanism of metal-containing biological systems; and novel applications of spectroscopy to metals in biological systems.