WorldWideScience

Sample records for resource discovery tool

  1. Role of Open Source Tools and Resources in Virtual Screening for Drug Discovery.

    Science.gov (United States)

    Karthikeyan, Muthukumarasamy; Vyas, Renu

    2015-01-01

    Advances in chemoinformatics research, in parallel with the availability of high-performance computing platforms, have made it easier to handle large-scale, multi-dimensional scientific data for high-throughput drug discovery. In this study we have explored publicly available molecular databases with the help of open-source based, integrated in-house molecular informatics tools for virtual screening. The virtual screening literature of the past decade has been extensively investigated and thoroughly analyzed to reveal interesting patterns with respect to the drug, target, scaffold and disease space. The review also focuses on integrated chemoinformatics tools that are capable of harvesting chemical data from textual literature and transforming it into truly computable chemical structures, identifying unique fragments and scaffolds from a class of compounds, automatically generating focused virtual libraries, computing molecular descriptors for structure-activity relationship studies, applying the conventional filters used in lead discovery along with in-house developed exhaustive PTC (Pharmacophore, Toxicophore and Chemophore) filters, and applying machine learning tools for the design of potential disease-specific inhibitors. A case study on kinase inhibitors is provided as an example.
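    A minimal illustration of the kind of conventional lead-discovery filter mentioned above (a Lipinski-style property cut-off) is sketched below using the open-source RDKit toolkit; the authors' in-house PTC filters and descriptor pipeline are not publicly specified, so this is a generic, assumed example rather than their implementation.

        # Sketch of a generic Lipinski-style lead-discovery filter (illustrative only;
        # not the in-house PTC filters described in the abstract).
        from rdkit import Chem
        from rdkit.Chem import Descriptors

        def passes_rule_of_five(smiles: str) -> bool:
            """Return True if the molecule satisfies Lipinski's rule of five."""
            mol = Chem.MolFromSmiles(smiles)
            if mol is None:          # unparsable SMILES string
                return False
            return (Descriptors.MolWt(mol) <= 500
                    and Descriptors.MolLogP(mol) <= 5
                    and Descriptors.NumHDonors(mol) <= 5
                    and Descriptors.NumHAcceptors(mol) <= 10)

        # Example: screen a small list of SMILES strings before virtual screening.
        for smi in ["CC(=O)Oc1ccccc1C(=O)O", "CCCCCCCCCCCCCCCCCCCCCCCCCC"]:
            print(smi, passes_rule_of_five(smi))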

  2. Discovery of natural resources

    Science.gov (United States)

    Guild, P.W.

    1976-01-01

    Mankind will continue to need ores of more or less the types and grades used today to supply its needs for new mineral raw materials, at least until fusion or some other relatively cheap, inexhaustible energy source is developed. Most deposits being mined today were exposed at the surface or found by relatively simple geophysical or other prospecting techniques, but many of these will be depleted in the foreseeable future. The discovery of deeper or less obvious deposits to replace them will require the conjunction of science and technology to deduce the laws that governed the concentration of elements into ores and to detect and evaluate the evidence of their whereabouts. Great theoretical advances are being made to explain the origins of ore deposits and understand the general reasons for their localization. These advances have unquestionable value for exploration. Even a large deposit is, however, very small, and, with few exceptions, it was formed under conditions that have long since ceased to exist. The explorationist must suppress a great deal of "noise" to read and interpret correctly the "signals" that can define targets and guide the drilling required to find it. Is enough being done to ensure the long-term availability of mineral raw materials? The answer is probably no, in view of the expanding consumption and the difficulty of finding new deposits, but ingenuity, persistence, and continued development of new methods and tools to add to those already at hand should put off the day of "doing without" for many years. The possibility of resource exhaustion, especially in view of the long and increasing lead time needed to carry out basic field and laboratory studies in geology, geophysics, and geochemistry and to synthesize and analyze the information gained from them counsels against any letting down of our guard, however (17). Research and exploration by government, academia, and industry must be supported and encouraged; we cannot wait until an eleventh

  3. Discovery Mondays: Surveyors' Tools

    CERN Multimedia

    2003-01-01

    Surveyors of all ages, have your rulers and compasses at the ready! This sixth edition of Discovery Monday is your chance to learn about the surveyor's tools - the state of the art in measuring instruments - and see for yourself how they work. With their usual daunting precision, the members of CERN's Surveying Group have prepared some demonstrations and exercises for you to try. Find out the techniques for ensuring accelerator alignment and learn about high-tech metrology systems such as deviation indicators, tracking lasers and total stations. The surveyors will show you how they precisely measure magnet positioning, with accuracy of a few thousandths of a millimetre. You can try your hand at precision measurement using different types of sensor and a modern-day version of the Romans' bubble level, accurate to within a thousandth of a millimetre. You will learn that photogrammetry techniques can transform even a simple digital camera into a remarkable measuring instrument. Finally, you will have a chance t...

  4. The Role of School District Science Coordinators in the District-Wide Appropriation of an Online Resource Discovery and Sharing Tool for Teachers

    Science.gov (United States)

    Lee, Victor R.; Leary, Heather M.; Sellers, Linda; Recker, Mimi

    2014-06-01

    When introducing and implementing a new technology for science teachers within a school district, we must consider not only the end users but also the roles and influence district personnel have on the eventual appropriation of that technology. School districts are, by their nature, complex systems with multiple individuals at different levels in the organization who are involved in supporting and providing instruction. Varying levels of support for new technologies between district coordinators and teachers can sometimes lead to counterintuitive outcomes. In this article, we examine the role of the district science coordinator in five school districts that participated in the implementation of an online resource discovery and sharing tool for Earth science teachers. Using a qualitative approach, we conducted and coded interviews with district coordinators and teachers to examine the varied responsibilities associated with the district coordinator and to infer the relationships that were developed and perceived by teachers. We then examine and discuss two cases that illustrate how those relationships could have influenced how the tool was adopted and used to differing degrees in the two districts. Specifically, the district that had high support for online resource use from its coordinator appeared to have the lowest level of tool use, and the district with much less visible support from its coordinator had the highest level of tool use. We explain this difference in terms of how the coordinator's promotion of teacher autonomy took distinctly different forms at those two districts.

  5. Resource Discovery and Universal Access: Understanding Enablers and Barriers from the User Perspective

    OpenAIRE

    Beyene, Wondwossen

    2016-01-01

    Resource discovery tools are key to exploring, finding, and retrieving resources from the multitude of collections hosted by library and information systems. Modern resource discovery tools provide facet-rich interfaces that offer multiple alternatives for exposing resources to their potential users and help them navigate to the resources they need. This paper examines one of those tools from the perspective of universal access, ...

  6. Resource-estimation models and predicted discovery

    International Nuclear Information System (INIS)

    Hill, G.W.

    1982-01-01

    Resources have been estimated by predictive extrapolation from past discovery experience, by analogy with better explored regions, or by inference from evidence of depletion of targets for exploration. Changes in technology and new insights into geological mechanisms have occurred sufficiently often in the long run to form part of the pattern of mature discovery experience. The criterion, that a meaningful resource estimate needs an objective measure of its precision or degree of uncertainty, excludes 'estimates' based solely on expert opinion. This is illustrated by development of error measures for several persuasive models of discovery and production of oil and gas in the USA, both annually and in terms of increasing exploration effort. Appropriate generalizations of the models resolve many points of controversy. This is illustrated using two USA data sets describing discovery of oil and of U3O8; the latter set highlights an inadequacy of available official data. Review of the oil-discovery data set provides a warrant for adjusting the time-series prediction to a higher resource figure for USA petroleum. (author)
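    As a hedged illustration of the class of models referred to above (discovery modelled as a declining yield per unit of exploration effort), one classical form, not necessarily among those evaluated in the paper, is:

        % Cumulative discovery Q as a function of cumulative exploration effort h
        % (e.g., exploratory footage drilled); a and b are fitted constants.
        \[
          \frac{dQ}{dh} = a\,e^{-bh},
          \qquad
          Q(h) = \frac{a}{b}\bigl(1 - e^{-bh}\bigr),
          \qquad
          Q_{\infty} = \frac{a}{b},
        \]

    so the ultimately discoverable resource Q_infinity follows from the fitted decline, and an uncertainty measure for the estimate can be derived from the residuals of the fit.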

  7. Semantic distributed resource discovery for multiple resource providers

    NARCIS (Netherlands)

    Pittaras, C.; Ghijsen, M.; Wibisono, A.; Grosso, P.; van der Ham, J.; de Laat, C.

    2012-01-01

    An emerging modus operandi among providers of cloud infrastructures is one where they share and combine their heterogeneous resources to offer end-user services tailored to specific scientific and business needs. A challenge to overcome is the discovery of suitable resources among these multiple

  8. Resource Discovery within the Networked "Hybrid" Library.

    Science.gov (United States)

    Leigh, Sally-Anne

    This paper focuses on the development, adoption, and integration of resource discovery, knowledge management, and/or knowledge sharing interfaces such as interactive portals, and the use of the library's World Wide Web presence to increase the availability and usability of information services. The introduction addresses changes in library…

  9. Discovery and Use of Online Learning Resources: Case Study Findings

    Directory of Open Access Journals (Sweden)

    Laurie Miller Nelson

    2004-04-01

    Full Text Available Much recent research and funding have focused on building Internet-based repositories that contain collections of high-quality learning resources, often called ‘learning objects.’ Yet little is known about how non-specialist users, in particular teachers, find, access, and use digital learning resources. To address this gap, this article describes a case study of mathematics and science teachers’ practices and desires surrounding the discovery, selection, and use of digital library resources for instructional purposes. Findings suggest that the teacher participants used a broad range of search strategies in order to find resources that they deemed age-appropriate, current, and accurate. They intended to incorporate these resources with little modification into planned instructional activities. The article concludes with a discussion of the implications of the findings for improving the design of educational digital library systems, including tools supporting resource reuse.

  10. ATO Resource Tool -

    Data.gov (United States)

    Department of Transportation — Cru-X/ART is a shift management tool designed for use by operational employees in Air Traffic Facilities. Cru-X/ART is used for shift scheduling, shift sign in/out,...

  11. Usability Test Results for a Discovery Tool in an Academic Library

    Directory of Open Access Journals (Sweden)

    Jody Condit Fagan

    2008-03-01

    Full Text Available Discovery tools are emerging in libraries. These tools offer library patrons the ability to concurrently search the library catalog and journal articles. While vendors rush to provide feature-rich interfaces and access to as much content as possible, librarians wonder about the usefulness of these tools to library patrons. In order to learn about both the utility and usability of EBSCO Discovery Service, James Madison University conducted a usability test with eight students and two faculty members. The test consisted of nine tasks focused on common patron requests or related to the utility of specific discovery tool features. Software recorded participants’ actions and time on task, human observers judged the success of each task, and a post-survey questionnaire gathered qualitative feedback and comments from the participants.  Overall, participants were successful at most tasks, but specific usability problems suggested some interface changes for both EBSCO Discovery Service and JMU’s customizations of the tool.  The study also raised several questions for libraries above and beyond any specific discovery tool interface, including the scope and purpose of a discovery tool versus other library systems, working with the large result sets made possible by discovery tools, and navigation between the tool and other library services and resources.  This article will be of interest to those who are investigating discovery tools, selecting products, integrating discovery tools into a library web presence, or performing evaluations of similar systems.

  12. The Biomedical Resource Ontology (BRO) to enable resource discovery in clinical and translational research.

    Science.gov (United States)

    Tenenbaum, Jessica D; Whetzel, Patricia L; Anderson, Kent; Borromeo, Charles D; Dinov, Ivo D; Gabriel, Davera; Kirschner, Beth; Mirel, Barbara; Morris, Tim; Noy, Natasha; Nyulas, Csongor; Rubenson, David; Saxman, Paul R; Singh, Harpreet; Whelan, Nancy; Wright, Zach; Athey, Brian D; Becich, Michael J; Ginsburg, Geoffrey S; Musen, Mark A; Smith, Kevin A; Tarantal, Alice F; Rubin, Daniel L; Lyster, Peter

    2011-02-01

    The biomedical research community relies on a diverse set of resources, both within their own institutions and at other research centers. In addition, an increasing number of shared electronic resources have been developed. Without effective means to locate and query these resources, it is challenging, if not impossible, for investigators to be aware of the myriad resources available, or to effectively perform resource discovery when the need arises. In this paper, we describe the development and use of the Biomedical Resource Ontology (BRO) to enable semantic annotation and discovery of biomedical resources. We also describe the Resource Discovery System (RDS) which is a federated, inter-institutional pilot project that uses the BRO to facilitate resource discovery on the Internet. Through the RDS framework and its associated Biositemaps infrastructure, the BRO facilitates semantic search and discovery of biomedical resources, breaking down barriers and streamlining scientific research that will improve human health. Copyright © 2010 Elsevier Inc. All rights reserved.

  13. Facilitating NCAR Data Discovery by Connecting Related Resources

    Science.gov (United States)

    Rosati, A.

    2012-12-01

    Linking datasets, creators, and users by employing the proper standards helps to increase the impact of funded research. In order for users to find a dataset, it must first be named. Data citations play the important role of giving datasets a persistent presence by assigning a formal "name" and location. This project focuses on the next step of the "name-find-use" sequence: enhancing discoverability of NCAR data by connecting related resources on the web. Working from the metadata schemas that document datasets, I examined how Semantic Web approaches can help to ensure the widest possible range of data users. The focus was to move from search engine optimization (SEO) to information connectivity. Two main markup types are very visible in the Semantic Web and applicable to scientific dataset discovery: the Open Archives Initiative-Object Reuse and Exchange (OAI-ORE - www.openarchives.org) and Microdata (HTML5 and www.schema.org). My project creates pilot aggregations of related resources using both markup types for three case studies: the North American Regional Climate Change Assessment Program (NARCCAP) dataset and related publications, the Palmer Drought Severity Index (PDSI) animation and image files from NCAR's Visualization Lab (VisLab), and the multidisciplinary data types and formats from the Advanced Cooperative Arctic Data and Information Service (ACADIS). This project documents the differences between these markups and how each creates connectedness on the web. My recommendations point toward the most efficient and effective markup schema for aggregating resources within the three case studies, based on the following assessment criteria: ease of use, current state of support and adoption of the technology, integration with typical web tools, available vocabularies and geoinformatic standards, interoperability with current repositories and access portals (e.g. ESG, Java), and relation to data citation tools and methods.
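    As a sketch of the markup idea discussed above, the snippet below builds a schema.org Dataset description in Python and serializes it as JSON-LD; the project itself used HTML5 Microdata and OAI-ORE aggregations, and all names, identifiers and URLs here are placeholders rather than actual NCAR metadata.

        # Illustrative schema.org "Dataset" description (JSON-LD serialization).
        # Placeholder values only; the case studies above used Microdata/OAI-ORE markup.
        import json

        dataset = {
            "@context": "https://schema.org",
            "@type": "Dataset",
            "name": "Example regional climate model output",
            "description": "Hypothetical record linking a dataset to related "
                           "publications so web search engines can connect them.",
            "creator": {"@type": "Organization", "name": "Example data center"},
            "citation": ["https://doi.org/10.0000/placeholder"],      # related paper
            "distribution": [{
                "@type": "DataDownload",
                "encodingFormat": "application/x-netcdf",
                "contentUrl": "https://example.org/data/example.nc",  # placeholder URL
            }],
        }

        print(json.dumps(dataset, indent=2))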

  14. Discovery and Use of Online Learning Resources: Case Study Findings

    OpenAIRE

    Laurie Miller Nelson; James Dorward; Mimi M. Recker

    2004-01-01

    Much recent research and funding have focused on building Internet-based repositories that contain collections of high-quality learning resources, often called learning objects. Yet little is known about how non-specialist users, in particular teachers, find, access, and use digital learning resources. To address this gap, this article describes a case study of mathematics and science teachers practices and desires surrounding the discovery, selection, and use of digital library resources for...

  15. Resource Discovery in Activity-Based Sensor Networks

    DEFF Research Database (Denmark)

    Bucur, Doina; Bardram, Jakob

    This paper proposes a service discovery protocol for sensor networks that is specifically tailored for use in human-centered pervasive environments. It uses the high-level concept of computational activities (as logical bundles of data and resources) to give sensors in Activity-Based Sensor Networks (ABSNs) knowledge about their usage even at the network layer. ABSN redesigns classical network-level service discovery protocols to include and use this logical structuring of the network for a more practically applicable service discovery scheme. Noting that in practical settings activity-based sensor...

  16. Node Discovery and Interpretation in Unstructured Resource-Constrained Environments

    DEFF Research Database (Denmark)

    Gechev, Miroslav; Kasabova, Slavyana; Mihovska, Albena D.

    2014-01-01

    ... for the discovery, linking and interpretation of nodes in unstructured and resource-constrained network environments and their interrelated and collective use for the delivery of smart services. The model is based on a basic mathematical approach, which describes and predicts the success of human interactions in the context of long-term relationships, and identifies several key variables in the context of communications in resource-constrained environments. The general theoretical model is described and several algorithms are proposed as part of the node discovery, identification, and linking processes in relation...

  17. Resource Discovery in Activity-Based Sensor Networks

    DEFF Research Database (Denmark)

    Bucur, Doina; Bardram, Jakob

    This paper proposes a service discovery protocol for sensor networks that is specifically tailored for use in human-centered pervasive environments. It uses the high-level concept of computational activities (as logical bundles of data and resources) to give sensors in Activity-Based Sensor Networks (ABSNs) knowledge about their usage even at the network layer. ABSN redesigns classical network-level service discovery protocols to include and use this logical structuring of the network for a more practically applicable service discovery scheme. ABSN enhances the generic Extended Zone Routing Protocol with logical sensor grouping and greatly lowers network overhead during the process of discovery, while keeping discovery latency close to optimal.

  18. Laboratory informatics tools integration strategies for drug discovery: integration of LIMS, ELN, CDS, and SDMS.

    Science.gov (United States)

    Machina, Hari K; Wild, David J

    2013-04-01

    There are technologies on the horizon that could dramatically change how informatics organizations design, develop, deliver, and support applications and data infrastructures to deliver maximum value to drug discovery organizations. Effective integration of data and laboratory informatics tools promises to enable organizations to make better-informed decisions about resource allocation during the drug discovery and development process, and to make more informed decisions with respect to the market opportunity for compounds. We propose in this article a new integration model called ELN-centric laboratory informatics tools integration.

  19. Uranium Exploration (2004-2014): New Discoveries, New Resources

    International Nuclear Information System (INIS)

    Polak, Christian

    2014-01-01

    Conclusion: 10 years of discovery?
    • Large exploration effort;
    • Large amount of compliant resources discovered or confirmed;
    • New process development for low cost and for low grade;
    • New production from this effort still limited (< 10%);
    • Feasibility studies must confirm the viability of economic exploitation and therefore the quality of resources;
    • Consolidation to set up critical-mass deposits.
    ► To be ready for the coming decades (2020+)

  20. Tools & Resources | Efficient Windows Collaborative

    Science.gov (United States)

    Website navigation listing: window selection tool, selection process, design guidance, installation, assessing options, and understanding windows (benefits, design considerations, measuring performance, performance standards) for new-construction and replacement windows.

  1. Recent development of computational resources for new antibiotics discovery

    DEFF Research Database (Denmark)

    Kim, Hyun Uk; Blin, Kai; Lee, Sang Yup

    2017-01-01

    Understanding a complex working mechanism of biosynthetic gene clusters (BGCs) encoding secondary metabolites is a key to discovery of new antibiotics. Computational resources continue to be developed in order to better process increasing volumes of genome and chemistry data, and thereby better...

  2. Wide-Area Publish/Subscribe Mobile Resource Discovery Based on IPv6 GeoNetworking

    OpenAIRE

    Noguchi, Satoru; Matsuura, Satoshi; Inomata, Atsuo; Fujikawa, Kazutoshi; Sunahara, Hideki

    2013-01-01

    Resource discovery is an essential function for distributed mobile applications integrated in vehicular communication systems. Key requirements of the mobile resource discovery are wide-area geographic-based discovery and scalable resource discovery not only inside a vehicular ad-hoc network but also through the Internet. While a number of resource discovery solutions have been proposed, most of them have focused on specific scale of network. Furthermore, managing a large number of mobile res...

  3. Study of Tools for Network Discovery and Network Mapping

    Science.gov (United States)

    2003-11-01

    Report excerpts: ... connected to the switch. iv. Accessibility of historical data and event data: in general, network discovery tools keep a history of the collected... Has the following software dependencies: Java Virtual Machine, Perl modules, RRD Tool, Tomcat, PostgreSQL. Strengths: provides a simple view of the current network status, generates alarms on status change, and generates a history of status changes. Visual map...

  4. A Metadata Schema for Geospatial Resource Discovery Use Cases

    Directory of Open Access Journals (Sweden)

    Darren Hardy

    2014-07-01

    Full Text Available We introduce a metadata schema that focuses on GIS discovery use cases for patrons in a research library setting. Text search, faceted refinement, and spatial search and relevancy are among GeoBlacklight's primary use cases for federated geospatial holdings. The schema supports a variety of GIS data types and enables contextual, collection-oriented discovery applications as well as traditional portal applications. One key limitation of GIS resource discovery is the general lack of normative metadata practices, which has led to a proliferation of metadata schemas and duplicate records. The ISO 19115/19139 and FGDC standards specify metadata formats, but are intricate, lengthy, and not focused on discovery. Moreover, they require sophisticated authoring environments and cataloging expertise. Geographic metadata standards target preservation and quality measure use cases, but they do not provide for simple inter-institutional sharing of metadata for discovery use cases. To this end, our schema reuses elements from Dublin Core and GeoRSS to leverage their normative semantics, community best practices, open-source software implementations, and extensive examples already deployed in discovery contexts such as web search and mapping. Finally, we discuss a Solr implementation of the schema using a "geo" extension to MODS.
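    The flat, discovery-oriented flavour of such a schema can be sketched as a simple key-value record; the field names below are illustrative approximations borrowing Dublin Core and GeoRSS style terms, not the normative GeoBlacklight element list.

        # A hypothetical discovery record mixing Dublin Core-style and GeoRSS-style
        # fields; records of this shape can be indexed directly in Solr for text,
        # faceted and spatial search.
        record = {
            "dc_title": "Example city parcels layer",
            "dc_description": "Hypothetical record used to illustrate faceted and spatial discovery.",
            "dc_format": "Shapefile",
            "dc_rights": "Public",
            "dct_spatial": ["Example County"],            # place-name facet
            "dct_temporal": ["2014"],                     # temporal facet
            "georss_box": "37.70 -122.53 37.83 -122.35",  # min lat, min lon, max lat, max lon
        }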

  5. SNPServer: a real-time SNP discovery tool.

    Science.gov (United States)

    Savage, David; Batley, Jacqueline; Erwin, Tim; Logan, Erica; Love, Christopher G; Lim, Geraldine A C; Mongin, Emmanuel; Barker, Gary; Spangenberg, German C; Edwards, David

    2005-07-01

    SNPServer is a real-time flexible tool for the discovery of SNPs (single nucleotide polymorphisms) within DNA sequence data. The program uses BLAST, to identify related sequences, and CAP3, to cluster and align these sequences. The alignments are parsed to the SNP discovery software autoSNP, a program that detects SNPs and insertion/deletion polymorphisms (indels). Alternatively, lists of related sequences or pre-assembled sequences may be entered for SNP discovery. SNPServer and autoSNP use redundancy to differentiate between candidate SNPs and sequence errors. For each candidate SNP, two measures of confidence are calculated, the redundancy of the polymorphism at a SNP locus and the co-segregation of the candidate SNP with other SNPs in the alignment. SNPServer is available at http://hornbill.cspp.latrobe.edu.au/snpdiscovery.html.
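    The BLAST-then-CAP3 pipeline described above can be sketched roughly as follows; the command-line flags, file names and the downstream autoSNP step are simplified assumptions, not the server's actual code, and blastn/cap3 must already be installed.

        # Rough sketch of a redundancy-based SNP discovery pipeline:
        # 1) BLAST finds sequences related to the query, 2) CAP3 clusters and aligns
        # them, 3) a caller such as autoSNP parses the alignment for candidate SNPs.
        import subprocess

        query = "query.fasta"        # input sequence (assumed to exist)
        database = "est_db"          # pre-built BLAST nucleotide database (assumed)

        # 1. Collect identifiers of related sequences.
        subprocess.run(["blastn", "-query", query, "-db", database,
                        "-outfmt", "6 sseqid", "-out", "hits.txt"], check=True)

        # 2. Extract the hit sequences into a FASTA file.
        subprocess.run(["blastdbcmd", "-db", database,
                        "-entry_batch", "hits.txt", "-out", "related.fasta"], check=True)

        # 3. Cluster and align them; CAP3 writes an ACE alignment (related.fasta.cap.ace)
        #    that a redundancy-based SNP caller would then score for candidate SNPs.
        subprocess.run(["cap3", "related.fasta"], check=True)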

  6. The Discovery Dome: A Tool for Increasing Student Engagement

    Science.gov (United States)

    Brevik, Corinne

    2015-04-01

    The Discovery Dome is a portable full-dome theater that plays professionally-created science films. Developed by the Houston Museum of Natural Science and Rice University, this inflatable planetarium offers a state-of-the-art visual learning experience that can address many different fields of science for any grade level. It surrounds students with roaring dinosaurs, fascinating planets, and explosive storms - all immersive, engaging, and realistic. Dickinson State University has chosen to utilize its Discovery Dome to address Earth Science education at two levels. University courses across the science disciplines can use the Discovery Dome as part of their curriculum. The digital shows immerse the students in various topics ranging from astronomy to geology to weather and climate. The dome has proven to be a valuable tool for introducing new material to students as well as for reinforcing concepts previously covered in lectures or laboratory settings. The Discovery Dome also serves as an amazing science public-outreach tool. University students are trained to run the dome, and they travel with it to schools and libraries around the region. During the 2013-14 school year, our Discovery Dome visited over 30 locations. Many of the schools visited are in rural settings which offer students few opportunities to experience state-of-the-art science technology. The school kids are extremely excited when the Discovery Dome visits their community, and they will talk about the experience for many weeks. Traveling with the dome is also very valuable for the university students who get involved in the program. They become very familiar with the science content, and they gain experience working with teachers as well as the general public. They get to share their love of science, and they get to help inspire a new generation of scientists.

  7. Radio Resource Management for V2V Discovery

    DEFF Research Database (Denmark)

    Alvarez, Beatriz Soret; Gatnau, Marta; Kovács, Istvan

    2016-01-01

    Big expectations are placed on vehicular communications (V2X) for safer and more intelligent driving. With human lives at risk, the system cannot afford to fail, which translates into very stringent reliability and latency requirements on the radio network. One of the challenges is to find efficient radio resource management (RRM) strategies for direct vehicle-to-vehicle (V2V) communication that can fulfil the requirements even with high traffic density. In cellular networks, a device-to-device (D2D) communication is usually split into two phases: the discovery process, for node awareness of each other; and the communication phase itself, where data exchange takes place. In the case of V2V, the discovery phase can utilize the status information that cars broadcast periodically as beacons to detect the presence of neighbouring cars. For the delivery of specific messages (e...

  8. Bioinformatics Tools for the Discovery of New Nonribosomal Peptides

    DEFF Research Database (Denmark)

    Leclère, Valérie; Weber, Tilmann; Jacques, Philippe

    2016-01-01

    This chapter helps in the use of bioinformatics tools relevant to the discovery of new nonribosomal peptides (NRPs) produced by microorganisms. The strategy described can be applied to draft or fully assembled genome sequences. It relies on the identification of the synthetase genes and the deciphering of the domain architecture of the nonribosomal peptide synthetases (NRPSs). In the next step, candidate peptides synthesized by these NRPSs are predicted in silico, considering the specificity of incorporated monomers together with their isomery. To assess their novelty, the two-dimensional structure of the peptides can be compared with the structural patterns of all known NRPs. The presented workflow leads to an efficient and rapid screening of genomic data generated by high-throughput technologies. The exploration of such sequenced genomes may lead to the discovery of new drugs (i...

  9. Freely Accessible Chemical Database Resources of Compounds for in Silico Drug Discovery.

    Science.gov (United States)

    Yang, JingFang; Wang, Di; Jia, Chenyang; Wang, Mengyao; Hao, GeFei; Yang, GuangFu

    2018-05-07

    In silico drug discovery has proved to be a solidly established key component of early drug discovery. However, this task is hampered by limitations in the quantity and quality of compound databases available for screening. In order to overcome these obstacles, freely accessible database resources of compounds have bloomed in recent years. Nevertheless, choosing appropriate tools to treat these freely accessible databases is crucial. To the best of our knowledge, this is the first systematic review on this issue. The advantages and drawbacks of chemical databases were analyzed and summarized, based on the six categories of freely accessible chemical databases collected from the literature for this review. Suggestions on how and under which conditions the usage of these databases could be reasonable were provided. Tools and procedures for building 3D structure chemical libraries were also introduced. In this review, we described the freely accessible chemical database resources for in silico drug discovery. In particular, the chemical information for building chemical databases appears as an attractive resource for drug design to alleviate experimental pressure. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  10. GeoSearch: A lightweight broking middleware for geospatial resources discovery

    Science.gov (United States)

    Gui, Z.; Yang, C.; Liu, K.; Xia, J.

    2012-12-01

    With petabytes of geodata and thousands of geospatial web services available over the Internet, it is critical to support geoscience research and applications by finding the best-fit geospatial resources from the massive and heterogeneous resources. Past decades' developments witnessed the operation of many service components to facilitate geospatial resource management and discovery. However, efficient and accurate geospatial resource discovery is still a big challenge due to the following reasons: 1) The entry barriers (also called "learning curves") hinder the usability of discovery services to end users. Different portals and catalogues always adopt various access protocols, metadata formats and GUI styles to organize, present and publish metadata. It is hard for end users to learn all these technical details and differences. 2) The cost of federating heterogeneous services is high. To provide sufficient resources and facilitate data discovery, many registries adopt a periodic harvesting mechanism to retrieve metadata from other federated catalogues. These time-consuming processes lead to network and storage burdens, data redundancy, and also the overhead of maintaining data consistency. 3) The heterogeneous semantics issues in data discovery. Since keyword matching is still the primary search method in many operational discovery services, the search accuracy (precision and recall) is hard to guarantee. Semantic technologies (such as semantic reasoning and similarity evaluation) offer a solution to these issues. However, integrating semantic technologies with an existing service is challenging due to the expandability limitations of the service frameworks and metadata templates. 4) The capabilities to help users make a final selection are inadequate. Most of the existing search portals lack intuitive and diverse information visualization methods and functions (sort, filter) to present, explore and analyze search results. Furthermore, the presentation of the value

  11. Uranium exploration (2004-2014): New discoveries, new resources

    International Nuclear Information System (INIS)

    Polack, C.

    2014-01-01

    The last decade has demonstrated the dynamism of the mining industry in responding to the market's need to explore and discover new deposits. For the first time in the uranium industry, the effort was conducted not only by the majors but by numerous junior mining companies; more than 800 companies were involved. Junior miners introduced new methodologies, innovations and a fresh approach. Working mainly on former prospects of the 70's and 80's, they discovered new deposits, transformed historical resources into compliant resources and reserves, and developed new large resources in Africa, North America and Australia. In Australia, the Four Mile, Mt Gee, Samphire (SA), Mount Isa (Qld), Mulga Rock, Wiluna-Lake Maitland, and Carley Bore-Yanrey-Manyingee (WA) projects were all advanced to compliant resources or reserves by junior mining companies. In Canada, activity was mainly focused on the Athabasca Basin, Newfoundland and Québec, and the results are quite remarkable. In the Athabasca Basin, two new deposits were identified, Roughrider and Patterson South Lake, whilst in Québec the Matoush project and in Newfoundland the Michelin project are showing good potential. In Namibia, alaskite and surficial deposits extended the model of the Damaran Central belt, with the extension of the rich alaskites of Z20, Husab and Omahola and the large deposits of Etango and Norasa. A new mine, Langer Heinrich, commenced production, and two are well advanced on the way to production: Trekkopje and Husab. The ISL model continues its success in Central Asia with large discoveries in Mongolia and China. Europe has been revisited by some juniors, with an increase of resources in Spain (Salamanca) and Slovakia (Kuriskova). Some countries entered the uranium club with maiden resources, namely Mali (Falea), Mauritania and Peru (Macusani caldera). The Karoo formation has revitalised interest for exploration within Paraguay, South Africa (Rieskuil), Botswana (Letlhakane), Zambia (Mutanga, Chirundu) and the exploitation

  12. Big Biomedical data as the key resource for discovery science

    Energy Technology Data Exchange (ETDEWEB)

    Toga, Arthur W.; Foster, Ian; Kesselman, Carl; Madduri, Ravi; Chard, Kyle; Deutsch, Eric W.; Price, Nathan D.; Glusman, Gustavo; Heavner, Benjamin D.; Dinov, Ivo D.; Ames, Joseph; Van Horn, John; Kramer, Roger; Hood, Leroy

    2015-07-21

    Modern biomedical data collection is generating exponentially more data in a multitude of formats. This flood of complex data poses significant opportunities to discover and understand the critical interplay among such diverse domains as genomics, proteomics, metabolomics, and phenomics, including imaging, biometrics, and clinical data. The Big Data for Discovery Science Center is taking an “-ome to home” approach to discover linkages between these disparate data sources by mining existing databases of proteomic and genomic data, brain images, and clinical assessments. In support of this work, the authors developed new technological capabilities that make it easy for researchers to manage, aggregate, manipulate, integrate, and model large amounts of distributed data. Guided by biological domain expertise, the Center’s computational resources and software will reveal relationships and patterns, aiding researchers in identifying biomarkers for the most confounding conditions and diseases, such as Parkinson’s and Alzheimer’s.

  13. Big biomedical data as the key resource for discovery science.

    Science.gov (United States)

    Toga, Arthur W; Foster, Ian; Kesselman, Carl; Madduri, Ravi; Chard, Kyle; Deutsch, Eric W; Price, Nathan D; Glusman, Gustavo; Heavner, Benjamin D; Dinov, Ivo D; Ames, Joseph; Van Horn, John; Kramer, Roger; Hood, Leroy

    2015-11-01

    Modern biomedical data collection is generating exponentially more data in a multitude of formats. This flood of complex data poses significant opportunities to discover and understand the critical interplay among such diverse domains as genomics, proteomics, metabolomics, and phenomics, including imaging, biometrics, and clinical data. The Big Data for Discovery Science Center is taking an "-ome to home" approach to discover linkages between these disparate data sources by mining existing databases of proteomic and genomic data, brain images, and clinical assessments. In support of this work, the authors developed new technological capabilities that make it easy for researchers to manage, aggregate, manipulate, integrate, and model large amounts of distributed data. Guided by biological domain expertise, the Center's computational resources and software will reveal relationships and patterns, aiding researchers in identifying biomarkers for the most confounding conditions and diseases, such as Parkinson's and Alzheimer's. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  14. Open science resources for the discovery and analysis of Tara Oceans data.

    Science.gov (United States)

    Pesant, Stéphane; Not, Fabrice; Picheral, Marc; Kandels-Lewis, Stefanie; Le Bescot, Noan; Gorsky, Gabriel; Iudicone, Daniele; Karsenti, Eric; Speich, Sabrina; Troublé, Romain; Dimier, Céline; Searson, Sarah

    2015-01-01

    The Tara Oceans expedition (2009-2013) sampled contrasting ecosystems of the world oceans, collecting environmental data and plankton, from viruses to metazoans, for later analysis using modern sequencing and state-of-the-art imaging technologies. It surveyed 210 ecosystems in 20 biogeographic provinces, collecting over 35,000 samples of seawater and plankton. The interpretation of such an extensive collection of samples in their ecological context requires means to explore, assess and access raw and validated data sets. To address this challenge, the Tara Oceans Consortium offers open science resources, including the use of open access archives for nucleotides (ENA) and for environmental, biogeochemical, taxonomic and morphological data (PANGAEA), and the development of online discovery tools and collaborative annotation tools for sequences and images. Here, we present an overview of Tara Oceans Data, and we provide detailed registries (data sets) of all campaigns (from port-to-port), stations and sampling events.

  15. BEAM web server: a tool for structural RNA motif discovery.

    Science.gov (United States)

    Pietrosanto, Marco; Adinolfi, Marta; Casula, Riccardo; Ausiello, Gabriele; Ferrè, Fabrizio; Helmer-Citterich, Manuela

    2018-03-15

    RNA structural motif finding is a relevant problem that becomes computationally hard when working on high-throughput data (e.g. eCLIP, PAR-CLIP), often represented by thousands of RNA molecules. Currently, the BEAM server is the only web tool capable of handling tens of thousands of RNAs as input, with a motif discovery procedure that is limited only by current secondary structure prediction accuracies. The recently developed method BEAM (BEAr Motifs finder) can analyze tens of thousands of RNA molecules and identify RNA secondary structure motifs associated with a measure of their statistical significance. BEAM is extremely fast thanks to the BEAR encoding, which transforms each RNA secondary structure into a string of characters. BEAM also exploits the evolutionary knowledge contained in a substitution matrix of secondary structure elements, extracted from the RFAM database of families of homologous RNAs. The BEAM web server has been designed to streamline data pre-processing by automatically handling folding and encoding of RNA sequences, giving users a choice of the preferred folding program. The server provides an intuitive and informative results page with the list of secondary structure motifs identified, the logo of each motif, its significance, a graphic representation and information about its position in the RNA molecules sharing it. The web server is freely available at http://beam.uniroma2.it/ and it is implemented in NodeJS and Python with all major browsers supported. marco.pietrosanto@uniroma2.it. Supplementary data are available at Bioinformatics online.

  16. Biomarkers as drug development tools: discovery, validation, qualification and use.

    Science.gov (United States)

    Kraus, Virginia B

    2018-06-01

    The 21st Century Cures Act, approved in the USA in December 2016, has encouraged the establishment of the national Precision Medicine Initiative and the augmentation of efforts to address disease prevention, diagnosis and treatment on the basis of a molecular understanding of disease. The Act adopts into law the formal process, developed by the FDA, of qualification of drug development tools, including biomarkers and clinical outcome assessments, to increase the efficiency of clinical trials and encourage an era of molecular medicine. The FDA and European Medicines Agency (EMA) have developed similar processes for the qualification of biomarkers intended for use as companion diagnostics or for development and regulatory approval of a drug or therapeutic. Biomarkers that are used exclusively for the diagnosis, monitoring or stratification of patients in clinical trials are not subject to regulatory approval, although their qualification can facilitate the conduct of a trial. In this Review, the salient features of biomarker discovery, analytical validation, clinical qualification and utilization are described in order to provide an understanding of the process of biomarker development and, through this understanding, convey an appreciation of their potential advantages and limitations.

  17. Research resources: curating the new eagle-i discovery system

    Science.gov (United States)

    Vasilevsky, Nicole; Johnson, Tenille; Corday, Karen; Torniai, Carlo; Brush, Matthew; Segerdell, Erik; Wilson, Melanie; Shaffer, Chris; Robinson, David; Haendel, Melissa

    2012-01-01

    Development of biocuration processes and guidelines for new data types or projects is a challenging task. Each project finds its way toward defining annotation standards and ensuring data consistency with varying degrees of planning and different tools to support and/or report on consistency. Further, this process may be data type specific even within the context of a single project. This article describes our experiences with eagle-i, a 2-year pilot project to develop a federated network of data repositories in which unpublished, unshared or otherwise ‘invisible’ scientific resources could be inventoried and made accessible to the scientific community. During the course of eagle-i development, the main challenges we experienced related to the difficulty of collecting and curating data while the system and the data model were simultaneously built, and a deficiency and diversity of data management strategies in the laboratories from which the source data was obtained. We discuss our approach to biocuration and the importance of improving information management strategies to the research process, specifically with regard to the inventorying and usage of research resources. Finally, we highlight the commonalities and differences between eagle-i and similar efforts with the hope that our lessons learned will assist other biocuration endeavors. Database URL: www.eagle-i.net PMID:22434835

  18. Nora: A Vocabulary Discovery Tool for Concept Extraction.

    Science.gov (United States)

    Divita, Guy; Carter, Marjorie E; Durgahee, B S Begum; Pettey, Warren E; Redd, Andrew; Samore, Matthew H; Gundlapalli, Adi V

    2015-01-01

    Coverage of terms in domain-specific terminologies and ontologies is often limited in controlled medical vocabularies. Creating and augmenting such terminologies is resource intensive. We developed Nora as an interactive tool to discover terminology from text corpora; the output can then be employed to refine and enhance natural language processing-based concept extraction tasks. Nora provides a visualization of chains of words foraged from word frequency indexes from a text corpus. Domain experts direct and curate chains that contain relevant terms, which are further curated to identify lexical variants. A test of Nora demonstrated an increase of a domain lexicon in homelessness and related psychosocial factors by 38%, yielding an additional 10% extracted concepts.
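    The chain-foraging idea described above can be approximated with a few lines of Python; this is an assumed, simplified reconstruction for illustration, not the Nora implementation.

        # Build word n-gram counts from a tiny corpus and list chains containing a
        # seed term, in descending frequency, for a curator to accept or reject.
        from collections import Counter

        corpus = [
            "patient is homeless and staying at a shelter",
            "homeless veteran staying at an emergency shelter",
            "no stable housing, currently homeless",
        ]

        def ngram_counts(texts, n):
            counts = Counter()
            for text in texts:
                words = text.lower().split()
                counts.update(tuple(words[i:i + n]) for i in range(len(words) - n + 1))
            return counts

        seed = "homeless"
        for n in (1, 2, 3):
            chains = sorted(((c, " ".join(g)) for g, c in ngram_counts(corpus, n).items()
                             if seed in g), reverse=True)
            for count, chain in chains[:3]:
                print(n, count, chain)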

  19. Estimating long-term uranium resource availability and discovery requirements. A Canadian case study

    International Nuclear Information System (INIS)

    Martin, H.L.; Azis, A.; Williams, R.M.

    1979-01-01

    Well-founded estimates of the rate at which a country's resources might be made available are a prime requisite for energy planners and policy makers at the national level. To meet this need, a method is discussed that can aid in the analysis of future supply patterns of uranium and other metals. Known sources are first appraised, on a mine-by-mine basis, in relation to projected domestic needs and expectable export levels. The gap between (a) production from current and anticipated mines, and (b) production levels needed to meet both domestic needs and export opportunities, would have to be met by new sources. Using as measuring sticks the resources and production capabilities of typical uranium deposits, a measure can be obtained of the required timing and magnitude of discovery needs. The new discoveries, when developed into mines, would need to be sufficient to meet not only any shortfalls in production capability, but also any special reserve requirements as stipulated, for example, under Canada's uranium export guidelines. Since the method can be followed simply and quickly, it can serve as a valuable tool for long-term supply assessments of any mineral commodity from a nation's mines. (author)

  20. OpenSearch technology for geospatial resources discovery

    Science.gov (United States)

    Papeschi, Fabrizio; Enrico, Boldrini; Mazzetti, Paolo

    2010-05-01

    set of services for discovery, access, and processing of geospatial resources in a SOA framework. GI-cat is a distributed CSW framework implementation developed by the ESSI Lab of the Italian National Research Council (CNR-IMAA) and the University of Florence. It provides brokering and mediation functionalities towards heterogeneous resources and inventories, exposing several standard interfaces for query distribution. This work focuses on a new GI-cat interface which allows the catalog to be queried according to the OpenSearch syntax specification, thus filling the gap between the SOA architectural design of the CSW and the Web 2.0. At the moment, there is no OGC standard specification about this topic, but an official change request has been proposed in order to enable the OGC catalogues to support OpenSearch queries. In this change request, an OpenSearch extension is proposed providing a standard mechanism to query a resource based on temporal and geographic extents. Two new catalog operations are also proposed, in order to publish a suitable OpenSearch interface. This extended interface is implemented by the modular GI-cat architecture adding a new profiling module called "OpenSearch profiler". Since GI-cat also acts as a clearinghouse catalog, another component called "OpenSearch accessor" is added in order to access OpenSearch compliant services. An important role in the GI-cat extension, is played by the adopted mapping strategy. Two different kind of mappings are required: query, and response elements mapping. Query mapping is provided in order to fit the simple OpenSearch query syntax to the complex CSW query expressed by the OGC Filter syntax. GI-cat internal data model is based on the ISO-19115 profile, that is more complex than the simple XML syndication formats, such as RSS 2.0 and Atom 1.0, suggested by OpenSearch. Once response elements are available, in order to be presented, they need to be translated from the GI-cat internal data model, to the above
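    A client-side query against such an OpenSearch interface might look like the sketch below; the endpoint URL is hypothetical and the parameter names assume the common OpenSearch Geo and Time extension template (the real names must be read from the service's OpenSearch description document).

        # Hedged sketch of an OpenSearch query with free-text, spatial and temporal terms.
        import requests

        params = {
            "q": "sea surface temperature",     # searchTerms
            "bbox": "-10.0,35.0,30.0,60.0",     # geo:box = west,south,east,north
            "start": "2009-01-01",              # time:start
            "end": "2009-12-31",                # time:end
        }
        resp = requests.get("https://catalog.example.org/opensearch",
                            params=params, timeout=30)
        resp.raise_for_status()
        print(resp.text[:500])                  # Atom/RSS feed of matching records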

  1. Paths of Discovery: Comparing the Search Effectiveness of EBSCO Discovery Service, Summon, Google Scholar, and Conventional Library Resources

    Science.gov (United States)

    Asher, Andrew D.; Duke, Lynda M.; Wilson, Suzanne

    2013-01-01

    In 2011, researchers at Bucknell University and Illinois Wesleyan University compared the search efficacy of Serial Solutions Summon, EBSCO Discovery Service, Google Scholar, and conventional library databases. Using a mixed-methods approach, qualitative and quantitative data were gathered on students' usage of these tools. Regardless of the…

  2. Open discovery: An integrated live Linux platform of Bioinformatics tools.

    Science.gov (United States)

    Vetrivel, Umashankar; Pilla, Kalabharath

    2008-01-01

    Historically, live Linux distributions for bioinformatics have paved the way for portability of the bioinformatics workbench in a platform-independent manner. However, most existing live Linux distributions limit their usage to sequence analysis and basic molecular visualization programs and are devoid of data persistence. Hence, Open Discovery, a live Linux distribution, has been developed with the capability to perform complex tasks such as molecular modeling, docking and molecular dynamics in a swift manner. Furthermore, it is also equipped with a complete sequence analysis environment and is capable of running Windows executable programs in a Linux environment. Open Discovery portrays the advanced customizable configuration of Fedora, with data persistency accessible via USB drive or DVD. Open Discovery is distributed free under the Academic Free License (AFL) and can be downloaded from http://www.OpenDiscovery.org.in.

  3. NETL's Energy Data Exchange (EDX) - a coordination, collaboration, and data resource discovery platform for energy science

    Science.gov (United States)

    Rose, K.; Rowan, C.; Rager, D.; Dehlin, M.; Baker, D. V.; McIntyre, D.

    2015-12-01

    Multi-organizational research teams working jointly on projects often encounter problems with discovery, access to relevant existing resources, and data sharing, due to large file sizes, inappropriate file formats, or other inefficiencies that make collaboration difficult. The Energy Data eXchange (EDX) from the Department of Energy's (DOE) National Energy Technology Laboratory (NETL) is an evolving online research environment designed to overcome these challenges in support of DOE's fossil energy goals, while offering improved access to data-driven products of fossil energy R&D such as datasets, tools, and web applications. Development of EDX was initiated in 2011; it offers i) a means of better preserving NETL's research and development products for future access and re-use, ii) efficient, discoverable access to authoritative, relevant external resources, and iii) an improved approach and tools to support secure, private collaboration and coordination between multi-organizational teams to meet DOE missions and goals. EDX presently supports fossil energy and SubTER Crosscut research activities, with an ever-growing user base. EDX is built on a heavily customized instance of the open source platform Comprehensive Knowledge Archive Network (CKAN). EDX connects users to externally relevant data and tools by connecting to external data repositories built on different platforms as well as other CKAN platforms (e.g. Data.gov). EDX does not download and repost data or tools that already have an online presence, since doing so leads to redundancy and even error. If a relevant resource already has an online instance hosted by another entity, EDX points users to that external host using web services, inventoried URLs and other methods. EDX offers users the ability to leverage private, secure capabilities custom built into the system. The team is presently working on version 3 of EDX, which will incorporate big data analytical
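    Because EDX is built on CKAN, a catalogue search can in principle be issued through the standard CKAN Action API, as in the hedged sketch below; the base URL and the assumption that this endpoint is publicly exposed are placeholders, not documented EDX behaviour.

        # Querying a CKAN-based catalogue with the standard package_search action.
        import requests

        CKAN_BASE = "https://edx.example.gov"   # placeholder base URL
        resp = requests.get(f"{CKAN_BASE}/api/3/action/package_search",
                            params={"q": "carbon storage", "rows": 5}, timeout=30)
        resp.raise_for_status()

        result = resp.json()["result"]
        print("matching datasets:", result["count"])
        for pkg in result["results"]:
            print("-", pkg["title"])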

  4. mySearch changed my life – a resource discovery journey

    OpenAIRE

    Crowley, Emma J.

    2013-01-01

    Presentation outline: mySearch: the federated years; mySearch: choosing a new platform; mySearch: EBSCO Discovery Service (EDS); implementing a new system; technical challenges; has resource discovery enhanced experiences at BU?; ongoing challenges; implications for library management systems; implications for information literacy; questions.

  5. Discovery of resources using MADM approaches for parallel and distributed computing

    Directory of Open Access Journals (Sweden)

    Mandeep Kaur

    2017-06-01

    Full Text Available Grid, a form of parallel and distributed computing, allows the sharing of data and computational resources among its users from various geographical locations. Grid resources are diverse in terms of their underlying attributes. The majority of state-of-the-art resource discovery techniques rely on static resource attributes during resource selection. However, resources matched on static attributes may not be the most appropriate for the execution of user applications, because they may have heavy job loads, less storage space or less working memory (RAM). Hence, there is a need to consider the current state of the resources in order to find the most suitable ones. In this paper, we have proposed a two-phased multi-attribute decision making (MADM) approach for the discovery of grid resources using a P2P formalism. The proposed approach considers multiple resource attributes in the decision making for resource selection and provides the best suitable resource(s) to grid users. The first phase describes a mechanism to discover all matching resources and applies the SAW method to shortlist the top-ranked resources, which are communicated to the requesting super-peer. The second phase of the proposed methodology applies an integrated MADM approach (AHP-enriched PROMETHEE-II) to the list of selected resources received from different super-peers. A pairwise comparison of the resources with respect to their attributes is made and the rank of each resource is determined. The top-ranked resource is then communicated to the grid user by the grid scheduler. The proposed methodology enables the grid scheduler to allocate the most suitable resource to the user application and also reduces the search complexity by filtering out the less suitable resources during resource discovery.
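    The first-phase SAW (simple additive weighting) ranking described above can be sketched as follows; the attribute values, weights and benefit/cost split are illustrative assumptions, not figures from the paper.

        # Simple additive weighting over normalised attributes: benefit attributes are
        # scaled as v/max, cost attributes (lower is better) as min/v.
        resources = {
            "node-A": {"cpu_ghz": 2.4, "ram_gb": 16, "storage_gb": 500,  "load": 0.7},
            "node-B": {"cpu_ghz": 3.2, "ram_gb": 32, "storage_gb": 250,  "load": 0.4},
            "node-C": {"cpu_ghz": 2.8, "ram_gb": 8,  "storage_gb": 1000, "load": 0.2},
        }
        weights = {"cpu_ghz": 0.3, "ram_gb": 0.3, "storage_gb": 0.2, "load": 0.2}
        cost_attrs = {"load"}

        def saw_score(name):
            score = 0.0
            for attr, w in weights.items():
                values = [r[attr] for r in resources.values()]
                v = resources[name][attr]
                norm = min(values) / v if attr in cost_attrs else v / max(values)
                score += w * norm
            return score

        ranking = sorted(resources, key=saw_score, reverse=True)
        print(ranking)   # most suitable resource first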

  6. Paths of discovery: Comparing the search effectiveness of EBSCO Discovery Service, Summon, Google Scholar, and conventional library resources.

    Directory of Open Access Journals (Sweden)

    Müge Akbulut

    2015-09-01

    As the number of scientific publications increases, it is becoming hard for users to select significant sources from among many others (Henning and Gunn, 2012). Search engines that use cloud computing methods, such as Google, can successfully list related documents that answer user requirements (Johnson, Levine and Smith, 2009). In order to meet users' increasing demands, libraries have started to use systems that enable users to access printed and electronic sources through a single interface. This study uses quantitative and qualitative methods to compare search effectiveness between the Serials Solutions Summon and EBSCO Discovery Service (EDS) web discovery tools, Google Scholar (GS) and conventional library databases among users from Bucknell University and Illinois Wesleyan University.

  7. RDA: Resource Description and Access: The new standard for metadata and resource discovery in the digital age

    Directory of Open Access Journals (Sweden)

    Carlo Bianchini

    2015-01-01

    RDA (Resource Description and Access) is going to promote a great change. In fact, the guidelines – rather than rules – are addressed to anyone who wishes to describe and make accessible a cultural heritage collection, or simply a collection: librarians, archivists, curators and professionals in any other branch of knowledge. The work is organized in two parts: the former contains the theoretical foundations of cataloguing (FRBR, ICP, semantic web and linked data), the latter a critical presentation of the RDA guidelines. RDA aims to make possible the creation of well-structured metadata for any kind of resource, reusable in any context and technological environment. RDA offers a "set of guidelines and instructions to create data for discovery of resources". The guidelines stress four actions – to identify, to relate (from the FRBR/FRAD user tasks and ICP), to represent and to discover – and a noun: resource. To identify entities of Group 1 and Group 2 of FRBR (Work, Expression, Manifestation, Item, Person, Family, Corporate Body); to relate entities of Group 1 and Group 2 of FRBR by means of relationships; to enable users to represent and discover entities of Group 1 and Group 2 by means of their attributes and relationships. These last two actions are the reason for users' searches, and users are the focal point of the process. RDA enables the discovery of recorded knowledge, that is, any resource conveying information, any resource transmitting intellectual or artistic content by means of any kind of carrier and media. RDA is a content standard, not a display standard nor an encoding standard: it gives instructions for identifying data and does not prescribe how the data produced under the guidelines are displayed or encoded. RDA requires an original approach, a metanoia, a deep change in the way we think about cataloguing. The innovations in RDA are many: it promotes interoperability between catalogs and other search tools, and it adopts the terminology and concepts of the Semantic Web.

  8. Scientific workflows as productivity tools for drug discovery.

    Science.gov (United States)

    Shon, John; Ohkawa, Hitomi; Hammer, Juergen

    2008-05-01

    Large pharmaceutical companies annually invest tens to hundreds of millions of US dollars in research informatics to support their early drug discovery processes. Traditionally, most of these investments are designed to increase the efficiency of drug discovery. The introduction of do-it-yourself scientific workflow platforms has enabled research informatics organizations to shift their efforts toward scientific innovation, ultimately resulting in a possible increase in return on their investments. Unlike the handling of most scientific data and application integration approaches, researchers apply scientific workflows to in silico experimentation and exploration, leading to scientific discoveries that lie beyond automation and integration. This review highlights some key requirements for scientific workflow environments in the pharmaceutical industry that are necessary for increasing research productivity. Examples of the application of scientific workflows in research and a summary of recent platform advances are also provided.

  9. A Linked Data Approach for the Discovery of Educational ICT Tools in the Web of Data

    Science.gov (United States)

    Ruiz-Calleja, Adolfo; Vega-Gorgojo, Guillermo; Asensio-Perez, Juan I.; Bote-Lorenzo, Miguel L.; Gomez-Sanchez, Eduardo; Alario-Hoyos, Carlos

    2012-01-01

    The use of Information and Communication Technologies (ICT) tools to support learning activities is nowadays generalized. Several educational registries provide information about ICT tools in order to help educators in their discovery and selection. These registries are typically isolated and require much effort to keep tool information up to…

  10. Discovery and Use of Online Learning Resources: Case Study Findings

    Science.gov (United States)

    Recker, Mimi M.; Dorward, James; Nelson, Laurie Miller

    2004-01-01

    Much recent research and funding have focused on building Internet-based repositories that contain collections of high-quality learning resources, often called "learning objects." Yet little is known about how non-specialist users, in particular teachers, find, access, and use digital learning resources. To address this gap, this article…

  11. Mouse Models for Drug Discovery. Can New Tools and Technology Improve Translational Power?

    Science.gov (United States)

    Zuberi, Aamir; Lutz, Cathleen

    2016-01-01

    The use of mouse models in biomedical research and preclinical drug evaluation is on the rise. The advent of new molecular genome-altering technologies such as CRISPR/Cas9 allows genetic mutations to be introduced into the germ line of a mouse faster and less expensively than previous methods. In addition, the rapid progress in the development and use of somatic transgenesis using viral vectors, as well as manipulations of gene expression with siRNAs and antisense oligonucleotides, allows for even greater exploration into genomics and systems biology. These technological advances come at a time when cost reductions in genome sequencing have led to the identification of pathogenic mutations in patient populations, providing unprecedented opportunities in the use of mice to model human disease. The ease of genetic engineering in mice also offers a potential paradigm shift in resource sharing and the speed with which models are made available in the public domain. Predictably, the knowledge alone that a model can be quickly remade will provide relief to resources encumbered by licensing and Material Transfer Agreements. For decades, mouse strains have provided an exquisite experimental tool to study the pathophysiology of disease and assess therapeutic options in a genetically defined system. However, a major limitation of the mouse has been the limited genetic diversity associated with common laboratory mice. This has been overcome with the recent development of the Collaborative Cross and Diversity Outbred mice. These strains provide new tools capable of replicating genetic diversity approaching that found in human populations. The Collaborative Cross and Diversity Outbred strains thus provide a means to observe and characterize the toxicity or efficacy of new therapeutic drugs for a given population. The combination of traditional and contemporary mouse genome editing tools, along with the addition of genetic diversity in new modeling…

  12. Silicon Detectors-Tools for Discovery in Particle Physics

    International Nuclear Information System (INIS)

    Krammer, Manfred

    2009-01-01

    Since the first application of silicon strip detectors in high energy physics in the early 1980s, these detectors have enabled experiments to perform new and challenging measurements. With these devices it became possible to determine the decay lengths of heavy quarks, for example in the fixed-target experiment NA11 at CERN. In this experiment silicon tracking detectors were used for the identification of particles containing a c-quark. Later on, the experiments at the Large Electron Positron collider at CERN already used larger and more sophisticated assemblies of silicon detectors to identify and study particles containing the b-quark. A very important contribution to the discovery of the last of the six quarks, the top quark, was made by even larger silicon vertex detectors inside the CDF and D0 experiments at Fermilab. Nowadays a mature detector technology, the use of silicon detectors is no longer restricted to the vertex regions of collider experiments. The two multipurpose experiments ATLAS and CMS at the Large Hadron Collider at CERN contain large tracking detectors made of silicon. The largest is the CMS Inner Tracker, consisting of 200 m² of silicon sensor area. These detectors will be very important for a possible discovery of the Higgs boson or of supersymmetric particles. This paper explains the first applications of silicon sensors in particle physics and describes the continuous development of this technology up to the construction of the state-of-the-art silicon detector of CMS.

  13. Pharmacogenetics in type 2 diabetes: precision medicine or discovery tool?

    Science.gov (United States)

    Florez, Jose C

    2017-05-01

    In recent years, technological and analytical advances have led to an explosion in the discovery of genetic loci associated with type 2 diabetes. However, their ability to improve prediction of disease outcomes beyond standard clinical risk factors has been limited. On the other hand, genetic effects on drug response may be stronger than those commonly seen for disease incidence. Pharmacogenetic findings may aid in identifying new drug targets, elucidate pathophysiology, unravel disease heterogeneity, help prioritise specific genes in regions of genetic association, and contribute to personalised or precision treatment. In diabetes, precedent for the successful application of pharmacogenetic concepts exists in its monogenic subtypes, such as MODY or neonatal diabetes. Whether similar insights will emerge for the much more common entity of type 2 diabetes remains to be seen. As genetic approaches advance, the progressive deployment of candidate gene, large-scale genotyping and genome-wide association studies has begun to produce suggestive results that may transform clinical practice. However, many barriers to the translation of diabetes pharmacogenetic discoveries to the clinic still remain. This perspective offers a contemporary overview of the field with a focus on sulfonylureas and metformin, identifies the major uses of pharmacogenetics, and highlights potential limitations and future directions.

  14. Recent development in software and automation tools for high-throughput discovery bioanalysis.

    Science.gov (United States)

    Shou, Wilson Z; Zhang, Jun

    2012-05-01

    Bioanalysis with LC-MS/MS has been established as the method of choice for quantitative determination of drug candidates in biological matrices in drug discovery and development. LC-MS/MS bioanalytical support for drug discovery, especially early discovery, often requires high-throughput (HT) analysis of large numbers of samples (hundreds to thousands per day) generated from many structurally diverse compounds (tens to hundreds per day) with a very quick turnaround time, in order to provide important activity and liability data to move discovery projects forward. Another important consideration for discovery bioanalysis is its fit-for-purpose quality requirement, which depends on the particular experiments being conducted at this stage and is usually not as stringent as that required of bioanalysis supporting drug development. These attributes make HT discovery bioanalysis an ideal candidate for software and automation tools that eliminate manual steps, remove bottlenecks, improve efficiency and reduce turnaround time while maintaining adequate quality. In this article we will review various recent developments that facilitate automation of individual bioanalytical procedures, such as sample preparation, MS/MS method development, sample analysis and data review, as well as fully integrated software tools that manage the entire bioanalytical workflow in HT discovery bioanalysis. In addition, software tools supporting the emerging high-resolution accurate-mass MS bioanalytical approach are also discussed.

  15. Evaluating Music Discovery Tools on Spotify: The Role of User Preference Characteristics

    Directory of Open Access Journals (Sweden)

    Muh-Chyun Tang

    2017-06-01

    An experimental study was conducted to assess the effectiveness of the four music discovery tools available on Spotify, a popular music streaming service, namely radio recommendations, regional charts, genres and moods, and following Facebook friends. Both subjective judgments of user experience and objective measures of search effectiveness were used as the performance criteria. In addition to comparing these four tools, we also examined how consistent these performance measures are. The results show that the user experience criteria did not necessarily correspond to search effectiveness. Furthermore, three user preference characteristics (preference diversity, preference insight, and openness to novelty) were introduced as mediating variables, with the aim of investigating how these attributes might interact with the four music discovery tools to affect performance. The results suggest that users' preference characteristics did have an impact on the performance of these music discovery tools.

  16. Lambda-Display: A Powerful Tool for Antigen Discovery

    Directory of Open Access Journals (Sweden)

    Nicola Gargano

    2011-04-01

    Since its introduction in 1985, phage display technology has been successfully used in projects aimed at deciphering biological processes and isolating molecules of practical value in several applications. Bacteriophage lambda, representing a classical molecular cloning and expression system, has also been exploited for generating large combinatorial libraries of small peptides and protein domains exposed on its capsid. More recently, lambda display has been consistently and successfully employed for domain mapping, antigen discovery and protein interaction studies or, more generally, in functional genomics. We show here the results obtained by the use of large libraries of cDNA and genomic DNA for the molecular dissection of the human B-cell response against complex pathogens, including protozoan parasites, bacteria and viruses. Moreover, by reviewing the experimental work performed in recent investigations, we illustrate the potential of lambda display in the diagnostics field and for identifying antigens useful as targets for vaccine development.

  17. GIS Technology: Resource and Habitability Assessment Tool

    Data.gov (United States)

    National Aeronautics and Space Administration — We are applying Geographic Information Systems (GIS) to new orbital data sets for lunar resource assessment and the identification of past habitable environments on...

  18. Tools and data services registry: a community effort to document bioinformatics resources

    Science.gov (United States)

    Ison, Jon; Rapacki, Kristoffer; Ménager, Hervé; Kalaš, Matúš; Rydza, Emil; Chmura, Piotr; Anthon, Christian; Beard, Niall; Berka, Karel; Bolser, Dan; Booth, Tim; Bretaudeau, Anthony; Brezovsky, Jan; Casadio, Rita; Cesareni, Gianni; Coppens, Frederik; Cornell, Michael; Cuccuru, Gianmauro; Davidsen, Kristian; Vedova, Gianluca Della; Dogan, Tunca; Doppelt-Azeroual, Olivia; Emery, Laura; Gasteiger, Elisabeth; Gatter, Thomas; Goldberg, Tatyana; Grosjean, Marie; Grüning, Björn; Helmer-Citterich, Manuela; Ienasescu, Hans; Ioannidis, Vassilios; Jespersen, Martin Closter; Jimenez, Rafael; Juty, Nick; Juvan, Peter; Koch, Maximilian; Laibe, Camille; Li, Jing-Woei; Licata, Luana; Mareuil, Fabien; Mičetić, Ivan; Friborg, Rune Møllegaard; Moretti, Sebastien; Morris, Chris; Möller, Steffen; Nenadic, Aleksandra; Peterson, Hedi; Profiti, Giuseppe; Rice, Peter; Romano, Paolo; Roncaglia, Paola; Saidi, Rabie; Schafferhans, Andrea; Schwämmle, Veit; Smith, Callum; Sperotto, Maria Maddalena; Stockinger, Heinz; Vařeková, Radka Svobodová; Tosatto, Silvio C.E.; de la Torre, Victor; Uva, Paolo; Via, Allegra; Yachdav, Guy; Zambelli, Federico; Vriend, Gert; Rost, Burkhard; Parkinson, Helen; Løngreen, Peter; Brunak, Søren

    2016-01-01

    Life sciences are yielding huge data sets that underpin scientific discoveries fundamental to improvement in human health, agriculture and the environment. In support of these discoveries, a plethora of databases and tools are deployed, in technically complex and diverse implementations, across a spectrum of scientific disciplines. The corpus of documentation of these resources is fragmented across the Web, with much redundancy, and has lacked a common standard of information. The outcome is that scientists must often struggle to find, understand, compare and use the best resources for the task at hand. Here we present a community-driven curation effort, supported by ELIXIR—the European infrastructure for biological information—that aspires to a comprehensive and consistent registry of information about bioinformatics resources. The sustainable upkeep of this Tools and Data Services Registry is assured by a curation effort driven by and tailored to local needs, and shared amongst a network of engaged partners. As of November 2015, the registry includes 1785 resources, with depositions from 126 individual registrations including 52 institutional providers and 74 individuals. With community support, the registry can become a standard for dissemination of information about bioinformatics resources: we welcome everyone to join us in this common endeavour. The registry is freely available at https://bio.tools. PMID:26538599
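
    The registry described above can also be queried programmatically. The sketch below assumes a JSON REST endpoint at /api/tool/ accepting a free-text q parameter and returning matches under a 'list' field; these details are assumptions and should be checked against the registry's current API documentation.

      import requests

      # Sketch only: free-text search against the bio.tools registry.
      # Endpoint path and response field names ("list", "name", "homepage")
      # are assumptions; consult the registry's API docs before relying on them.
      def find_tools(query, limit=5):
          resp = requests.get("https://bio.tools/api/tool/",
                              params={"q": query, "format": "json"},
                              timeout=30)
          resp.raise_for_status()
          for entry in resp.json().get("list", [])[:limit]:
              print(entry.get("name"), "-", entry.get("homepage"))

      find_tools("multiple sequence alignment")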

  19. Citation Discovery Tools for Conducting Adaptive Meta-analyses to Update Systematic Reviews.

    Science.gov (United States)

    Bae, Jong-Myon; Kim, Eun Hee

    2016-03-01

    The systematic review (SR) is a research methodology that aims to synthesize related evidence. Updating previously conducted SRs is necessary when new evidence has been produced, but no consensus has yet emerged on the appropriate update methodology. The authors have developed a new SR update method called 'adaptive meta-analysis' (AMA) using the 'cited by', 'similar articles', and 'related articles' citation discovery tools in the PubMed and Scopus databases. This study evaluates the usefulness of these citation discovery tools for updating SRs. Lists were constructed by applying the citation discovery tools in the two databases to the articles analyzed by a published SR. The degree of overlap between the lists and distribution of excluded results were evaluated. The articles ultimately selected for the SR update meta-analysis were found in the lists obtained from the 'cited by' and 'similar' tools in PubMed. Most of the selected articles appeared in both the 'cited by' lists in Scopus and PubMed. The Scopus 'related' tool did not identify the appropriate articles. The AMA, which involves using both citation discovery tools in PubMed, and optionally, the 'related' tool in Scopus, was found to be useful for updating an SR.
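
    As a sketch of how the 'cited by' and 'similar articles' lists used in the AMA can be gathered programmatically, the code below calls NCBI's E-utilities elink service for a seed PubMed record; the seed PMID is arbitrary, and production use should add an API key and respect NCBI's rate limits.

      import requests

      EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/elink.fcgi"

      def linked_pmids(pmid, linkname):
          """Return PMIDs linked to a seed record via the given elink link name."""
          params = {"dbfrom": "pubmed", "db": "pubmed", "id": pmid,
                    "linkname": linkname, "retmode": "json"}
          data = requests.get(EUTILS, params=params, timeout=30).json()
          links = []
          for linkset in data.get("linksets", []):
              for db in linkset.get("linksetdbs", []):
                  links.extend(db.get("links", []))
          return links

      seed = "20301295"  # arbitrary example PMID, not from the study
      cited_by = linked_pmids(seed, "pubmed_pubmed_citedin")   # 'cited by'
      similar  = linked_pmids(seed, "pubmed_pubmed")           # 'similar articles'
      candidates = sorted(set(cited_by) | set(similar))
      print(len(candidates), "candidate records to screen for the update")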

  20. European Institutional and Organisational Tools for Maritime Human Resources Development

    OpenAIRE

    Dragomir Cristina

    2012-01-01

    Seafarers need to develop their careers continuously, at all stages of their professional lives. This paper presents some institutional and organisational career development tools. At the institutional level, it presents vocational education and training tools provided by European Union institutions, while at the organisational level it gives examples of tools used by private crewing companies for maritime human resources assessment and development.

  1. Computational tools for high-throughput discovery in biology

    OpenAIRE

    Jones, Neil Christopher

    2007-01-01

    High throughput data acquisition technology has inarguably transformed the landscape of the life sciences, in part by making possible---and necessary---the computational disciplines of bioinformatics and biomedical informatics. These fields focus primarily on developing tools for analyzing data and generating hypotheses about objects in nature, and it is in this context that we address three pressing problems in the fields of the computational life sciences which each require computing capaci...

  2. Resource Discovery: Comparative Results on Two Catalog Interfaces

    Directory of Open Access Journals (Sweden)

    Heather Hessel

    2012-06-01

    Like many libraries, the University of Minnesota Libraries-Twin Cities now offers a next-generation catalog alongside a traditional online public access catalog (OPAC). One year after the launch of its new platform as the default catalog, usage data for the OPAC remained relatively high, and anecdotal comments raised questions. In response, the Libraries conducted surveys that covered topics such as perceptions of success, known-item searching, preferred search environments, and desirable resource types. Results show distinct differences in the behavior of faculty, graduate student, and undergraduate survey respondents, and between library staff and non-library staff respondents. Both quantitative and qualitative data inform the analysis and conclusions.

  3. Estimation of uranium resources by life-cycle or discovery-rate models: a critique

    International Nuclear Information System (INIS)

    Harris, D.P.

    1976-10-01

    This report was motivated primarily by M. A. Lieberman's "United States Uranium Resources: An Analysis of Historical Data" (Science, April 30). His conclusion that only 87,000 tons of U3O8 resources recoverable at a forward cost of $8/lb remain to be discovered is criticized. It is shown that there is no theoretical basis for selecting the exponential or any other function for the discovery rate. Some of the economic (productivity, inflation) and data issues involved in basing the analysis of undiscovered, recoverable U3O8 resources on the discovery rates of $8 reserves are discussed. The problem of the ratio of undiscovered $30 resources to undiscovered $8 resources is considered. It is concluded that all methods for the estimation of unknown resources must employ a model of some form of the endowment-exploration-production complex, but every model is a simplification of the real world, and every estimate is intrinsically uncertain. The life-cycle model is useless for the appraisal of undiscovered, recoverable U3O8, and the discovery-rate model underestimates these resources.
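
    For readers unfamiliar with the class of model being criticized, the sketch below fits an exponential discovery-rate curve, Q(x) = Q_inf * (1 - exp(-x/k)), to synthetic cumulative-discovery data and reads off the implied ultimate resource; the numbers are invented and are not the report's data.

      import numpy as np
      from scipy.optimize import curve_fit

      # Cumulative discoveries Q as a function of cumulative exploration effort x.
      def discovery_rate(x, q_inf, k):
          return q_inf * (1.0 - np.exp(-x / k))

      effort = np.array([10, 20, 40, 80, 160, 320], dtype=float)    # e.g. 10^6 ft drilled
      found  = np.array([55, 95, 160, 230, 280, 300], dtype=float)  # kilotons discovered

      (q_inf, k), _ = curve_fit(discovery_rate, effort, found, p0=(400.0, 100.0))
      remaining = q_inf - found[-1]
      print(f"implied ultimate resource ~ {q_inf:.0f} kt; remaining ~ {remaining:.0f} kt")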

  4. A Community Assessment Tool for Education Resources

    Science.gov (United States)

    Hou, C. Y.; Soyka, H.; Hutchison, V.; Budden, A. E.

    2016-12-01

    In order to facilitate and enhance better understanding of how to conserve life on earth and the environment that sustains it, Data Observation Network for Earth (DataONE) develops, implements, and shares educational activities and materials as part of its commitment to the education of its community, including scientific researchers, educators, and the public. Creating and maintaining educational materials that remain responsive to community needs is reliant on careful evaluations in order to enhance current and future resources. DataONE's extensive collaboration with individuals and organizations has informed the development of its educational resources and through these interactions, the need for a comprehensive, customizable education evaluation instrument became apparent. In this presentation, the authors will briefly describe the design requirements and research behind a prototype instrument that is intended to be used by the community for evaluation of its educational activities and resources. We will then demonstrate the functionality of a web based platform that enables users to identify the type of educational activity across multiple axes. This results in a set of structured evaluation questions that can be included in a survey instrument. Users can also access supporting documentation describing the types of question included in the output or simply download a full editable instrument. Our aim is that by providing the community with access to a structured evaluation instrument, Earth/Geoscience educators will be able to gather feedback easily and efficiently in order to help maintain the quality, currency/relevancy, and value of their resources, and ultimately, support a more data literate community.

  5. The Climate Data Analysis Tools (CDAT): Scientific Discovery Made Easy

    Science.gov (United States)

    Doutriaux, C. M.; Williams, D. N.; Drach, R. S.; McCoy, R. B.; Mlaker, V.

    2008-12-01

    In recent years, the amount of data available to climate scientists has grown exponentially. Whether we are looking at the increasing number of organizations providing data, the finer resolutions of climate models, or the escalating number of experiments and realizations for those experiments, every aspect of climate research leads to an unprecedented growth in the volume of data to analyze. The recent success and visibility of the Intergovernmental Panel on Climate Change Fourth Assessment Report (IPCC AR4) has boosted demand to unprecedented levels and keeps the numbers increasing. Meanwhile, the technology available for scientists to analyze the data has remained largely unchanged since the early days. One tool, however, has proven itself flexible enough not only to follow the trend of escalating demand, but also to be ahead of the game: the Climate Data Analysis Tools (CDAT) from the Program for Climate Model Diagnosis and Intercomparison (PCMDI). While providing the cutting-edge technology necessary to distribute the IPCC AR4 data via the Earth System Grid, PCMDI has continuously evolved CDAT to handle new grids and higher resolutions, and to provide new diagnostics. In the near future, in time for AR5, PCMDI will use CDAT for state-of-the-art remote data analysis in a grid computing environment.
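
    A minimal sketch of a CDAT analysis step is shown below, assuming a CDAT installation and a CF-compliant NetCDF file containing a surface air temperature variable named 'tas'; the file name is hypothetical and the exact cdms2/cdutil options should be checked against the CDAT documentation.

      # Sketch under the assumptions stated above; not taken from the abstract.
      import cdms2
      import cdutil

      f = cdms2.open("tas_model.nc")                 # hypothetical model output file
      tas = f("tas")                                 # read the variable with its axes
      global_mean = cdutil.averager(tas, axis="xy")  # area-weighted spatial mean
      print(global_mean.shape)                       # one value per remaining time step
      f.close()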

  6. USMC Logistics Resource Allocation Optimization Tool

    Science.gov (United States)

    2015-12-01

    Pryor 2006). The background information included assumptions of the weight carried by pack animals vs. war horses and the amount of feed consumed...by different animals while being transported at sea versus open grazing on land. A book review of this workshop’s proceedings provides the essence...this study’s ability to model all aspects of mission, equipment types, failure modes, repair times, carcass cycles, etc., into a flexible tool

  7. Space technology in the discovery and development of mineral and energy resources

    Science.gov (United States)

    Lowman, P. D.

    1977-01-01

    Space technology, applied to the discovery and extraction of mineral and energy resources, is summarized. Orbital remote sensing for geological purposes has been widely applied through the use of LANDSAT satellites. These techniques also have been of value for protection against environmental hazards and for a better understanding of crustal structure.

  8. Mathematical modeling of physiological systems: an essential tool for discovery.

    Science.gov (United States)

    Glynn, Patric; Unudurthi, Sathya D; Hund, Thomas J

    2014-08-28

    Mathematical models are invaluable tools for understanding the relationships between components of a complex system. In the biological context, mathematical models help us understand the complex web of interrelations between various components (DNA, proteins, enzymes, signaling molecules etc.) in a biological system, gain better understanding of the system as a whole, and in turn predict its behavior in an altered state (e.g. disease). Mathematical modeling has enhanced our understanding of multiple complex biological processes like enzyme kinetics, metabolic networks, signal transduction pathways, gene regulatory networks, and electrophysiology. With recent advances in high throughput data generation methods, computational techniques and mathematical modeling have become even more central to the study of biological systems. In this review, we provide a brief history and highlight some of the important applications of modeling in biological systems with an emphasis on the study of excitable cells. We conclude with a discussion about opportunities and challenges for mathematical modeling going forward. In a larger sense, the review is designed to help answer a simple but important question that theoreticians frequently face from interested but skeptical colleagues on the experimental side: "What is the value of a model?" Copyright © 2014 Elsevier Inc. All rights reserved.
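
    As a toy instance of the kind of model the review discusses, the sketch below integrates a Michaelis-Menten enzyme-kinetics ODE with SciPy; the rate parameters and initial concentrations are illustrative only.

      import numpy as np
      from scipy.integrate import solve_ivp

      VMAX, KM = 1.0, 0.5   # illustrative max rate (mM/min) and Michaelis constant (mM)

      def rhs(t, y):
          s, p = y                   # substrate and product concentrations
          v = VMAX * s / (KM + s)    # Michaelis-Menten rate law
          return [-v, v]

      sol = solve_ivp(rhs, (0.0, 10.0), [2.0, 0.0], dense_output=True)
      t = np.linspace(0.0, 10.0, 6)
      print(dict(zip(t.round(1), sol.sol(t)[1].round(3))))  # product concentration over time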

  9. Mathematical Tools for Discovery of Nanoporous Materials for Energy Applications

    International Nuclear Information System (INIS)

    Haranczyk, M; Martin, R L

    2015-01-01

    Porous materials such as zeolites and metal-organic frameworks have been of growing importance as materials for energy-related applications such as CO2 capture, hydrogen and methane storage, and catalysis. Current state-of-the-art molecular simulations allow accurate in silico prediction of materials' properties, but the computational cost of such calculations prohibits their application in the characterisation of very large sets of structures, which would be required to perform brute-force screening. Our work focuses on the development of novel methodologies to efficiently characterize and explore this complex materials space. In particular, we have been developing algorithms and tools for the enumeration and characterisation of porous material databases as well as efficient screening approaches. Our methodology represents an ensemble of mathematical methods. We have used Voronoi tessellation-based techniques to enable high-throughput structure characterisation, statistical techniques to perform comparison and screening, and continuous optimisation to design materials. This article outlines our developments in material design.
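
    The sketch below illustrates the Voronoi-tessellation idea behind high-throughput pore characterisation: Voronoi vertices of the atom positions are locally farthest from any atom, so their nearest-atom distances approximate local pore radii. Random points stand in for a real crystal structure and periodic boundary conditions are ignored, so this is a conceptual illustration rather than the authors' actual implementation.

      import numpy as np
      from scipy.spatial import Voronoi, cKDTree

      rng = np.random.default_rng(0)
      atoms = rng.random((50, 3)) * 10.0   # fake atomic coordinates in a 10 x 10 x 10 box

      vor = Voronoi(atoms)
      tree = cKDTree(atoms)
      radii, _ = tree.query(vor.vertices)  # distance from each Voronoi vertex to nearest atom
      inside = np.all((vor.vertices > 0.0) & (vor.vertices < 10.0), axis=1)
      print(f"largest approximate pore radius: {radii[inside].max():.2f} length units")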

  10. Educational resources and tools for robotic learning

    Directory of Open Access Journals (Sweden)

    Pablo Gil Vazquez

    2012-07-01

    This paper discusses different teaching experiences whose aim is the learning of robotics at the university. These experiences are reflected in the development of several robotics courses and subjects at the University of Alicante. The authors have created various educational platforms or have used freely distributed, open-source tools for the implementation of these courses. The main objective of these courses is to teach the design and implementation of robotic solutions to various problems, covering not only the control, programming and handling of robots but also the assembly, building and programming of educational mini-robots. On the one hand, new teaching tools such as simulators and virtual labs are used to make the learning of robot arms more flexible. On the other hand, competitions are used to motivate students, because in this way the students put into action the skills learned through building and programming low-cost mini-robots.

  11. Mouse Models for Drug Discovery. Can New Tools and Technology Improve Translational Power?

    Science.gov (United States)

    Zuberi, Aamir; Lutz, Cathleen

    2016-12-01

    The use of mouse models in biomedical research and preclinical drug evaluation is on the rise. The advent of new molecular genome-altering technologies such as CRISPR/Cas9 allows genetic mutations to be introduced into the germ line of a mouse faster and less expensively than previous methods. In addition, the rapid progress in the development and use of somatic transgenesis using viral vectors, as well as manipulations of gene expression with siRNAs and antisense oligonucleotides, allows for even greater exploration into genomics and systems biology. These technological advances come at a time when cost reductions in genome sequencing have led to the identification of pathogenic mutations in patient populations, providing unprecedented opportunities in the use of mice to model human disease. The ease of genetic engineering in mice also offers a potential paradigm shift in resource sharing and the speed with which models are made available in the public domain. Predictably, the knowledge alone that a model can be quickly remade will provide relief to resources encumbered by licensing and Material Transfer Agreements. For decades, mouse strains have provided an exquisite experimental tool to study the pathophysiology of disease and assess therapeutic options in a genetically defined system. However, a major limitation of the mouse has been the limited genetic diversity associated with common laboratory mice. This has been overcome with the recent development of the Collaborative Cross and Diversity Outbred mice. These strains provide new tools capable of replicating genetic diversity approaching that found in human populations. The Collaborative Cross and Diversity Outbred strains thus provide a means to observe and characterize the toxicity or efficacy of new therapeutic drugs for a given population. The combination of traditional and contemporary mouse genome editing tools, along with the addition of genetic diversity in new modeling systems…

  12. Evaluation Tool for the Application of Discovery Teaching Method in the Greek Environmental School Projects

    Science.gov (United States)

    Kalathaki, Maria

    2015-01-01

    Greek school community emphasizes on the discovery direction of teaching methodology in the school Environmental Education (EE) in order to promote Education for the Sustainable Development (ESD). In ESD school projects the used methodology is experiential teamwork for inquiry based learning. The proposed tool checks whether and how a school…

  13. Augmented Reality-Based Simulators as Discovery Learning Tools: An Empirical Study

    Science.gov (United States)

    Ibáñez, María-Blanca; Di-Serio, Ángela; Villarán-Molina, Diego; Delgado-Kloos, Carlos

    2015-01-01

    This paper reports empirical evidence on having students use AR-SaBEr, a simulation tool based on augmented reality (AR), to discover the basic principles of electricity through a series of experiments. AR-SaBEr was enhanced with knowledge-based support and inquiry-based scaffolding mechanisms, which proved useful for discovery learning in…

  14. The use of web ontology languages and other semantic web tools in drug discovery.

    Science.gov (United States)

    Chen, Huajun; Xie, Guotong

    2010-05-01

    To optimize drug development processes, pharmaceutical companies require principled approaches to integrate disparate data on a unified infrastructure, such as the web. The semantic web, developed on the web technology, provides a common, open framework capable of harmonizing diversified resources to enable networked and collaborative drug discovery. We survey the state of art of utilizing web ontologies and other semantic web technologies to interlink both data and people to support integrated drug discovery across domains and multiple disciplines. Particularly, the survey covers three major application categories including: i) semantic integration and open data linking; ii) semantic web service and scientific collaboration and iii) semantic data mining and integrative network analysis. The reader will gain: i) basic knowledge of the semantic web technologies; ii) an overview of the web ontology landscape for drug discovery and iii) a basic understanding of the values and benefits of utilizing the web ontologies in drug discovery. i) The semantic web enables a network effect for linking open data for integrated drug discovery; ii) The semantic web service technology can support instant ad hoc collaboration to improve pipeline productivity and iii) The semantic web encourages publishing data in a semantic way such as resource description framework attributes and thus helps move away from a reliance on pure textual content analysis toward more efficient semantic data mining.
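
    A minimal sketch of the linked-data style of querying surveyed above is given below using rdflib; the vocabulary and triples are invented for illustration, whereas real drug-discovery applications would query published RDF datasets.

      from rdflib import Graph

      # Invented example vocabulary (ex:inhibits, ex:associatedWith) and facts.
      TTL = """
      @prefix ex: <http://example.org/> .
      ex:compound42 ex:inhibits        ex:EGFR .
      ex:EGFR       ex:associatedWith  ex:lung_cancer .
      ex:compound77 ex:inhibits        ex:BRAF .
      ex:BRAF       ex:associatedWith  ex:melanoma .
      """

      g = Graph()
      g.parse(data=TTL, format="turtle")

      q = """
      PREFIX ex: <http://example.org/>
      SELECT ?compound ?disease WHERE {
          ?compound ex:inhibits       ?target .
          ?target   ex:associatedWith ?disease .
      }
      """
      for compound, disease in g.query(q):
          print(compound, "->", disease)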

  15. Web-based tools from AHRQ's National Resource Center.

    Science.gov (United States)

    Cusack, Caitlin M; Shah, Sapna

    2008-11-06

    The Agency for Healthcare Research and Quality (AHRQ) has made an investment of over $216 million in research around health information technology (health IT). As part of their investment, AHRQ has developed the National Resource Center for Health IT (NRC) which includes a public domain Web site. New content for the web site, such as white papers, toolkits, lessons from the health IT portfolio and web-based tools, is developed as needs are identified. Among the tools developed by the NRC are the Compendium of Surveys and the Clinical Decision Support (CDS) Resources. The Compendium of Surveys is a searchable repository of health IT evaluation surveys made available for public use. The CDS Resources contains content which may be used to develop clinical decision support tools, such as rules, reminders and templates. This live demonstration will show the access, use, and content of both these freely available web-based tools.

  16. In silico tools used for compound selection during target-based drug discovery and development.

    Science.gov (United States)

    Caldwell, Gary W

    2015-01-01

    The target-based drug discovery process, including target selection, screening, hit-to-lead (H2L) and lead optimization stage gates, is the most common approach used in drug development. The full integration of in vitro and/or in vivo data with in silico tools across the entire process would be beneficial to R&D productivity by developing effective selection criteria and drug-design optimization strategies. This review focuses on understanding the impact and extent in the past 5 years of in silico tools on the various stage gates of the target-based drug discovery approach. There are a large number of in silico tools available for establishing selection criteria and drug-design optimization strategies in the target-based approach. However, the inconsistent use of in vitro and/or in vivo data integrated with predictive in silico multiparameter models throughout the process is contributing to R&D productivity issues. In particular, the lack of reliable in silico tools at the H2L stage gate is contributing to the suboptimal selection of viable lead compounds. It is suggested that further development of in silico multiparameter models and organizing biologists, medicinal and computational chemists into one team with a single accountable objective to expand the utilization of in silico tools in all phases of drug discovery would improve R&D productivity.

  17. Updates on resources, software tools, and databases for plant proteomics in 2016-2017.

    Science.gov (United States)

    Misra, Biswapriya B

    2018-02-08

    Proteomics data processing, annotation, and analysis can often lead to major hurdles in large-scale, high-throughput bottom-up proteomics experiments. Given the recent rise in the generation of protein-based big datasets, in silico tool development has increased at an unprecedented rate; so much so that it has become increasingly difficult to keep track of all the advances in a particular academic year. However, these tools benefit the plant proteomics community by circumventing critical issues in data analysis and visualization, and these continually developing open-source and community-developed tools hold potential for future research efforts. This review aims to introduce and summarize more than 50 software tools, databases, and resources developed and published during 2016-2017 under the following categories: tools for data pre-processing and analysis, statistical analysis tools, peptide identification tools, databases and spectral libraries, and data visualization and interpretation tools. Finally, intended for a well-informed proteomics community, efforts in data archiving and validation datasets for the community are discussed as well. Additionally, the author delineates the current and most commonly used proteomics tools in order to introduce novice readers to this -omics discovery platform. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Discovery of Sound in the Sea: Resources for Educators, Students, the Public, and Policymakers.

    Science.gov (United States)

    Vigness-Raposa, Kathleen J; Scowcroft, Gail; Miller, James H; Ketten, Darlene R; Popper, Arthur N

    2016-01-01

    There is increasing concern about the effects of underwater sound on marine life. However, the science of sound is challenging. The Discovery of Sound in the Sea (DOSITS) Web site ( http://www.dosits.org ) was designed to provide comprehensive scientific information on underwater sound for the public and educational and media professionals. It covers the physical science of underwater sound and its use by people and marine animals for a range of tasks. Celebrating 10 years of online resources, DOSITS continues to develop new material and improvements, providing the best resource for the most up-to-date information on underwater sound and its potential effects.

  19. Digital Resources in Instruction and Research: Assessing Faculty Discovery, Use and Needs--Final Summary Report

    Science.gov (United States)

    Tobias, Vicki

    2009-01-01

    In 2008, the Digital Initiatives Coordinating Committee (DICC) requested a comprehensive assessment of the UW Digital Collections (UWDC). The goal of this assessment was to better understand faculty awareness of and expectations for digital library resources, services and tools; obtain faculty feedback on digital resource and service needs that…

  20. Processes, Performance Drivers and ICT Tools in Human Resources Management

    OpenAIRE

    Oškrdal Václav; Pavlíček Antonín; Jelínková Petra

    2011-01-01

    This article presents an insight into processes, performance drivers and ICT tools in the human resources (HR) management area. On the basis of a modern approach to HR management, a set of business processes that are handled by today's HR managers is defined. Consequently, the concept of ICT-supported performance drivers and their relevance in the area of HR management, as well as the relationship between HR business processes, performance drivers and ICT tools, are defined. The theoretical outcomes ...

  1. User Driven Development of Software Tools for Open Data Discovery and Exploration

    Science.gov (United States)

    Schlobinski, Sascha; Keppel, Frank; Dihe, Pascal; Boot, Gerben; Falkenroth, Esa

    2016-04-01

    The use of open data in research faces challenges not restricted to inherent properties such as the quality and resolution of open data sets. Open data are often insufficiently catalogued or fragmented. Software tools that support effective discovery, including assessment of the data's appropriateness for research, have shortcomings such as a lack of essential functionality like support for data provenance. We believe that one of the reasons is the neglect of real end users' requirements in the development process of such software tools. In the context of the FP7 Switch-On project we have pro-actively engaged the relevant user community to collaboratively develop a means to publish, find and bind open data relevant for hydrologic research. Implementing key concepts of data discovery and exploration, we have used state-of-the-art web technologies to provide an interactive software tool that is easy to use yet powerful enough to satisfy the data discovery and access requirements of the hydrological research community.

  2. Electronic Safety Resource Tools -- Supporting Hydrogen and Fuel Cell Commercialization

    Energy Technology Data Exchange (ETDEWEB)

    Barilo, Nick F.

    2014-09-29

    The Pacific Northwest National Laboratory (PNNL) Hydrogen Safety Program conducted a planning session in Los Angeles, CA on April 1, 2014 to consider what electronic safety tools would benefit the next phase of hydrogen and fuel cell commercialization. A diverse, 20-person team led by an experienced facilitator considered the question as it applied to the eight most relevant user groups. The results and subsequent evaluation activities revealed several possible resource tools that could greatly benefit users. The tool identified as having the greatest potential for impact is a hydrogen safety portal, which can be the central location for integrating and disseminating safety information (including most of the tools identified in this report). Such a tool can provide credible and reliable information from a trustworthy source. Other impactful tools identified include a codes and standards wizard to guide users through a series of questions relating to application and specific features of the requirements; a scenario-based virtual reality training for first responders; peer networking tools to bring users from focused groups together to discuss and collaborate on hydrogen safety issues; and a focused tool for training inspectors. Table ES.1 provides results of the planning session, including proposed new tools and changes to existing tools.

  3. Modern ICT Tools: Online Electronic Resources Sharing Using Web ...

    African Journals Online (AJOL)

    Modern ICT Tools: Online Electronic Resources Sharing Using Web 2.0 and Its Implications For Library And Information Practice In Nigeria.

  4. MEAT: An Authoring Tool for Generating Adaptable Learning Resources

    Science.gov (United States)

    Kuo, Yen-Hung; Huang, Yueh-Min

    2009-01-01

    Mobile learning (m-learning) is a new trend in the e-learning field. The learning services in m-learning environments are supported by fundamental functions, especially the content and assessment services, which need an authoring tool to rapidly generate adaptable learning resources. To fulfill the imperious demand, this study proposes an…

  5. Forest resource projection tools at the European level

    NARCIS (Netherlands)

    Schelhaas, M.; Nabuurs, G.J.; Verkerk, P.J.; Hengeveld, G.M.; Packalen, Tuula; Sallnäs, O.; Pilli, Roberto; Grassi, J.; Forsell, Nicklas; Frank, S.; Gusti, Mykola; Havlik, Petr

    2017-01-01

    Many countries have developed their own systems for projecting forest resources and wood availability. Although studies using these tools are helpful for developing national policies, they do not provide a consistent assessment for larger regions such as the European Union or Europe as a whole.

  6. Systems pharmacology-based drug discovery for marine resources: an example using sea cucumber (Holothurians).

    Science.gov (United States)

    Guo, Yingying; Ding, Yan; Xu, Feifei; Liu, Baoyue; Kou, Zinong; Xiao, Wei; Zhu, Jingbo

    2015-05-13

    Sea cucumbers, a kind of marine animal, have long been utilized as tonics and traditional remedies in the Middle East and Asia because of their effectiveness against hypertension, asthma, rheumatism, cuts and burns, impotence, and constipation. In this study, an overall investigation of sea cucumber was used as an example of drug discovery from marine resources using a systems pharmacology model. The value of marine natural resources has been extensively considered because these resources can potentially be used to treat and prevent human diseases. However, drug discovery from the oceans is difficult because of the complexity of marine environments in terms of composition and active mechanisms. Thus, a comprehensive systems approach that can identify active constituents and their targets from marine resources and explain the biological basis of their pharmacological properties is necessary. In this study, a feasible pharmacological model based on systems pharmacology was established to investigate marine medicine by incorporating active compound screening, target identification, and network and pathway analysis. As a result, 106 candidate components of sea cucumber and 26 potential targets were identified. Furthermore, the functions of sea cucumber in health improvement and disease treatment were elucidated in a holistic way based on the established compound-target and target-disease networks and the incorporated pathways. This study established a novel strategy that could be used to explore specific active mechanisms and discover new drugs from marine sources. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
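
    The compound-target network step described above can be illustrated with a small graph sketch; the compound and target names below are placeholders rather than the study's actual screening results.

      import networkx as nx

      # Hypothetical compound-target edges; in the study these would come from the
      # screened sea cucumber constituents and their predicted protein targets.
      edges = [
          ("holothurin_A", "TNF"), ("holothurin_A", "PTGS2"),
          ("echinoside_A", "PTGS2"), ("frondoside_A", "AKT1"),
          ("frondoside_A", "TNF"),
      ]

      G = nx.Graph()
      G.add_edges_from(edges)

      # Rank targets by how many candidate compounds hit them (degree in the
      # bipartite compound-target graph).
      targets = {t for _, t in edges}
      ranking = sorted(((t, G.degree(t)) for t in targets), key=lambda kv: kv[1], reverse=True)
      print(ranking)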

  7. A Water Resources Planning Tool for the Jordan River Basin

    Directory of Open Access Journals (Sweden)

    Christopher Bonzi

    2011-06-01

    The Jordan River basin is subject to extreme and increasing water scarcity. Management of transboundary water resources in the basin is closely intertwined with political conflicts in the region. Together with stakeholders and experts from the riparian countries, we have jointly developed a new dynamic consensus database and—supported by hydro-climatological model simulations and participatory scenario exercises in the GLOWA (Global Change and the Hydrological Cycle) Jordan River project—a basin-wide Water Evaluation and Planning (WEAP) tool, which will allow testing of various unilateral and multilateral adaptation options under climate and socio-economic change. We present its validation and initial (climate and socio-economic) scenario analyses with this budget and allocation tool, and invite further adaptation and application of the tool for specific Integrated Water Resources Management (IWRM) problems.

  8. Updates in metabolomics tools and resources: 2014-2015.

    Science.gov (United States)

    Misra, Biswapriya B; van der Hooft, Justin J J

    2016-01-01

    Data processing and interpretation represent the most challenging and time-consuming steps in high-throughput metabolomic experiments, regardless of the analytical platform (MS or NMR spectroscopy based) used for data acquisition. Improved machinery in metabolomics generates increasingly complex datasets that create the need for more and better processing and analysis software and in silico approaches to understand the resulting data. However, a comprehensive source of information describing the utility of the most recently developed and released metabolomics resources (in the form of tools, software, and databases) is currently lacking. Thus, here we provide an overview of freely available, open-source tools, algorithms, and frameworks to make both upcoming and established metabolomics researchers aware of recent developments, in an attempt to advance and facilitate data processing workflows in their metabolomics research. The major topics include tools and resources for data processing, data annotation, and data visualization in MS- and NMR-based metabolomics. Most of the tools described in this review are dedicated to untargeted metabolomics workflows; however, some more specialist tools are described as well. All tools and resources described, including their analytical and computational platform dependencies, are summarized in an overview table. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. Bringing your tools to CyVerse Discovery Environment using Docker.

    Science.gov (United States)

    Devisetty, Upendra Kumar; Kennedy, Kathleen; Sarando, Paul; Merchant, Nirav; Lyons, Eric

    2016-01-01

    Docker has become a very popular container-based virtualization platform for software distribution that has revolutionized the way in which scientific software and software dependencies (software stacks) can be packaged, distributed, and deployed. Docker makes the complex and time-consuming installation procedures needed for scientific software a one-time process. Because it enables platform-independent installation, versioning of software environments, and easy redeployment and reproducibility, Docker is an ideal candidate for the deployment of identical software stacks on different compute environments such as XSEDE and Amazon AWS. CyVerse's Discovery Environment also uses Docker for integrating its powerful, community-recommended software tools into CyVerse's production environment for public use. This paper will help users bring their tools into the CyVerse Discovery Environment (DE), which not only allows users to integrate their tools with relative ease compared to the earlier method of tool deployment in the DE, but also helps users share their apps with collaborators and release them for public use.

  10. Bringing your tools to CyVerse Discovery Environment using Docker [version 1; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Upendra Kumar Devisetty

    2016-06-01

    Docker has become a very popular container-based virtualization platform for software distribution that has revolutionized the way in which scientific software and software dependencies (software stacks) can be packaged, distributed, and deployed. Docker makes the complex and time-consuming installation procedures needed for scientific software a one-time process. Because it enables platform-independent installation, versioning of software environments, and easy redeployment and reproducibility, Docker is an ideal candidate for the deployment of identical software stacks on different compute environments such as XSEDE and Amazon AWS. CyVerse's Discovery Environment also uses Docker for integrating its powerful, community-recommended software tools into CyVerse's production environment for public use. This paper will help users bring their tools into the CyVerse Discovery Environment (DE), which not only allows users to integrate their tools with relative ease compared to the earlier method of tool deployment in the DE, but also helps users share their apps with collaborators and release them for public use.

  11. Bigger Data, Collaborative Tools and the Future of Predictive Drug Discovery

    Science.gov (United States)

    Clark, Alex M.; Swamidass, S. Joshua; Litterman, Nadia; Williams, Antony J.

    2014-01-01

    Over the past decade we have seen a growth in the provision of chemistry data and cheminformatics tools as either free websites or software as a service (SaaS) commercial offerings. These have transformed how we find molecule-related data and use such tools in our research. There have also been efforts to improve collaboration between researchers either openly or through secure transactions using commercial tools. A major challenge in the future will be how such databases and software approaches handle larger amounts of data as it accumulates from high throughput screening and enables the user to draw insights, enable predictions and move projects forward. We now discuss how information from some drug discovery datasets can be made more accessible and how privacy of data should not overwhelm the desire to share it at an appropriate time with collaborators. We also discuss additional software tools that could be made available and provide our thoughts on the future of predictive drug discovery in this age of big data. We use some examples from our own research on neglected diseases, collaborations, mobile apps and algorithm development to illustrate these ideas. PMID:24943138

  12. Developing a distributed HTML5-based search engine for geospatial resource discovery

    Science.gov (United States)

    ZHOU, N.; XIA, J.; Nebert, D.; Yang, C.; Gui, Z.; Liu, K.

    2013-12-01

    With the explosive growth of data, Geospatial Cyberinfrastructure (GCI) components are developed to manage geospatial resources, such as data discovery and data publishing. However, the efficiency of geospatial resource discovery is still challenging in that: (1) existing GCIs are usually developed for users of specific domains, so users may have to visit a number of GCIs to find appropriate resources; (2) the complexity of the decentralized network environment usually results in slow response and poor user experience; (3) users who use different browsers and devices may have very different user experiences because of the diversity of front-end platforms (e.g. Silverlight, Flash or HTML). To address these issues, we developed a distributed, HTML5-based search engine. Specifically, (1) the search engine adopts a brokering approach to retrieve geospatial metadata from various distributed GCIs; (2) the asynchronous record retrieval mode enhances search performance and user interactivity; (3) the HTML5-based search engine is able to provide unified access capabilities for users with different devices (e.g. tablet and smartphone).
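
    A minimal sketch of the brokering idea described above, assuming hypothetical catalog endpoints and a JSON response shape: several distributed catalogs are queried concurrently and their metadata records are merged as each response arrives. This is illustrative only, not the authors' implementation.

        # Illustrative brokering search: query several (placeholder) catalog endpoints
        # concurrently and merge the metadata records as they arrive.
        from concurrent.futures import ThreadPoolExecutor, as_completed
        import requests

        CATALOGS = [  # hypothetical GCI catalog endpoints
            "https://catalog-a.example.org/search",
            "https://catalog-b.example.org/search",
        ]

        def query_catalog(url: str, keyword: str) -> list[dict]:
            resp = requests.get(url, params={"q": keyword, "format": "json"}, timeout=10)
            resp.raise_for_status()
            return resp.json().get("records", [])  # assumed response shape

        def broker_search(keyword: str) -> list[dict]:
            records = []
            with ThreadPoolExecutor(max_workers=len(CATALOGS)) as pool:
                futures = {pool.submit(query_catalog, url, keyword): url for url in CATALOGS}
                for future in as_completed(futures):  # merged as each catalog responds
                    try:
                        records.extend(future.result())
                    except requests.RequestException:
                        pass  # a slow or failed catalog should not block the others
            return records

        # Usage (with real endpoints in place of the placeholders):
        # records = broker_search("land cover")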

  13. Tools, techniques, organisation and culture of the CADD group at Sygnature Discovery.

    Science.gov (United States)

    St-Gallay, Steve A; Sambrook-Smith, Colin P

    2017-03-01

    Computer-aided drug design encompasses a wide variety of tools and techniques, and can be implemented with a range of organisational structures and focus in different organisations. Here we outline the computational chemistry skills within Sygnature Discovery, along with the software and hardware at our disposal, and briefly discuss the methods that are not employed and why. The goal of the group is to provide support for design and analysis in order to improve the quality of compounds synthesised and reduce the timelines of drug discovery projects, and we reveal how this is achieved at Sygnature. Impact on medicinal chemistry is vital to demonstrating the value of computational chemistry, and we discuss the approaches taken to influence the list of compounds for synthesis, and how we recognise success. Finally we touch on some of the areas being developed within the team in order to provide further value to the projects and clients.

  14. Helping Students Understand Gene Regulation with Online Tools: A Review of MEME and Melina II, Motif Discovery Tools for Active Learning in Biology

    Directory of Open Access Journals (Sweden)

    David Treves

    2012-08-01

    Review of: MEME and Melina II, which are two free and easy-to-use online motif discovery tools that can be employed to actively engage students in learning about gene regulatory elements.

  15. ATLAAS-P2P: a two layer network solution for easing the resource discovery process in unstructured networks

    OpenAIRE

    Baraglia, Ranieri; Dazzi, Patrizio; Mordacchini, Matteo; Ricci, Laura

    2013-01-01

    ATLAAS-P2P is a two-layered P2P architecture for developing systems that provide resource aggregation and approximated discovery in P2P networks. Such systems allow users to search for the desired resources by specifying their requirements in a flexible and easy way. From the point of view of resource providers, the system offers an effective means of making their resources reachable by resource requests.

  16. New tools and resources in metabolomics: 2016-2017.

    Science.gov (United States)

    Misra, Biswapriya B

    2018-04-01

    Rapid advances in mass spectrometry (MS)- and nuclear magnetic resonance (NMR)-based platforms for metabolomics have led to an upsurge of data every single year. Newer high-throughput platforms, hyphenated technologies, miniaturization, and tool kits for data acquisition in metabolomics have led to additional challenges in metabolomics data pre-processing, analysis, interpretation, and integration. Thanks to the informatics, statistics, and computational community, new resources continue to be developed for metabolomics researchers. The purpose of this review is to provide a summary of the metabolomics tools, software, and databases that were developed or improved during 2016-2017, thus giving readers, developers, and researchers access to a succinct but thorough list of resources for further improvement, implementation, and application in due course of time. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. search.bioPreprint: a discovery tool for cutting edge, preprint biomedical research articles [version 2; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Carrie L. Iwema

    2016-07-01

    The time it takes for a completed manuscript to be published traditionally can be extremely lengthy. Article publication delay, which occurs in part due to constraints associated with peer review, can prevent the timely dissemination of critical and actionable data associated with new information on rare diseases or developing health concerns such as Zika virus. Preprint servers are open access online repositories housing preprint research articles that enable authors (1) to make their research immediately and freely available and (2) to receive commentary and peer review prior to journal submission. There is a growing movement of preprint advocates aiming to change the current journal publication and peer review system, proposing that preprints catalyze biomedical discovery, support career advancement, and improve scientific communication. While the number of articles submitted to and hosted by preprint servers is gradually increasing, there has been no simple way to identify biomedical research published in a preprint format, as these articles are not typically indexed and are only discoverable by directly searching the specific preprint server websites. To address this issue, we created a search engine that quickly compiles preprints from disparate host repositories and provides a one-stop search solution. Additionally, we developed a web application that bolsters the discovery of preprints by enabling each and every word or phrase appearing on any web site to be integrated with articles from preprint servers. This tool, search.bioPreprint, is publicly available at http://www.hsls.pitt.edu/resources/preprint.

  18. CUAHSI Data Services: Tools and Cyberinfrastructure for Water Data Discovery, Research and Collaboration

    Science.gov (United States)

    Seul, M.; Brazil, L.; Castronova, A. M.

    2017-12-01

    Enabling research surrounding interdisciplinary topics often requires a combination of finding, managing, and analyzing large data sets and models from multiple sources. This challenge has led the National Science Foundation to make strategic investments in developing community data tools and cyberinfrastructure that focus on water data, as it is a central need for many of these research topics. CUAHSI (The Consortium of Universities for the Advancement of Hydrologic Science, Inc.) is a non-profit organization funded by the National Science Foundation to aid students, researchers, and educators in using and managing data and models to support research and education in the water sciences. This presentation will focus on open-source, CUAHSI-supported tools that enable enhanced data discovery online using advanced search capabilities and computational analysis run in virtual environments pre-designed for educators and scientists, so they can focus their efforts on data analysis rather than IT set-up.

  19. Spec Tool; an online education and research resource

    Science.gov (United States)

    Maman, S.; Shenfeld, A.; Isaacson, S.; Blumberg, D. G.

    2016-06-01

    Education and public outreach (EPO) activities related to remote sensing, space, planetary and geophysics sciences have been developed widely in the Earth and Planetary Image Facility (EPIF) at Ben-Gurion University of the Negev, Israel. These programs aim to motivate the learning of geoscientific and technological disciplines. For over a decade, the facility has hosted research and outreach activities for researchers, the local community, school pupils, students and educators. Because suitable software and data are often neither available nor affordable, the EPIF Spec tool was created as a web-based resource to assist researchers and students with initial spectral analysis. The tool is used both in academic courses and in outreach education programs and enables a better 'hands-on' understanding of the theory behind spectroscopy and imaging spectroscopy. The tool is available online and provides spectra visualization and basic analysis algorithms, including spectral plotting, spectral angle mapping and linear unmixing. It can display spectral signatures from the USGS spectral library as well as additional spectra collected at the EPIF, such as dunes in southern Israel and sites in Turkmenistan. For researchers and educators, the tool also allows locally collected samples to be loaded for further analysis.
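
    The spectral angle mapping mentioned above compares a measured spectrum with a reference spectrum by the angle between them as vectors; smaller angles mean closer spectral shape regardless of overall brightness. The sketch below is a generic illustration with placeholder reflectance values, not the Spec tool's own code.

        # Generic illustration of the spectral angle used in Spectral Angle Mapping (SAM).
        import numpy as np

        def spectral_angle(target: np.ndarray, reference: np.ndarray) -> float:
            """Return the angle (radians) between two spectra sampled on the same bands."""
            cos_theta = np.dot(target, reference) / (
                np.linalg.norm(target) * np.linalg.norm(reference)
            )
            return float(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

        # Placeholder reflectance values, for demonstration only.
        measured = np.array([0.12, 0.18, 0.35, 0.42, 0.40])
        library = np.array([0.10, 0.17, 0.33, 0.45, 0.41])
        print(spectral_angle(measured, library))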

  20. Bringing your tools to CyVerse Discovery Environment using Docker [version 2; referees: 3 approved

    Directory of Open Access Journals (Sweden)

    Upendra Kumar Devisetty

    2016-11-01

    Docker has become a very popular container-based virtualization platform for software distribution that has revolutionized the way in which scientific software and software dependencies (software stacks) can be packaged, distributed, and deployed. Docker makes the complex and time-consuming installation procedures needed for scientific software a one-time process. Because it enables platform-independent installation, versioning of software environments, and easy redeployment and reproducibility, Docker is an ideal candidate for the deployment of identical software stacks on different compute environments such as XSEDE and Amazon AWS. CyVerse's Discovery Environment also uses Docker for integrating its powerful, community-recommended software tools into CyVerse's production environment for public use. This paper will help users bring their tools into the CyVerse DE, which will not only allow them to integrate their tools with relative ease compared with the earlier method of tool deployment in the DE but will also help them share their apps with collaborators and release them for public use.

  1. Bringing your tools to CyVerse Discovery Environment using Docker [version 3; referees: 3 approved

    Directory of Open Access Journals (Sweden)

    Upendra Kumar Devisetty

    2016-12-01

    Docker has become a very popular container-based virtualization platform for software distribution that has revolutionized the way in which scientific software and software dependencies (software stacks) can be packaged, distributed, and deployed. Docker makes the complex and time-consuming installation procedures needed for scientific software a one-time process. Because it enables platform-independent installation, versioning of software environments, and easy redeployment and reproducibility, Docker is an ideal candidate for the deployment of identical software stacks on different compute environments such as XSEDE and Amazon AWS. CyVerse's Discovery Environment also uses Docker for integrating its powerful, community-recommended software tools into CyVerse's production environment for public use. This paper will help users bring their tools into the CyVerse DE, which will not only allow them to integrate their tools with relative ease compared with the earlier method of tool deployment in the DE but will also help them share their apps with collaborators and release them for public use.

  2. The AIDS and Cancer Specimen Resource: Role in HIV/AIDS scientific discovery

    Directory of Open Access Journals (Sweden)

    McGrath Michael S

    2007-03-01

    The AIDS and Cancer Specimen Resource (ACSR) supports scientific discovery in the area of HIV/AIDS-associated malignancies. The ACSR was established as a cooperative agreement between the NCI (Office of the Director, Division of Cancer Treatment and Diagnosis) and regional consortia, University of California, San Francisco (West Coast), George Washington University (East Coast) and Ohio State University (Mid-Region), to collect, preserve and disperse HIV-related tissues and biologic fluids and controls along with clinical data to qualified investigators. The available biological samples with clinical data and the application process are described on the ACSR web site. The ACSR tissue bank has more than 100,000 human HIV positive specimens that represent different processing (43), specimen (15), and anatomical site (50) types. The ACSR provides special biospecimen collections and prepares speciality items, e.g., tissue microarrays (TMA) and DNA libraries. Requests have been greatest for Kaposi's sarcoma (32%) and non-Hodgkin's lymphoma (26%). Dispersed requests include 83% tissue (frozen and paraffin embedded), 18% plasma/serum and 9% other. ACSR also provides tissue microarrays of, e.g., Kaposi's sarcoma and non-Hodgkin's lymphoma for biomarker assays and has developed collaborations with other groups that provide access to additional AIDS-related malignancy specimens. ACSR members and associates have completed 63 podium and poster presentations. Investigators have submitted 125 letters of intent requests. Discoveries using ACSR have been reported in 61 scientific publications in notable journals with an average impact factor of 7. The ACSR promotes the scientific exploration of the relationship between HIV/AIDS and malignancy by participation at national and international scientific meetings, contact with investigators who have productive research in this area and identifying, collecting, preserving, enhancing, and dispersing HIV

  3. Processes, Performance Drivers and ICT Tools in Human Resources Management

    Directory of Open Access Journals (Sweden)

    Oškrdal Václav

    2011-06-01

    This article presents an insight into processes, performance drivers and ICT tools in the human resources (HR) management area. On the basis of a modern approach to HR management, a set of business processes that are handled by today’s HR managers is defined. Consequently, the concept of ICT-supported performance drivers and their relevance in the area of HR management, as well as the relationship between HR business processes, performance drivers and ICT tools, are defined. The theoretical outcomes are further enhanced with results obtained from a survey among Czech companies. This article was written with kind courtesy of finances provided by VŠE IGA grant „IGA – 32/2010“.

  4. Teaching resources in speleology and karst: a valuable educational tool

    Directory of Open Access Journals (Sweden)

    De Waele Jo

    2010-01-01

    There is a growing need in the speleological community for tools that make the teaching of speleology and karst much easier. Despite the existence of a wide range of major academic textbooks, the caver community often has difficulty accessing such material. Therefore, to fill this gap, the Italian Speleological Society, under the umbrella of the Union International de Spéléologie, has prepared a set of lectures, in presentation format, on several topics including geology, physics, chemistry, hydrogeology, mineralogy, palaeontology, biology, microbiology, history, archaeology, artificial caves, documentation, etc. These lectures constitute the “Teaching Resources in Speleology and Karst”, available online. This educational tool, thanks to its easily manageable format, can constantly be updated and enriched with new contents and topics.

  5. Data Mining and Knowledge Discovery tools for exploiting big Earth-Observation data

    Science.gov (United States)

    Espinoza Molina, D.; Datcu, M.

    2015-04-01

    The continuous increase in the size of the archives and in the variety and complexity of Earth-Observation (EO) sensors require new methodologies and tools that allow the end-user to access a large image repository, to extract and to infer knowledge about the patterns hidden in the images, to retrieve dynamically a collection of relevant images, and to support the creation of emerging applications (e.g.: change detection, global monitoring, disaster and risk management, image time series, etc.). In this context, we are concerned with providing a platform for data mining and knowledge discovery content from EO archives. The platform's goal is to implement a communication channel between Payload Ground Segments and the end-user who receives the content of the data coded in an understandable format associated with semantics that is ready for immediate exploitation. It will provide the user with automated tools to explore and understand the content of highly complex images archives. The challenge lies in the extraction of meaningful information and understanding observations of large extended areas, over long periods of time, with a broad variety of EO imaging sensors in synergy with other related measurements and data. The platform is composed of several components such as 1.) ingestion of EO images and related data providing basic features for image analysis, 2.) query engine based on metadata, semantics and image content, 3.) data mining and knowledge discovery tools for supporting the interpretation and understanding of image content, 4.) semantic definition of the image content via machine learning methods. All these components are integrated and supported by a relational database management system, ensuring the integrity and consistency of Terabytes of Earth Observation data.

  6. UCLA's Molecular Screening Shared Resource: enhancing small molecule discovery with functional genomics and new technology.

    Science.gov (United States)

    Damoiseaux, Robert

    2014-05-01

    The Molecular Screening Shared Resource (MSSR) offers a comprehensive range of leading-edge high throughput screening (HTS) services including drug discovery, chemical and functional genomics, and novel methods for nano and environmental toxicology. The MSSR is an open access environment with investigators from UCLA as well as from the entire globe. Industrial clients are equally welcome as are non-profit entities. The MSSR is a fee-for-service entity and does not retain intellectual property. In conjunction with the Center for Environmental Implications of Nanotechnology, the MSSR is unique in its dedicated and ongoing efforts towards high throughput toxicity testing of nanomaterials. In addition, the MSSR engages in technology development eliminating bottlenecks from the HTS workflow and enabling novel assays and readouts currently not available.

  7. The Roles of Water in the Protein Matrix: A Largely Untapped Resource for Drug Discovery.

    Science.gov (United States)

    Spyrakis, Francesca; Ahmed, Mostafa H; Bayden, Alexander S; Cozzini, Pietro; Mozzarelli, Andrea; Kellogg, Glen E

    2017-08-24

    The value of thoroughly understanding the thermodynamics specific to a drug discovery/design study is well known. Over the past decade, the crucial roles of water molecules in protein structure, function, and dynamics have also become increasingly appreciated. This Perspective explores water in the biological environment by adopting its point of view in such phenomena. The prevailing thermodynamic models of the past, where water was seen largely in terms of an entropic gain after its displacement by a ligand, are now known to be much too simplistic. We adopt a set of terminology that describes water molecules as being "hot" and "cold", which we have defined as being easy and difficult to displace, respectively. The basis of these designations, which involve both enthalpic and entropic water contributions, are explored in several classes of biomolecules and structural motifs. The hallmarks for characterizing water molecules are examined, and computational tools for evaluating water-centric thermodynamics are reviewed. This Perspective's summary features guidelines for exploiting water molecules in drug discovery.

  8. Radiotracer properties determined by high performance liquid chromatography: a potential tool for brain radiotracer discovery

    International Nuclear Information System (INIS)

    Tavares, Adriana Alexandre S.; Lewsey, James; Dewar, Deborah; Pimlott, Sally L.

    2012-01-01

    Introduction: Previously, development of novel brain radiotracers has largely relied on simple screening tools. Improved selection methods at the early stages of radiotracer discovery and an increased understanding of the relationships between in vitro physicochemical and in vivo radiotracer properties are needed. We investigated if high performance liquid chromatography (HPLC) methodologies could provide criteria for lead candidate selection by comparing HPLC measurements with radiotracer properties in humans. Methods: Ten molecules, previously used as radiotracers in humans, were analysed to obtain the following measures: partition coefficient (Log P); permeability (Pm); percentage of plasma protein binding (%PPB); and membrane partition coefficient (Km). Relationships between brain entry measurements (Log P, Pm and %PPB) and in vivo brain percentage injected dose (%ID), and between Km and specific binding in vivo (BPND), were investigated. Log P values obtained using in silico packages and flask methods were compared with Log P values obtained using HPLC. Results: The modelled associations with %ID were stronger for %PPB (r² = 0.65) and Pm (r² = 0.77) than for Log P (r² = 0.47), while 86% of BPND variance was explained by Km. Log P values were variable depending on the methodology used. Conclusions: Log P should not be relied upon as a predictor of blood-brain barrier penetration during brain radiotracer discovery. HPLC measurements of permeability, %PPB and membrane interactions may be potentially useful in predicting in vivo performance and hence allow evaluation and ranking of compound libraries for the selection of lead radiotracer candidates at early stages of radiotracer discovery.
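
    The reported r² values correspond to simple regression fits between an in vitro HPLC measurement and an in vivo property. A minimal sketch of such a fit, assuming matched arrays of measurements for the candidate radiotracers, is shown below; it is not the authors' analysis code.

        # Minimal sketch of the kind of fit behind the reported r² values: regressing an
        # in vivo property (e.g. %ID) on an in vitro HPLC measurement (e.g. permeability Pm).
        import numpy as np
        from scipy.stats import linregress

        def explained_variance(in_vitro: np.ndarray, in_vivo: np.ndarray) -> float:
            """Return r² of a simple linear regression of in_vivo on in_vitro."""
            fit = linregress(in_vitro, in_vivo)
            return fit.rvalue ** 2

        # Usage, with matched measurement arrays for the candidate radiotracers:
        # r2 = explained_variance(pm_values, percent_id_values)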

  9. Using Multiple Tools to Analyze Resource Exchange in China

    Directory of Open Access Journals (Sweden)

    Nan Li

    2015-09-01

    With the rapid development of globalization, the function of international physical resource exchange is becoming increasingly important in economic growth through resource optimization. However, most existing ecological economy studies use the physical trade balance (PTB) directly or use physical imports and exports individually to analyze national material metabolization. Neither the individual analysis of physical imports and exports nor the direct analysis of the PTB is capable of portraying the comprehensive contribution of a certain product to total physical trade. This study introduced an indicator, i.e., the physical contribution to the trade balance (PCB), which evolved from the traditional index of contribution to the trade balance (CB). In addition, the trade balance (TB), PTB, CB, and PCB were systematically related and combined. An analysis was conducted using the four tools to obtain overall trade trends in China. This study discovered that both physical trade value and quantity exhibited different characteristics when China joined the World Trade Organization in 2002 and experienced the global economic crisis in 2009. Finally, the advantages of supporting policy decisions by applying multiple analytical tools to physical trade were discussed.
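
    The abstract does not reproduce the formula, but the traditional contribution-to-trade-balance index (CB), from which the PCB is said to evolve, is commonly written as below for a product i, with x_i and m_i the product's exports and imports and X and M the totals. Scaling conventions vary, so treat this as background rather than the paper's exact definition.

        % Commonly used form of the contribution to the trade balance (CB) for product i;
        % the paper's PCB applies the same idea to physical quantities rather than values.
        \[
          \mathrm{CB}_i \;=\; (x_i - m_i) \;-\; (X - M)\,\frac{x_i + m_i}{X + M}
        \]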

  10. Footprints: A Visual Search Tool that Supports Discovery and Coverage Tracking.

    Science.gov (United States)

    Isaacs, Ellen; Domico, Kelly; Ahern, Shane; Bart, Eugene; Singhal, Mudita

    2014-12-01

    Searching a large document collection to learn about a broad subject involves the iterative process of figuring out what to ask, filtering the results, identifying useful documents, and deciding when one has covered enough material to stop searching. We are calling this activity "discoverage," discovery of relevant material and tracking coverage of that material. We built a visual analytic tool called Footprints that uses multiple coordinated visualizations to help users navigate through the discoverage process. To support discovery, Footprints displays topics extracted from documents that provide an overview of the search space and are used to construct searches visuospatially. Footprints allows users to triage their search results by assigning a status to each document (To Read, Read, Useful), and those status markings are shown on interactive histograms depicting the user's coverage through the documents across dates, sources, and topics. Coverage histograms help users notice biases in their search and fill any gaps in their analytic process. To create Footprints, we used a highly iterative, user-centered approach in which we conducted many evaluations during both the design and implementation stages and continually modified the design in response to feedback.
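
    A toy sketch of the bookkeeping behind such coverage tracking is given below, assuming a simple document model in which each item carries a reading status and coverage histograms are counts of statuses per facet (source, topic, or date). The data model is illustrative, not the tool's actual implementation.

        # Toy sketch of the "discoverage" bookkeeping: documents carry a reading status,
        # and coverage histograms count statuses per facet. Illustrative only.
        from collections import Counter
        from dataclasses import dataclass

        @dataclass
        class Doc:
            doc_id: str
            source: str
            topic: str
            status: str = "To Read"  # one of: "To Read", "Read", "Useful"

        def coverage_by(docs: list[Doc], facet: str) -> dict[tuple[str, str], int]:
            """Count (facet value, status) pairs, e.g. how many 'Useful' docs per topic."""
            return dict(Counter((getattr(d, facet), d.status) for d in docs))

        docs = [Doc("d1", "arxiv", "search", "Read"), Doc("d2", "blog", "search", "Useful")]
        print(coverage_by(docs, "topic"))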

  11. Climate Discovery: Integrating Research With Exhibit, Public Tours, K-12, and Web-based EPO Resources

    Science.gov (United States)

    Foster, S. Q.; Carbone, L.; Gardiner, L.; Johnson, R.; Russell, R.; Advisory Committee, S.; Ammann, C.; Lu, G.; Richmond, A.; Maute, A.; Haller, D.; Conery, C.; Bintner, G.

    2005-12-01

    The Climate Discovery Exhibit at the National Center for Atmospheric Research (NCAR) Mesa Lab provides an exciting conceptual outline for the integration of several EPO activities with other well-established NCAR educational resources and programs. The exhibit is organized into four topic areas intended to build understanding among NCAR's 80,000 annual visitors, including 10,000 school children, about Earth system processes and scientific methods contributing to a growing body of knowledge about climate and global change. These topics include: 'Sun-Earth Connections,' 'Climate Now,' 'Climate Past,' and 'Climate Future.' Exhibit text, graphics, film and electronic media, and interactives are developed and updated through collaborations between NCAR's climate research scientists and staff in the Office of Education and Outreach (EO) at the University Corporation for Atmospheric Research (UCAR). With funding from NCAR, paleoclimatologists have contributed data and ideas for a new exhibit Teachers' Guide unit about 'Climate Past.' This collection of middle-school level, standards-aligned lessons is intended to help students gain understanding about how scientists use proxy data and direct observations to describe past climates. Two NASA EPOs have funded the development of 'Sun-Earth Connection' lessons, visual media, and tips for scientists and teachers. Integrated with related content and activities from the NASA-funded Windows to the Universe web site, these products have been adapted to form a second unit in the Climate Discovery Teachers' Guide about the Sun's influence on Earth's climate. Other lesson plans, previously developed by on-going efforts of EO staff and NSF's previously-funded Project Learn program, are providing content for a third Teachers' Guide unit on 'Climate Now' - the dynamic atmospheric and geological processes that regulate Earth's climate. EO has plans to collaborate with NCAR climatologists and computer modelers in the next year to develop

  12. Data access and decision tools for coastal water resources ...

    Science.gov (United States)

    US EPA has supported the development of numerous models and tools to support implementation of environmental regulations. However, transfer of knowledge and methods from detailed technical models to support practical problem solving by local communities and watershed or coastal management organizations remains a challenge. We have developed the Estuary Data Mapper (EDM) to facilitate data discovery, visualization and access to support environmental problem solving for coastal watersheds and estuaries. EDM is a stand-alone application based on open-source software which requires only internet access for operation. Initially, development of EDM focused on delivery of raw data streams from distributed web services, ranging from atmospheric deposition to hydrologic, tidal, and water quality time series, estuarine habitat characteristics, and remote sensing products. We have transitioned to include access to value-added products which provide end-users with results of future scenario analysis, facilitate extension of models across geographic regions, and/or promote model interoperability. Here we present three examples: 1) the delivery of input data for the development of seagrass models across estuaries, 2) scenarios illustrating the implications of riparian buffer management (loss or restoration) for stream thermal regimes and fish communities, and 3) access to hydrology model outputs to foster connections across models at different scales, ultimately feeding

  13. Tools and data services registry: a community effort to document bioinformatics resources

    NARCIS (Netherlands)

    Ison, J.; Rapacki, K.; Menager, H.; Kalas, M.; Rydza, E.; Chmura, P.; Anthon, C.; Beard, N.; Berka, K.; Bolser, D.; Booth, T.; Bretaudeau, A.; Brezovsky, J.; Casadio, R.; Cesareni, G.; Coppens, F.; Cornell, M.; Cuccuru, G.; Davidsen, K.; Vedova, G.D.; Dogan, T.; Doppelt-Azeroual, O.; Emery, L.; Gasteiger, E.; Gatter, T.; Goldberg, T.; Grosjean, M.; Gruning, B.; Helmer-Citterich, M.; Ienasescu, H.; Ioannidis, V.; Jespersen, M.C.; Jimenez, R.; Juty, N.; Juvan, P.; Koch, M.; Laibe, C.; Li, J.W.; Licata, L.; Mareuil, F.; Micetic, I.; Friborg, R.M.; Moretti, S.; Morris, C.; Moller, S.; Nenadic, A.; Peterson, H.; Profiti, G.; Rice, P.; Romano, P.; Roncaglia, P.; Saidi, R.; Schafferhans, A.; Schwammle, V.; Smith, C.; Sperotto, M.M.; Stockinger, H.; Varekova, R.S.; Tosatto, S.C.; Torre, V.; Uva, P.; Via, A.; Yachdav, G.; Zambelli, F.; Vriend, G.; Rost, B.; Parkinson, H.; Longreen, P.; Brunak, S.

    2016-01-01

    Life sciences are yielding huge data sets that underpin scientific discoveries fundamental to improvement in human health, agriculture and the environment. In support of these discoveries, a plethora of databases and tools are deployed, in technically complex and diverse implementations, across a

  14. Neopeptide Analyser: A software tool for neopeptide discovery in proteomics data [version 1; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Mandy Peffers

    2017-04-01

    Experiments involving mass spectrometry (MS)-based proteomics are widely used for analyses of connective tissues. Common examples include the use of relative quantification to identify differentially expressed peptides and proteins in cartilage and tendon. We are working on characterising so-called ‘neopeptides’, i.e. peptides formed due to native cleavage of proteins, for example under pathological conditions. Unlike peptides typically quantified in MS workflows due to the in vitro use of an enzyme such as trypsin, a neopeptide has at least one terminus that was not generated by the trypsin used in the workflow. The identification of neopeptides within these datasets is important for understanding disease pathology and for the development of antibodies that could be utilised as diagnostic biomarkers for diseases such as osteoarthritis, and as targets for novel treatments. Our previously described neopeptide data analysis workflow was laborious and was not amenable to robust statistical analysis, which reduced confidence in the neopeptides identified. To overcome this, we developed ‘Neopeptide Analyser’, a user-friendly neopeptide analysis tool used in conjunction with the label-free MS quantification tool Progenesis QIP for proteomics. Neopeptide Analyser filters data sourced from Progenesis QIP output to identify neopeptide sequences, and also gives the residues that are adjacent to the peptide in its corresponding protein sequence. It also produces normalised values for the neopeptide quantification values and uses these to perform statistical tests, which are also included in the output. Neopeptide Analyser is available as a Java application for Mac, Windows and Linux. The analysis features and ease of use encourage data exploration, which could aid the discovery of novel pathways in extracellular matrix degradation, the identification of potential biomarkers and its use as a tool to investigate matrix turnover. Neopeptide Analyser is available from
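
    The defining test for a neopeptide is whether either terminus could have been produced by trypsin (which cleaves after K or R, but not before P). A simplified sketch of that check is given below; it ignores missed cleavages and protein-terminus special cases and is not the tool's actual code.

        # Simplified sketch of the defining test for a neopeptide: at least one terminus
        # that trypsin (cleaving after K/R, not before P) could not have produced.
        def is_neopeptide(protein: str, peptide: str) -> bool:
            start = protein.find(peptide)
            if start < 0:
                raise ValueError("peptide not found in protein sequence")
            end = start + len(peptide)

            # N-terminus is tryptic if the peptide starts the protein or follows K/R (not before P).
            n_tryptic = start == 0 or (protein[start - 1] in "KR" and peptide[0] != "P")
            # C-terminus is tryptic if the peptide ends the protein or ends in K/R (not before P).
            c_tryptic = end == len(protein) or (peptide[-1] in "KR" and protein[end] != "P")

            return not (n_tryptic and c_tryptic)

        # Placeholder sequences: the N-terminus here does not follow K/R, so this is a neopeptide.
        print(is_neopeptide("MKTAYIAKQRLGSPR", "AYIAK"))  # True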

  15. Special issue of International journal of human resource management: Conceptual and empirical discoveries in successful HRM implementation

    OpenAIRE

    Mireia Valverde; Tanya Bondarouk; Jordi Trullen

    2016-01-01

    Special issue of the International Journal of Human Resource Management: Conceptual and empirical discoveries in successful HRM implementation. DOI: 10.1080/09585192.2016.1154378. URL: http://www.tandfonline.com/doi/full/10.1080/09585192.2016.1154378. [No abstract available]

  16. Remote access tools for optimization of hardware and workmanship resources

    International Nuclear Information System (INIS)

    Bartnig, Roberto; Diniz, Luciano; Ribeiro, Joao Luiz; Salcedo, Fernando

    2000-01-01

    Over its 23 years, the Campos basin has gone through several generations of industrial automation, always seeking operational improvement and risk reduction as well as greater reliability and precision in process control. The ECOS (Central Operational and Supervision Station) is a human-machine interface developed by PETROBRAS using graphic stations for the supervision and control of 16 offshore production units, which nowadays are responsible for about 77% of the oil production of the Campos basin (730,000 barrels/day). Software tools developed by the industrial automation support staff made it possible to optimize specialized labour resources and to take advantage of periods of lower network usage. These tools monitor free disk space and fragmentation, onshore/offshore communication links, local network traffic and errors, and backups of process history data and applications, in order to guarantee the operation of the network, the process data history and the integrity of the production applications, improving safety and operational continuity. (author)
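
    A minimal sketch of the kind of free-disk-space check such monitoring tools perform, using only the Python standard library; the paths and alert threshold are illustrative placeholders.

        # Minimal free-disk-space check of the kind these monitoring tools perform.
        import shutil

        def disk_free_percent(path: str = "/") -> float:
            usage = shutil.disk_usage(path)
            return 100.0 * usage.free / usage.total

        def check_disks(paths: list[str], min_free_percent: float = 10.0) -> list[str]:
            """Return the paths whose free space has fallen below the threshold."""
            return [p for p in paths if disk_free_percent(p) < min_free_percent]

        print(check_disks(["/"]))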

  17. atBioNet– an integrated network analysis tool for genomics and biomarker discovery

    Directory of Open Access Journals (Sweden)

    Ding Yijun

    2012-07-01

    Background: Large amounts of mammalian protein-protein interaction (PPI) data have been generated and are available for public use. From a systems biology perspective, protein/gene interactions encode the key mechanisms distinguishing disease and health, and such mechanisms can be uncovered through network analysis. An effective network analysis tool should integrate different content-specific PPI databases into a comprehensive network format with a user-friendly platform to identify key functional modules/pathways and the underlying mechanisms of disease and toxicity. Results: atBioNet integrates seven publicly available PPI databases into a network-specific knowledge base. Knowledge expansion is achieved by expanding a user-supplied protein/gene list with interactions from its integrated PPI network. The statistically significant functional modules are determined by applying a fast network-clustering algorithm (SCAN: a Structural Clustering Algorithm for Networks). The functional modules can be visualized either separately or together in the context of the whole network. Integration of pathway information enables enrichment analysis and assessment of the biological function of modules. Three case studies are presented using publicly available disease gene signatures as a basis to discover new biomarkers for acute leukemia, systemic lupus erythematosus, and breast cancer. The results demonstrated that atBioNet can not only identify functional modules and pathways related to the studied diseases, but this information can also be used to hypothesize novel biomarkers for future analysis. Conclusion: atBioNet is a free web-based network analysis tool that provides a systematic insight into protein/gene interactions through examining significant functional modules. The identified functional modules are useful for determining underlying mechanisms of disease and biomarker discovery. It can be accessed at: http://www.fda.gov/ScienceResearch/BioinformaticsTools

  18. atBioNet--an integrated network analysis tool for genomics and biomarker discovery.

    Science.gov (United States)

    Ding, Yijun; Chen, Minjun; Liu, Zhichao; Ding, Don; Ye, Yanbin; Zhang, Min; Kelly, Reagan; Guo, Li; Su, Zhenqiang; Harris, Stephen C; Qian, Feng; Ge, Weigong; Fang, Hong; Xu, Xiaowei; Tong, Weida

    2012-07-20

    Large amounts of mammalian protein-protein interaction (PPI) data have been generated and are available for public use. From a systems biology perspective, protein/gene interactions encode the key mechanisms distinguishing disease and health, and such mechanisms can be uncovered through network analysis. An effective network analysis tool should integrate different content-specific PPI databases into a comprehensive network format with a user-friendly platform to identify key functional modules/pathways and the underlying mechanisms of disease and toxicity. atBioNet integrates seven publicly available PPI databases into a network-specific knowledge base. Knowledge expansion is achieved by expanding a user-supplied protein/gene list with interactions from its integrated PPI network. The statistically significant functional modules are determined by applying a fast network-clustering algorithm (SCAN: a Structural Clustering Algorithm for Networks). The functional modules can be visualized either separately or together in the context of the whole network. Integration of pathway information enables enrichment analysis and assessment of the biological function of modules. Three case studies are presented using publicly available disease gene signatures as a basis to discover new biomarkers for acute leukemia, systemic lupus erythematosus, and breast cancer. The results demonstrated that atBioNet can not only identify functional modules and pathways related to the studied diseases, but this information can also be used to hypothesize novel biomarkers for future analysis. atBioNet is a free web-based network analysis tool that provides a systematic insight into protein/gene interactions through examining significant functional modules. The identified functional modules are useful for determining underlying mechanisms of disease and biomarker discovery. It can be accessed at: http://www.fda.gov/ScienceResearch/BioinformaticsTools/ucm285284.htm.
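
    atBioNet's module detection uses its own SCAN algorithm, which is not shown in the abstract. As a generic stand-in for the same analysis step, the sketch below expands a seed gene list by one interaction hop in a small placeholder PPI network and extracts modules by modularity clustering with networkx.

        # Generic stand-in for the module-finding step (atBioNet itself uses SCAN):
        # expand a seed gene list by one interaction hop, then extract modules.
        # The edge list is a placeholder, not one of the seven integrated PPI databases.
        import networkx as nx
        from networkx.algorithms.community import greedy_modularity_communities

        ppi = nx.Graph([("TP53", "MDM2"), ("TP53", "EP300"), ("MDM2", "MDM4"), ("EP300", "CREBBP")])

        def expand_and_cluster(seeds: set[str]) -> list[set[str]]:
            neighbours = {n for s in seeds if s in ppi for n in ppi.neighbors(s)}
            sub = ppi.subgraph(seeds | neighbours)
            return [set(c) for c in greedy_modularity_communities(sub)]

        print(expand_and_cluster({"TP53"}))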

  19. Development of a Suite of Analytical Tools for Energy and Water Infrastructure Knowledge Discovery

    Science.gov (United States)

    Morton, A.; Piburn, J.; Stewart, R.; Chandola, V.

    2017-12-01

    Energy and water generation and delivery systems are inherently interconnected. With demand for energy growing, the energy sector is experiencing increasing competition for water. With increasing population and changing environmental, socioeconomic, and demographic scenarios, new technology and investment decisions must be made for optimized and sustainable energy-water resource management. This also requires novel scientific insights into the complex interdependencies of energy-water infrastructures across multiple space and time scales. To address this need, we've developed a suite of analytical tools to support an integrated data driven modeling, analysis, and visualization capability for understanding, designing, and developing efficient local and regional practices related to the energy-water nexus. This work reviews the analytical capabilities available along with a series of case studies designed to demonstrate the potential of these tools for illuminating energy-water nexus solutions and supporting strategic (federal) policy decisions.

  20. Improved discovery of NEON data and samples though vocabularies, workflows, and web tools

    Science.gov (United States)

    Laney, C. M.; Elmendorf, S.; Flagg, C.; Harris, T.; Lunch, C. K.; Gulbransen, T.

    2017-12-01

    The National Ecological Observatory Network (NEON) is a continental-scale ecological observation facility sponsored by the National Science Foundation and operated by Battelle. NEON supports research on the impacts of invasive species, land use change, and environmental change on natural resources and ecosystems by gathering and disseminating a full suite of observational, instrumented, and airborne datasets from field sites across the U.S. NEON also collects thousands of samples from soil, water, and organisms every year, and partners with numerous institutions to analyze and archive samples. We have developed numerous new technologies to support processing and discovery of this highly diverse collection of data. These technologies include applications for data collection and sample management, processing pipelines specific to each collection system (field observations, installed sensors, and airborne instruments), and publication pipelines. NEON data and metadata are discoverable and downloadable via both a public API and data portal. We solicit continued engagement and advice from the informatics and environmental research communities, particularly in the areas of data versioning, usability, and visualization.
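
    A hedged example of programmatic discovery against the public NEON data API is shown below; the endpoint path (api/v0) and the response envelope reflect NEON's documented REST API at the time of writing and should be treated as assumptions that may change.

        # Hedged example: listing NEON data products via the public API.
        import requests

        BASE = "https://data.neonscience.org/api/v0"  # assumed endpoint base

        def list_products() -> list[dict]:
            resp = requests.get(f"{BASE}/products", timeout=30)
            resp.raise_for_status()
            return resp.json()["data"]  # assumed response envelope with a "data" field

        for product in list_products()[:5]:
            print(product.get("productCode"), "-", product.get("productName"))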

  1. SPME as a promising tool in translational medicine and drug discovery: From bench to bedside.

    Science.gov (United States)

    Goryński, Krzysztof; Goryńska, Paulina; Górska, Agnieszka; Harężlak, Tomasz; Jaroch, Alina; Jaroch, Karol; Lendor, Sofia; Skobowiat, Cezary; Bojko, Barbara

    2016-10-25

    Solid phase microextraction (SPME) is a technology in which a small amount of an extracting phase dispersed on a solid support is exposed to the sample for a well-defined period of time. The open-bed geometry and the biocompatibility of the materials used for manufacturing the devices make it a very convenient tool for direct extraction from complex biological matrices. The flexibility of the formats permits tailoring the method according to the needs of the particular application. A number of studies concerning the monitoring of drugs and their metabolites, the analysis of the metabolome of volatile as well as non-volatile compounds, and the determination of ligand-protein binding, permeability and compound toxicity have already been reported. All these applications were performed in different matrices including biological fluids and tissues, cell cultures, and living animals. The low invasiveness of in vivo SPME, the ability to use very small sample volumes and the capacity to analyse cell cultures make it possible to address the 3R rule, which is a currently acknowledged ethical standard in R&D labs. In the current review, a systematic evaluation of the applicability of SPME to the studies required at different stages of drug discovery and development and of translational medicine is presented. The advantages and challenges are discussed based on examples that directly illustrate a given experimental design or on studies that could be translated to the models routinely used in the drug development process. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. Evaluation of tools for highly variable gene discovery from single-cell RNA-seq data.

    Science.gov (United States)

    Yip, Shun H; Sham, Pak Chung; Wang, Junwen

    2018-02-21

    Traditional RNA sequencing (RNA-seq) allows the detection of gene expression variations between two or more cell populations through differentially expressed gene (DEG) analysis. However, genes that contribute to cell-to-cell differences are not discoverable with RNA-seq because RNA-seq samples are obtained from a mixture of cells. Single-cell RNA-seq (scRNA-seq) allows the detection of gene expression in each cell. With scRNA-seq, highly variable gene (HVG) discovery allows the detection of genes that contribute strongly to cell-to-cell variation within a homogeneous cell population, such as a population of embryonic stem cells. This analysis is implemented in many software packages. In this study, we compare seven HVG methods from six software packages, including BASiCS, Brennecke, scLVM, scran, scVEGs and Seurat. Our results demonstrate that reproducibility in HVG analysis requires a larger sample size than DEG analysis. Discrepancies between methods and potential issues in these tools are discussed and recommendations are made.
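
    For orientation, the sketch below shows the same HVG step performed with scanpy, which is not one of the seven packages compared in the study; the input file path is a placeholder for a cells-by-genes AnnData object of raw counts.

        # Illustrative HVG selection with scanpy (not one of the seven packages compared):
        # flag genes with high cell-to-cell variation after basic normalisation.
        import scanpy as sc

        adata = sc.read_h5ad("cells.h5ad")  # placeholder path to a cells-by-genes AnnData
        sc.pp.normalize_total(adata, target_sum=1e4)
        sc.pp.log1p(adata)
        sc.pp.highly_variable_genes(adata, flavor="seurat", n_top_genes=2000)

        hvgs = adata.var_names[adata.var["highly_variable"]]
        print(len(hvgs), "highly variable genes")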

  3. A Delphi study assessing the utility of quality improvement tools and resources in Australian primary care.

    Science.gov (United States)

    Upham, Susan J; Janamian, Tina; Crossland, Lisa; Jackson, Claire L

    2016-04-18

    To determine the relevance and utility of online tools and resources to support organisational performance development in primary care and to complement the Primary Care Practice Improvement Tool (PC-PIT). A purposively recruited Expert Advisory Panel of 12 end users used a modified Delphi technique to evaluate 53 tools and resources identified through a previously conducted systematic review. The panel comprised six practice managers and six general practitioners who had participated in the PC-PIT pilot study in 2013-2014. Tools and resources were reviewed in three rounds using a standard pre-tested assessment form. Recommendations, scores and reasons for recommending or rejecting each tool or resource were analysed to determine the final suite of tools and resources. The evaluation was conducted from November 2014 to August 2015. Recommended tools and resources scored highly (mean score, 16/20) in Rounds 1 and 2 of review (n = 25). These tools and resources were perceived to be easily used, useful to the practice and supportive of the PC-PIT. Rejected resources scored considerably lower (mean score, 5/20) and were noted to have limitations such as having no value to the practice and poor utility (n = 6). A final review (Round 3) of 28 resources resulted in a suite of 21 to support the elements of the PC-PIT. This suite of tools and resources offers one approach to supporting the quality improvement initiatives currently in development in primary care reform.

  4. Gene Overexpression Resources in Cereals for Functional Genomics and Discovery of Useful Genes

    Directory of Open Access Journals (Sweden)

    Kiyomi Abe

    2016-09-01

    Identification and elucidation of functions of plant genes is valuable for both basic and applied research. In addition to natural variation in model plants, numerous loss-of-function resources have been produced by mutagenesis with chemicals, irradiation, or insertions of transposable elements or T-DNA. However, we may be unable to observe loss-of-function phenotypes for genes with functionally redundant homologs, and for those essential for growth and development. To offset such disadvantages, gain-of-function transgenic resources have been exploited. Activation-tagged lines have been generated using obligatory overexpression of endogenous genes by random insertion of an enhancer. Recent progress in DNA sequencing technology and bioinformatics has enabled the preparation of genomewide collections of full-length cDNAs (fl-cDNAs) in some model species. Using the fl-cDNA clones, a novel gain-of-function strategy, the Fl-cDNA OvereXpressor gene (FOX)-hunting system, has been developed. A mutant phenotype in a FOX line can be directly attributed to the overexpressed fl-cDNA. Investigating a large population of FOX lines could reveal important genes conferring favorable phenotypes for crop breeding. Alternatively, a unique loss-of-function approach, Chimeric REpressor gene Silencing Technology (CRES-T), has been developed. In CRES-T, overexpression of a chimeric repressor, composed of the coding sequence of a transcription factor (TF) and a short peptide designated as the repression domain, could interfere with the action of the endogenous TF in plants. Although plant TFs usually consist of gene families, CRES-T is effective, in principle, even for TFs with functional redundancy. In this review, we focus on the current status of the gene-overexpression strategies and resources for identifying and elucidating novel functions of cereal genes. We discuss the potential of these research tools for identifying useful genes and phenotypes for application in crop

  5. AFRA-NEST: A Tool for Human Resource Development

    International Nuclear Information System (INIS)

    Amanor, Edison; Akaho, E.H.K.; Serfor-Armah, Y.

    2014-01-01

    Conclusion: • Regional Networks could serve as a common platform to meet the needs for human resource development. • With AFRA-NEST, International cooperation would be strengthened. • Systematic integration and sharing of available nuclear training resources. • Cost of training future nuclear experts could drastically be reduced

  6. Ecological and resource economics as ecosystem management tools

    Science.gov (United States)

    Stephen Farber; Dennis. Bradley

    1999-01-01

    Economic pressures on ecosystems will only intensify in the future. Increased population levels, settlement patterns, and increased incomes will raise the demands for ecosystem resources and their services. The pressure to transform ecosystem natural assets into marketable commodities, whether by harvesting and mining resources or altering landscapes through...

  7. The MY NASA DATA Project: Tools and a Collaboration Space for Knowledge Discovery

    Science.gov (United States)

    Chambers, L. H.; Alston, E. J.; Diones, D. D.; Moore, S. W.; Oots, P. C.; Phelps, C. S.

    2006-05-01

    The Atmospheric Science Data Center (ASDC) at NASA Langley Research Center is charged with serving a wide user community that is interested in its large data holdings in the areas of Aerosols, Clouds, Radiation Budget, and Tropospheric Chemistry. Most of the data holdings, however, are in large files with specialized data formats. The MY NASA DATA (mynasadata.larc.nasa.gov) project began in 2004, as part of the NASA Research, Education, and Applications Solutions Network (REASoN), in order to open this important resource to a broader community including K-12 education and citizen scientists. MY NASA DATA (short for Mentoring and inquirY using NASA Data on Atmospheric and earth science for Teachers and Amateurs) consists of a web space that collects tools, lesson plans, and specially developed documentation to help the target audience more easily use the vast collection of NASA data about the Earth System. The core piece of the MY NASA DATA project is the creation of microsets (both static and custom) that make data easily accessible. The installation of a Live Access Server (LAS) greatly enhanced the ability for teachers, students, and citizen scientists to create and explore custom microsets of Earth System Science data. The LAS, which is an open source software tool using emerging data standards, also allows the MY NASA DATA team to make available data on other aspects of the Earth System from collaborating data centers. We are currently working with the Physical Oceanography DAAC at the Jet Propulsion Laboratory to bring in several parameters describing the ocean. In addition, MY NASA DATA serves as a central space for the K-12 community to share resources. The site already includes a dozen User-contributed lesson plans. This year we will be focusing on the Citizen Science portion of the site, and will be welcoming user-contributed project ideas, as well as reports of completed projects. An e-mentor network has also been created to involve a wider community in

  8. Using the Tools and Resources of the RCSB Protein Data Bank.

    Science.gov (United States)

    Costanzo, Luigi Di; Ghosh, Sutapa; Zardecki, Christine; Burley, Stephen K

    2016-09-07

    The Protein Data Bank (PDB) archive is the worldwide repository of experimentally determined three-dimensional structures of large biological molecules found in all three kingdoms of life. Atomic-level structures of these proteins, nucleic acids, and complex assemblies thereof are central to research and education in molecular, cellular, and organismal biology, biochemistry, biophysics, materials science, bioengineering, ecology, and medicine. Several types of information are associated with each PDB archival entry, including atomic coordinates, primary experimental data, polymer sequence(s), and summary metadata. The Research Collaboratory for Structural Bioinformatics Protein Data Bank (RCSB PDB) serves as the U.S. data center for the PDB, distributing archival data and supporting both simple and complex queries that return results. These data can be freely downloaded, analyzed, and visualized using RCSB PDB tools and resources to gain a deeper understanding of fundamental biological processes, molecular evolution, human health and disease, and drug discovery. © 2016 by John Wiley & Sons, Inc.
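
    A hedged example of programmatic access to a single PDB entry is shown below; the coordinate-download and data-API URLs reflect the RCSB PDB's public services as generally documented and should be treated as assumptions that may change.

        # Hedged example of programmatic access to one PDB entry (4HHB, human deoxyhaemoglobin).
        import requests

        ENTRY_ID = "4HHB"

        # Atomic coordinates in PDBx/mmCIF format from the assumed download service.
        cif = requests.get(f"https://files.rcsb.org/download/{ENTRY_ID}.cif", timeout=30)
        cif.raise_for_status()
        with open(f"{ENTRY_ID}.cif", "wb") as fh:
            fh.write(cif.content)

        # Summary metadata from the assumed RCSB data API endpoint.
        meta = requests.get(f"https://data.rcsb.org/rest/v1/core/entry/{ENTRY_ID}", timeout=30)
        meta.raise_for_status()
        print(meta.json().get("struct", {}).get("title"))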

  9. The first set of EST resource for gene discovery and marker development in pigeonpea (Cajanus cajan L.

    Directory of Open Access Journals (Sweden)

    Byregowda Munishamappa

    2010-03-01

    Background: Pigeonpea (Cajanus cajan (L.) Millsp.) is one of the major grain legume crops of the tropics and subtropics, but biotic stresses [Fusarium wilt (FW), sterility mosaic disease (SMD), etc.] are serious challenges for sustainable crop production. Modern genomic tools such as molecular markers and candidate genes associated with resistance to these stresses offer the possibility of facilitating pigeonpea breeding for improving biotic stress resistance. Availability of limited genomic resources, however, is a serious bottleneck to undertake molecular breeding in pigeonpea to develop superior genotypes with enhanced resistance to the above mentioned biotic stresses. With an objective of enhancing genomic resources in pigeonpea, this study reports the generation and analysis of a comprehensive resource of FW- and SMD-responsive expressed sequence tags (ESTs). Results: A total of 16 cDNA libraries were constructed from four pigeonpea genotypes that are resistant and susceptible to FW ('ICPL 20102' and 'ICP 2376') and SMD ('ICP 7035' and 'TTB 7'), and a total of 9,888 (9,468 high quality) ESTs were generated and deposited in dbEST of GenBank under accession numbers GR463974 to GR473857 and GR958228 to GR958231. Clustering and assembly analyses of these ESTs resulted in 4,557 unique sequences (unigenes), including 697 contigs and 3,860 singletons. BLASTN analysis of the 4,557 unigenes showed significant identity with ESTs of different legumes (23.2-60.3%), rice (28.3%), Arabidopsis (33.7%) and poplar (35.4%). As expected, pigeonpea ESTs are more closely related to soybean (60.3%) and cowpea ESTs (43.6%) than to other plant ESTs. Similarly, BLASTX similarity results showed that only 1,603 (35.1%) out of 4,557 total unigenes correspond to known proteins in the UniProt database (≤ 1E-08). Functional categorization of the annotated unigene sequences showed that 153 (3.3%) genes were assigned to the cellular component category, 132 (2.8%) to biological process, and 132 (2

  10. iTools: a framework for classification, categorization and integration of computational biology resources.

    Directory of Open Access Journals (Sweden)

    Ivo D Dinov

    2008-05-01

    The advancement of the computational biology field hinges on progress in three fundamental directions--the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources--data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long

  11. Watershed Application of the Sustainable Installations Regional Resource Assessment Tool

    National Research Council Canada - National Science Library

    Jenicek, Elizabeth M; Fournier, Donald F; Downs, Natalie R; Boesdorfer, Brad

    2005-01-01

    The U.S. Army Corps of Engineers recognizes the need for a system-wide approach to ecosystem management in its efforts to provide environmental sustainability in the stewardship of the Nation's water resources...

  12. Developing a planning tool for South African prosecution resources ...

    African Journals Online (AJOL)

    Strategic planning, forecasting, simulation, resource planning, prosecution ... whether a case should and can be prosecuted, what charges to prosecute .... various activities in the court environment, were the recently built discrete-event simula-.

  13. Tool development to understand rural resource users' land use and ...

    African Journals Online (AJOL)

    -) ..... is a proxy for soil fertility and water availability. The resource users .... in Montpellier (France), two sessions with conservationists in Ant- ananarivo and .... hood activities within the wetlands of the Alaotra, (ii) living close to Lake Alaotra ...

  14. Responsible resource management in hotels : attitudes, indicators, tools and strategies

    OpenAIRE

    Bohdanowicz, Paulina

    2006-01-01

    Hotels constitute one of the main, and still expanding, pillars of the tourism sector and are highly unique among other commercial buildings. Resource intensive and frequently inefficient systems and operational routines applied in the sector, result in considerable environmental impact and indicate an urgent need for more environmentally sound practices and products in the hotel industry. A certain level of activity in the area of reducing resource use has been observed for quite some time b...

  15. Uranium resources evaluation model as an exploration tool

    International Nuclear Information System (INIS)

    Ruzicka, V.

    1976-01-01

    Evaluation of uranium resources, as conducted by the Uranium Resources Evaluation Section of the Geological Survey of Canada, comprises operations analogous with those performed during the preparatory stages of uranium exploration. The uranium resources evaluation model, simulating the estimation process, can be divided into four steps. The first step includes definition of major areas and ''unit subdivisions'' for which geological data are gathered, coded, computerized and retrieved. Selection of these areas and ''unit subdivisions'' is based on a preliminary appraisal of their favourability for uranium mineralization. The second step includes analyses of the data, definition of factors controlling uranium mineralization, classification of uranium occurrences into genetic types, and final delineation of favourable areas; this step corresponds to the selection of targets for uranium exploration. The third step includes geological field work; it is equivalent to geological reconnaissance in exploration. The fourth step comprises computation of resources; the preliminary evaluation techniques in the exploration are, as a rule, analogous with the simplest methods employed in the resource evaluation. The uranium resources evaluation model can be conceptually applied for decision-making during exploration or for formulation of exploration strategy using the quantified data as weighting factors. (author)

  16. Applications and Methods Utilizing the Simple Semantic Web Architecture and Protocol (SSWAP) for Bioinformatics Resource Discovery and Disparate Data and Service Integration

    Science.gov (United States)

    Scientific data integration and computational service discovery are challenges for the bioinformatic community. This process is made more difficult by the separate and independent construction of biological databases, which makes the exchange of scientific data between information resources difficu...

  17. Data access and decision tools for coastal water resources management

    Science.gov (United States)

    US EPA has supported the development of numerous models and tools to support implementation of environmental regulations. However, transfer of knowledge and methods from detailed technical models to support practical problem solving by local communities and watershed or coastal ...

  18. Resources for Indoor Air Quality Design Tools for Schools

    Science.gov (United States)

    The information available here is presented as a tool to help school districts and facility planners design the next generation of learning environments so that the school facility will help schools in achieving their core mission of educating children.

  19. Tools and resources for neuroanatomy education: a systematic review.

    Science.gov (United States)

    Arantes, M; Arantes, J; Ferreira, M A

    2018-05-03

    The aim of this review was to identify studies exploring neuroanatomy teaching tools and their impact on learning, as a basis towards the implementation of a neuroanatomy program in the context of a curricular reform in medical education. Computer-assisted searches were conducted through March 2017 in the PubMed, Web of Science, Medline, Current Contents Connect, KCI and Scielo Citation Index databases. Four sets of keywords were used, combining "neuroanatomy" with "education", "teaching", "learning" and "student*". Studies were reviewed independently by two readers, and data collected were confirmed by a third reader. Of the 214 studies identified, 29 studies reported data on the impact of using specific neuroanatomy teaching tools. Most of them (83%) were published in the last 8 years and were conducted in the United States of America (65.52%). Regarding the participants, medical students were the most studied sample (37.93%) and the majority of the studies (65.52%) had less than 100 participants. Approximately half of the studies included in this review used digital teaching tools (e.g., 3D computer neuroanatomy models), whereas the remaining used non-digital learning tools (e.g., 3D physical models). Our work highlights the progressive interest in the study of neuroanatomy teaching tools over recent years, as evidenced by the number of publications, and highlights the need to consider new tools, coping with technological development in medical education.

  20. Disability Rights, Gender, and Development: A Resource Tool for Action. Full Report

    Science.gov (United States)

    de Silva de Alwis, Rangita

    2008-01-01

    This resource tool builds a normative framework to examine the intersections of disability rights and gender in the human rights based approach to development. Through case studies, good practices and analyses the research tool makes recommendations and illustrates effective tools for the implementation of gender and disability sensitive laws,…

  1. Tools for Analyzing Computing Resource Management Strategies and Algorithms for SDR Clouds

    Science.gov (United States)

    Marojevic, Vuk; Gomez-Miguelez, Ismael; Gelonch, Antoni

    2012-09-01

    Software defined radio (SDR) clouds centralize the computing resources of base stations. The computing resource pool is shared between radio operators and dynamically loads and unloads digital signal processing chains for providing wireless communications services on demand. Each new user session request particularly requires the allocation of computing resources for executing the corresponding SDR transceivers. The huge amount of computing resources of SDR cloud data centers and the numerous session requests at certain hours of a day require an efficient computing resource management. We propose a hierarchical approach, where the data center is divided into clusters that are managed in a distributed way. This paper presents a set of computing resource management tools for analyzing computing resource management strategies and algorithms for SDR clouds. We use the tools for evaluating different strategies and algorithms. The results show that more sophisticated algorithms can achieve higher resource occupations and that a tradeoff exists between cluster size and algorithm complexity.
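
    The kind of allocation policy such a tool compares can be sketched in a few lines. The following is a hypothetical illustration (not the authors' simulator): a first-fit strategy that assigns each incoming SDR session to the first cluster with enough spare computing capacity; the cluster names, capacities and demands are made up for the example.

```python
# Hypothetical sketch (not the authors' tool): a first-fit strategy for
# assigning SDR transceiver sessions to processing clusters, of the kind a
# resource-management simulator might compare against smarter algorithms.

def first_fit(clusters, sessions):
    """clusters: dict cluster_id -> free capacity (e.g., MOPS);
    sessions: list of (session_id, demand). Returns allocation dict."""
    allocation = {}
    for session_id, demand in sessions:
        for cluster_id, free in clusters.items():
            if free >= demand:
                clusters[cluster_id] = free - demand
                allocation[session_id] = cluster_id
                break
        else:
            allocation[session_id] = None  # request rejected, no cluster fits
    return allocation

if __name__ == "__main__":
    clusters = {"c0": 100.0, "c1": 100.0}
    sessions = [("s0", 60.0), ("s1", 50.0), ("s2", 55.0)]
    print(first_fit(clusters, sessions))
    # {'s0': 'c0', 's1': 'c1', 's2': None} -> s2 is rejected under first-fit
```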

  2. An Overview of Bioinformatics Tools and Resources in Allergy.

    Science.gov (United States)

    Fu, Zhiyan; Lin, Jing

    2017-01-01

    The rapidly increasing number of characterized allergens has created huge demands for advanced information storage, retrieval, and analysis. Bioinformatics and machine learning approaches provide useful tools for the study of allergens and epitopes prediction, which greatly complement traditional laboratory techniques. The specific applications mainly include identification of B- and T-cell epitopes, and assessment of allergenicity and cross-reactivity. In order to facilitate the work of clinical and basic researchers who are not familiar with bioinformatics, we review in this chapter the most important databases, bioinformatic tools, and methods with relevance to the study of allergens.

  3. The Energy Industry Profile of ISO/DIS 19115-1: Facilitating Discovery and Evaluation of, and Access to Distributed Information Resources

    Science.gov (United States)

    Hills, S. J.; Richard, S. M.; Doniger, A.; Danko, D. M.; Derenthal, L.; Energistics Metadata Work Group

    2011-12-01

    A diverse group of organizations representative of the international community involved in disciplines relevant to the upstream petroleum industry (energy companies, suppliers and publishers of information to the energy industry, vendors of software applications used by the industry, and partner government and academic organizations) has engaged in the Energy Industry Metadata Standards Initiative. This Initiative envisions the use of standard metadata within the community to enable significant improvements in the efficiency with which users discover, evaluate, and access distributed information resources. The metadata standard needed to realize this vision is the initiative's primary deliverable. In addition to developing the metadata standard, the initiative is promoting its adoption to accelerate realization of the vision, and publishing metadata exemplars conformant with the standard. Implementation of the standard by community members, in the form of published metadata which document the information resources each organization manages, will allow use of tools requiring consistent metadata for efficient discovery and evaluation of, and access to, information resources. While metadata are expected to be widely accessible, access to associated information resources may be more constrained. The initiative is being conducted by Energistics' Metadata Work Group, in collaboration with the USGIN Project. Energistics is a global standards group in the oil and natural gas industry. The Work Group determined early in the initiative, based on input solicited from 40+ organizations and on an assessment of existing metadata standards, to develop the target metadata standard as a profile of a revised version of ISO 19115, formally the "Energy Industry Profile of ISO/DIS 19115-1 v1.0" (EIP). The Work Group is participating on the ISO/TC 211 project team responsible for the revision of ISO 19115, now ready for "Draft International Standard" (DIS) status. With ISO 19115 an

  4. Insects: an underrepresented resource for the discovery of biologically active natural products

    Directory of Open Access Journals (Sweden)

    Lauren Seabrooks

    2017-07-01

    Full Text Available Nature has been the source of life-changing and -saving medications for centuries. Aspirin, penicillin and morphine are prime examples of Nature's gifts to medicine. These discoveries catalyzed the field of natural product drug discovery which has mostly focused on plants. However, insects have more than twice the number of species and entomotherapy has been in practice for as long as and often in conjunction with medicinal plants and is an important alternative to modern medicine in many parts of the world. Herein, an overview of current traditional medicinal applications of insects and characterization of isolated biologically active molecules starting from approximately 2010 is presented. Insect natural products reviewed were isolated from ants, bees, wasps, beetles, cockroaches, termites, flies, true bugs, moths and more. Biological activities of these natural products from insects include antimicrobial, antifungal, antiviral, anticancer, antioxidant, anti-inflammatory and immunomodulatory effects.

  5. On Resource Description Capabilities of On-Board Tools for Resource Management in Cloud Networking and NFV Infrastructures

    OpenAIRE

    Tutschku, Kurt; Ahmadi Mehri, Vida; Carlsson, Anders; Chivukula, Krishna Varaynya; Johan, Christenson

    2016-01-01

    The rapid adoption of networks that are based on "cloudification" and Network Function Virtualisation (NFV) comes from the anticipated high cost savings of up to 70% in their build and operation. The high savings are founded on the use of general standard servers, instead of single-purpose hardware, and on efficient resource sharing through virtualisation concepts. In this paper, we discuss the capabilities of resource description of "on-board" tools, i.e. using standard Linux commands, to e...

  6. Usability Testing for e-Resource Discovery: How Students Find and Choose e-Resources Using Library Web Sites

    Science.gov (United States)

    Fry, Amy; Rich, Linda

    2011-01-01

    In early 2010, library staff at Bowling Green State University (BGSU) in Ohio designed and conducted a usability study of key parts of the library web site, focusing on the web pages generated by the library's electronic resources management system (ERM) that list and describe the library's databases. The goal was to discover how users find and…

  7. Metagenomics as a Tool for Enzyme Discovery: Hydrolytic Enzymes from Marine-Related Metagenomes.

    Science.gov (United States)

    Popovic, Ana; Tchigvintsev, Anatoly; Tran, Hai; Chernikova, Tatyana N; Golyshina, Olga V; Yakimov, Michail M; Golyshin, Peter N; Yakunin, Alexander F

    2015-01-01

    This chapter discusses metagenomics and its application for enzyme discovery, with a focus on hydrolytic enzymes from marine metagenomic libraries. With less than one percent of culturable microorganisms in the environment, metagenomics, or the collective study of community genetics, has opened up a rich pool of uncharacterized metabolic pathways, enzymes, and adaptations. This great untapped pool of genes provides the particularly exciting potential to mine for new biochemical activities or novel enzymes with activities tailored to peculiar sets of environmental conditions. Metagenomes also represent a huge reservoir of novel enzymes for applications in biocatalysis, biofuels, and bioremediation. Here we present the results of enzyme discovery for four enzyme activities, of particular industrial or environmental interest, including esterase/lipase, glycosyl hydrolase, protease and dehalogenase.

  8. The Spiral Discovery Network as an Automated General-Purpose Optimization Tool

    Directory of Open Access Journals (Sweden)

    Adam B. Csapo

    2018-01-01

    Full Text Available The Spiral Discovery Method (SDM) was originally proposed as a cognitive artifact for dealing with black-box models that are dependent on multiple inputs with nonlinear and/or multiplicative interaction effects. Besides directly helping to identify functional patterns in such systems, SDM also simplifies their control through its characteristic spiral structure. In this paper, a neural network-based formulation of SDM is proposed together with a set of automatic update rules that makes it suitable for both semiautomated and automated forms of optimization. The behavior of the generalized SDM model, referred to as the Spiral Discovery Network (SDN), and its applicability to nondifferentiable nonconvex optimization problems are elucidated through simulation. Based on the simulation, the case is made that its applicability would be worth investigating in all areas where the default approach of gradient-based backpropagation is used today.

  9. MobilomeFINDER: web-based tools for in silico and experimental discovery of bacterial genomic islands

    OpenAIRE

    Ou, Hong-Yu; He, Xinyi; Harrison, Ewan M.; Kulasekara, Bridget R.; Thani, Ali Bin; Kadioglu, Aras; Lory, Stephen; Hinton, Jay C. D.; Barer, Michael R.; Deng, Zixin; Rajakumar, Kumar

    2007-01-01

    MobilomeFINDER (http://mml.sjtu.edu.cn/MobilomeFINDER) is an interactive online tool that facilitates bacterial genomic island or ‘mobile genome’ (mobilome) discovery; it integrates the ArrayOme and tRNAcc software packages. ArrayOme utilizes a microarray-derived comparative genomic hybridization input data set to generate ‘inferred contigs’ produced by merging adjacent genes classified as ‘present’. Collectively these ‘fragments’ represent a hypothetical ‘microarray-visualized genome (MVG)’....

  10. Water footprint as a tool for integrated water resources management

    Science.gov (United States)

    Aldaya, Maite; Hoekstra, Arjen

    2010-05-01

    In a context where water resources are unevenly distributed and, in some regions precipitation and drought conditions are increasing, enhanced water management is a major challenge to final consumers, businesses, water resource users, water managers and policymakers in general. By linking a large range of sectors and issues, virtual water trade and water footprint analyses provide an appropriate framework to find potential solutions and contribute to a better management of water resources. The water footprint is an indicator of freshwater use that looks not only at direct water use of a consumer or producer, but also at the indirect water use. The water footprint of a product is the volume of freshwater used to produce the product, measured over the full supply chain. It is a multi-dimensional indicator, showing water consumption volumes by source and polluted volumes by type of pollution; all components of a total water footprint are specified geographically and temporally. The water footprint breaks down into three components: the blue (volume of freshwater evaporated from surface or groundwater systems), green (water volume evaporated from rainwater stored in the soil as soil moisture) and grey water footprint (the volume of polluted water associated with the production of goods and services). Closely linked to the concept of water footprint is that of virtual water trade, which represents the amount of water embedded in traded products. Many nations save domestic water resources by importing water-intensive products and exporting commodities that are less water intensive. National water saving through the import of a product can imply saving water at a global level if the flow is from sites with high to sites with low water productivity. Virtual water trade between nations and even continents could thus be used as an instrument to improve global water use efficiency and to achieve water security in water-poor regions of the world. The virtual water trade
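
    As a minimal illustration of how the three components combine, the sketch below sums blue, green and grey volumes over the steps of a supply chain to obtain a product's total water footprint; the step names and volumes are invented for the example.

```python
# Illustrative only: total water footprint of a product as the sum of blue,
# green and grey components aggregated over its supply-chain steps (m3/tonne).
# The step names and volumes below are made up for the example.

supply_chain = [
    {"step": "crop cultivation", "blue": 150.0, "green": 900.0, "grey": 60.0},
    {"step": "processing",       "blue": 25.0,  "green": 0.0,   "grey": 10.0},
]

totals = {c: sum(step[c] for step in supply_chain) for c in ("blue", "green", "grey")}
totals["total"] = sum(totals.values())
print(totals)  # {'blue': 175.0, 'green': 900.0, 'grey': 70.0, 'total': 1145.0}
```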

  11. Choosing Discovery: A Literature Review on the Selection and Evaluation of Discovery Layers

    Science.gov (United States)

    Moore, Kate B.; Greene, Courtney

    2012-01-01

    Within the next few years, traditional online public access catalogs will be replaced by more robust and interconnected discovery layers that can serve as primary public interfaces to simultaneously search many separate collections of resources. Librarians have envisioned this type of discovery tool since the 1980s, and research shows that…

  12. Reducing traffic in DHT-based discovery protocols for dynamic resources

    Science.gov (United States)

    Carlini, Emanuele; Coppola, Massimo; Laforenza, Domenico; Ricci, Laura

    Existing peer-to-peer approaches for resource location based on distributed hash tables focus mainly on optimizing lookup query resolution. The underlying assumption is that the arrival ratio of lookup queries is higher than the ratio of resource publication operations. We propose a set of optimization strategies to reduce the network traffic generated by the data publication and update process when resources have dynamic-valued attributes. We aim at reducing the publication overhead of supporting multi-attribute range queries. We develop a model predicting the bandwidth reduction, and we assign proper values to the model variables on the basis of real data measurements. We further validate these results by a set of simulations. Our experiments are designed to reproduce the typical behaviour of the resulting scheme within a large distributed resource location system, like the resource location service of the XtreemOS Grid-enabled Operating System.
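
    One way to picture the kind of publication-traffic reduction discussed above is the following hypothetical sketch (not the authors' protocol): a node re-publishes a dynamic attribute only when its value crosses into a different range bucket, so small fluctuations cause no DHT updates.

```python
# Hypothetical illustration (not the authors' protocol): suppress DHT
# re-publication of a dynamic attribute unless its value moves into a
# different range bucket, so small fluctuations generate no network traffic.

BUCKET_WIDTH = 10  # e.g., free-memory buckets of 10%

def bucket(value):
    return int(value // BUCKET_WIDTH)

class Publisher:
    def __init__(self):
        self.last_bucket = None
        self.messages_sent = 0

    def update(self, value):
        b = bucket(value)
        if b != self.last_bucket:      # publish only on a bucket change
            self.last_bucket = b
            self.messages_sent += 1

p = Publisher()
for v in [42, 44, 47, 51, 53, 49, 48]:
    p.update(v)
print(p.messages_sent)  # 3 publications instead of 7 raw updates
```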

  13. Host-Brucella interactions and the Brucella genome as tools for subunit antigen discovery and immunization against brucellosis

    Science.gov (United States)

    Gomez, Gabriel; Adams, Leslie G.; Rice-Ficht, Allison; Ficht, Thomas A.

    2013-01-01

    Vaccination is the most important approach to counteract infectious diseases. Thus, the development of new and improved vaccines for existing, emerging, and re-emerging diseases is an area of great interest to the scientific community and general public. Traditional approaches to subunit antigen discovery and vaccine development lack consideration for the critical aspects of public safety and activation of relevant protective host immunity. The availability of genomic sequences for pathogenic Brucella spp. and their hosts has led to the development of systems-wide analytical tools that have provided a better understanding of host and pathogen physiology while also beginning to unravel the intricacies at the host-pathogen interface. Advances in pathogen biology, host immunology, and host-agent interactions have the potential to serve as a platform for the design and implementation of better-targeted antigen discovery approaches. With emphasis on Brucella spp., we probe the biological aspects of host and pathogen that merit consideration in the targeted design of subunit antigen discovery and vaccine development. PMID:23720712

  14. Drive Cost Reduction, Increase Innovation and Mitigate Risk with Advanced Knowledge Discovery Tools Designed to Unlock and Leverage Prior Knowledge

    International Nuclear Information System (INIS)

    Mitchell, I.

    2016-01-01

    Full text: The nuclear industry is knowledge-intensive and includes a diverse number of stakeholders. Much of this knowledge is at risk as engineers, technicians and project professionals retire, leaving a widening skills and information gap. This knowledge is critical in an increasingly complex environment, with information from past projects often buried in decades-old, non-integrated enterprise systems. Engineers can spend 40% or more of their time searching for answers across the enterprise instead of solving problems. The inability to access trusted industry knowledge results in increased risk and expense. Advanced knowledge discovery technologies slash research times by as much as 75% and accelerate innovation and problem solving by giving technical professionals access to the information they need, in the context of the problems they are trying to solve. Unlike traditional knowledge management approaches, knowledge discovery tools powered by semantic search technologies are adept at uncovering answers in unstructured data and require no tagging, organization or moving of data, meaning a smaller IT footprint and faster time-to-knowledge. This session will highlight best-in-class knowledge discovery technologies, content, and strategies to give nuclear industry organizations the ability to leverage the corpus of enterprise knowledge into the future. (author)

  15. Gene2Function: An Integrated Online Resource for Gene Function Discovery

    Directory of Open Access Journals (Sweden)

    Yanhui Hu

    2017-08-01

    Full Text Available One of the most powerful ways to develop hypotheses regarding the biological functions of conserved genes in a given species, such as humans, is to first look at what is known about their function in another species. Model organism databases and other resources are rich with functional information but difficult to mine. Gene2Function addresses a broad need by integrating information about conserved genes in a single online resource.

  16. eagle-i: An Ontology-Driven Framework For Biomedical Resource Curation And Discovery

    OpenAIRE

    Erik Segerdell; Melanie L. Wilson; Ted Bashor; Daniela Bourges-Waldegg; Karen Corday; H. Robert Frost; Tenille Johnson; Christopher J. Shaffer; Larry Stone; Carlo Torniai; Melissa A. Haendel

    2010-01-01

    The eagle-i Consortium (http://www.eagle-i.org/home) comprises nine geographically and ethnically diverse universities across America working to build a federated network of research resources. Biomedical research generates many resources that are rarely shared or published, including: reagents, protocols, instruments, expertise, organisms, training opportunities, software, human studies, and biological specimens. The goal of eagle-i is to improve biomedical r...

  17. Cyclic Investigation of Geophysical Studies in the Exploration and Discovery of Natural Resources in Our Country

    International Nuclear Information System (INIS)

    Gonulalan, A. U.

    2007-01-01

    Although the methods of exploration geophysics were first utilized after the discovery of an oil field in 1921, they had also been applied in earlier centuries. Indeed, half of the total production in the United States of America comes from new oil fields discovered by utilizing geophysical methods. The industry's need for energy increases the interest in oil. The investments in geophysics made by companies that earn large amounts of money by discovering new oil fields, the widespread use of computers, the developments in space technology and the world-wide nuclear competition, despite its great danger for human beings, have all played a large part in the development of geophysics. Our country has 18 different types of mines with a potential of more than 10 billion $. Geophysical engineers have contributed great knowledge and labor to the discovery of 1,795 trillion in wealth, from borax to building stone, and of 60 billion $ in oil and gas. On the other hand, as the 1.5 billion investment in the field of geophysics is only 0.08% of total investments, increasing these investments will add further contributions.

  18. Picking the Best from the All-Resources Menu: Advanced Tools for Resource Planning

    Energy Technology Data Exchange (ETDEWEB)

    Palmintier, Bryan S [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-01-31

    Introduces the wide range of electric power systems modeling types and associated questions they can help answer. The presentation focusses on modeling needs for high levels of Distributed Energy Resources (DERs), renewables, and inverter-based technologies as alternatives to traditional centralized power systems. Covers Dynamics, Production Cost/QSTS, Metric Assessment, Resource Planning, and Integrated Simulations with examples drawn from NREL's past and on-going projects. Presented at the McKnight Foundation workshop on 'An All-Resources Approach to Planning for a More Dynamic, Low-Carbon Grid' exploring grid modernization options to replace retiring coal plants in Minnesota.

  19. High-efficiency combinatorial approach as an effective tool for accelerating metallic biomaterials research and discovery

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, X.D. [School of Material Science and Engineering, Central South University, Changsha, Hunan, 410083 (China); Liu, L.B., E-mail: lbliu.csu@gmail.com [School of Material Science and Engineering, Central South University, Changsha, Hunan, 410083 (China); State Key Laboratory for Powder Metallurgy, Changsha, Hunan, 410083 (China); Zhao, J.-C. [State Key Laboratory for Powder Metallurgy, Changsha, Hunan, 410083 (China); Department of Materials Science and Engineering, The Ohio State University, 2041 College Road, Columbus, OH 43210 (United States); Wang, J.L.; Zheng, F.; Jin, Z.P. [School of Material Science and Engineering, Central South University, Changsha, Hunan, 410083 (China)

    2014-06-01

    A high-efficiency combinatorial approach has been applied to rapidly build the database of composition-dependent elastic modulus and hardness of the Ti–Ta and Ti–Zr–Ta systems. A diffusion multiple of the Ti–Zr–Ta system was manufactured, then annealed at 1173 K for 1800 h, and water quenched to room temperature. Extensive interdiffusion among Ti, Zr and Ta has taken place. Combining nanoindentation and electron probe micro-analysis (EPMA), the elastic modulus, hardness as well as composition across the diffusion multiple were determined. The composition/elastic modulus/hardness relationship of the Ti–Ta and Ti–Zr–Ta alloys has been obtained. It was found that the elastic modulus and hardness depend strongly on the Ta and Zr content. The result can be used to accelerate the discovery/development of bio-titanium alloys for different components in implant prosthesis. - Highlights: • High-efficiency diffusion multiple of Ti–Zr–Ta was manufactured. • Composition-dependent elastic modulus and hardness of the Ti–Ta and Ti–Zr–Ta systems have been obtained effectively, • The methodology and the information can be used to accelerate the discovery/development of bio-titanium alloys.

  20. Aligning Web-Based Tools to the Research Process Cycle: A Resource for Collaborative Research Projects

    Science.gov (United States)

    Price, Geoffrey P.; Wright, Vivian H.

    2012-01-01

    Using John Creswell's Research Process Cycle as a framework, this article describes various web-based collaborative technologies useful for enhancing the organization and efficiency of educational research. Visualization tools (Cacoo) assist researchers in identifying a research problem. Resource storage tools (Delicious, Mendeley, EasyBib)…

  1. Situational Awareness Analysis Tools for Aiding Discovery of Security Events and Patterns

    National Research Council Canada - National Science Library

    Kumar, Vipin; Kim, Yongdae; Srivastava, Jaideep; Zhang, Zhi-Li; Shaneck, Mark; Chandola, Varun; Liu, Haiyang; Choi, Changho; Simon, Gyorgy; Eilertson, Eric

    2005-01-01

    .... The University of Minnesota team has developed a comprehensive, multi-stage analysis framework which provides tools and analysis methodologies to aid cyber security analysts in improving the quality...

  2. Evaluation Framework and Tools for Distributed Energy Resources

    Energy Technology Data Exchange (ETDEWEB)

    Gumerman, Etan Z.; Bharvirkar, Ranjit R.; LaCommare, Kristina Hamachi; Marnay , Chris

    2003-02-01

    The Energy Information Administration's (EIA) 2002 Annual Energy Outlook (AEO) forecast anticipates the need for 375 MW of new generating capacity (or about one new power plant) per week for the next 20 years, most of which is forecast to be fueled by natural gas. The Distributed Energy and Electric Reliability Program (DEER) of the Department of Energy (DOE), has set a national goal for DER to capture 20 percent of new electric generation capacity additions by 2020 (Office of Energy Efficiency and Renewable Energy 2000). Cumulatively, this amounts to about 40 GW of DER capacity additions from 2000-2020. Figure ES-1 below compares the EIA forecast and DEER's assumed goal for new DER by 2020 while applying the same definition of DER to both. This figure illustrates that the EIA forecast is consistent with the overall DEER DER goal. For the purposes of this study, Berkeley Lab needed a target level of small-scale DER penetration upon which to hinge consideration of benefits and costs. Because the AEO2002 forecasted only 3.1 GW of cumulative additions from small-scale DER in the residential and commercial sectors, another approach was needed to estimate the small-scale DER target. The focus here is on small-scale DER technologies under 500 kW. The technology size limit is somewhat arbitrary, but the key results of interest are marginal additional costs and benefits around an assumed level of penetration that existing programs might achieve. Berkeley Lab assumes that small-scale DER has the same growth potential as large scale DER in AEO2002, about 38 GW. This assumption makes the small-scale goal equivalent to 380,000 DER units of average size 100 kW. This report lays out a framework whereby the consequences of meeting this goal might be estimated and tallied up. The framework is built around a list of major benefits and a set of tools that might be applied to estimate them. This study lists some of the major effects of an emerging paradigm shift away from

  3. Resource Discovery for Extreme Scale Collaboration (RDESC) Final Report - RPI/TWC - Year 3

    Energy Technology Data Exchange (ETDEWEB)

    Fox, Peter [Rensselaer Polytechnic Inst., Troy, NY (United States)

    2015-05-30

    The amount of data produced in the practice of science is growing rapidly. Despite the accumulation and demand for scientific data, relatively little is actually made available for the broader scientific community. We surmise that the root of the problem is the perceived difficulty of electronically publishing scientific data and associated metadata in a way that makes them discoverable. We propose to exploit Semantic Web technologies and practices to make (meta)data discoverable and easy to publish. We share our experiences in curating metadata to illustrate both the flexibility of our approach and the pain of discovering data in the current research environment. We also make recommendations, by concrete example, of how data publishers can provide their (meta)data by adding some limited, additional markup to HTML pages on the Web. With little additional effort from data publishers, the difficulty of data discovery/access/sharing can be greatly reduced and the impact of research data greatly enhanced.
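
    A minimal sketch of the general idea of making (meta)data discoverable with limited extra markup is shown below: it embeds a machine-readable description in an ordinary HTML landing page. The JSON-LD/schema.org Dataset vocabulary and all field values are assumptions for illustration and may differ from the markup the report actually recommends.

```python
# A minimal sketch of the general idea: embed machine-readable dataset
# metadata in an ordinary web page so crawlers can discover it. The JSON-LD /
# schema.org Dataset vocabulary used here is an assumption for illustration;
# the report's actual markup recommendations may differ.
import json

metadata = {
    "@context": "https://schema.org",
    "@type": "Dataset",
    "name": "Example simulation output",          # hypothetical dataset
    "description": "Hourly output of an example climate run.",
    "url": "https://example.org/data/run-42",     # placeholder URL
    "keywords": ["climate", "simulation"],
}

html = f"""<html><head>
<script type="application/ld+json">
{json.dumps(metadata, indent=2)}
</script>
</head><body>Landing page for the dataset.</body></html>"""
print(html)
```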

  4. GOrilla: a tool for discovery and visualization of enriched GO terms in ranked gene lists

    Directory of Open Access Journals (Sweden)

    Steinfeld Israel

    2009-02-01

    Full Text Available Abstract Background Since the inception of the GO annotation project, a variety of tools have been developed that support exploring and searching the GO database. In particular, a variety of tools that perform GO enrichment analysis are currently available. Most of these tools require as input a target set of genes and a background set and seek enrichment in the target set compared to the background set. A few tools also exist that support analyzing ranked lists. The latter typically rely on simulations or on union-bound correction for assigning statistical significance to the results. Results GOrilla is a web-based application that identifies enriched GO terms in ranked lists of genes, without requiring the user to provide explicit target and background sets. This is particularly useful in many typical cases where genomic data may be naturally represented as a ranked list of genes (e.g. by level of expression or of differential expression). GOrilla employs a flexible threshold statistical approach to discover GO terms that are significantly enriched at the top of a ranked gene list. Building on a complete theoretical characterization of the underlying distribution, called mHG, GOrilla computes an exact p-value for the observed enrichment, taking threshold multiple testing into account without the need for simulations. This enables rigorous statistical analysis of thousands of genes and thousands of GO terms in a matter of seconds. The output of the enrichment analysis is visualized as a hierarchical structure, providing a clear view of the relations between enriched GO terms. Conclusion GOrilla is an efficient GO analysis tool with unique features that make it a useful addition to the existing repertoire of GO enrichment tools. GOrilla's unique features and advantages over other threshold free enrichment tools include rigorous statistics, fast running time and an effective graphical representation. GOrilla is publicly available at: http://cbl-gorilla.cs.technion.ac.il
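
    The flexible-threshold idea can be illustrated with a simplified sketch: scan cutoffs from the top of the ranked list and record the best hypergeometric tail probability. Note that GOrilla computes an exact mHG p-value that corrects for scanning many thresholds; the sketch below deliberately omits that correction.

```python
# Simplified illustration of the flexible-threshold idea behind mHG: scan
# cutoffs near the top of a ranked gene list and record the best
# hypergeometric tail probability. GOrilla additionally computes an *exact*
# corrected mHG p-value, which this sketch does not attempt.
from scipy.stats import hypergeom

def best_threshold_enrichment(ranked_is_annotated):
    """ranked_is_annotated: list of 0/1 flags, genes ordered best-first,
    1 = gene carries the GO term of interest."""
    N = len(ranked_is_annotated)          # total genes
    B = sum(ranked_is_annotated)          # annotated genes overall
    best_p, best_n = 1.0, 0
    b = 0                                 # annotated genes above the cutoff
    for n in range(1, N + 1):
        b += ranked_is_annotated[n - 1]
        p = hypergeom.sf(b - 1, N, B, n)  # P(X >= b) at this cutoff
        if p < best_p:
            best_p, best_n = p, n
    return best_p, best_n

flags = [1, 1, 0, 1, 0, 0, 0, 0, 0, 0]
print(best_threshold_enrichment(flags))
```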

  5. Mobile Cloud Computing: Resource Discovery, Session Connectivity and Other Open Issues

    NARCIS (Netherlands)

    Schüring, Markus; Karagiannis, Georgios

    2011-01-01

    Abstract—Cloud computing can be considered as a model that provides network access to a shared pool of resources, such as storage and computing power, which can be rapidly provisioned and released with minimal management effort. This paper describes a research activity in the area of mobile cloud

  6. BEST: Next-Generation Biomedical Entity Search Tool for Knowledge Discovery from Biomedical Literature.

    Directory of Open Access Journals (Sweden)

    Sunwon Lee

    Full Text Available As the volume of publications rapidly increases, searching for relevant information from the literature becomes more challenging. To complement standard search engines such as PubMed, it is desirable to have an advanced search tool that directly returns relevant biomedical entities such as targets, drugs, and mutations rather than a long list of articles. Some existing tools submit a query to PubMed and process retrieved abstracts to extract information at query time, resulting in a slow response time and limited coverage of only a fraction of the PubMed corpus. Other tools preprocess the PubMed corpus to speed up the response time; however, they are not constantly updated, and thus produce outdated results. Further, most existing tools cannot process sophisticated queries such as searches for mutations that co-occur with query terms in the literature. To address these problems, we introduce BEST, a biomedical entity search tool. BEST returns, as a result, a list of 10 different types of biomedical entities including genes, diseases, drugs, targets, transcription factors, miRNAs, and mutations that are relevant to a user's query. To the best of our knowledge, BEST is the only system that processes free text queries and returns up-to-date results in real time including mutation information in the results. BEST is freely accessible at http://best.korea.ac.kr.

  7. Moving Forward: The Next-Gen Catalog and the New Discovery Tools

    Science.gov (United States)

    Weare, William H., Jr.; Toms, Sue; Breeding, Marshall

    2011-01-01

    Do students prefer to use Google instead of the library catalog? Ever wondered why? Google is easier to use and delivers plenty of "good enough" resources to meet their needs. The current generation of online catalogs has two main problems. First, the look and feel of the interface doesn't reflect the conventions adhered to elsewhere on the web,…

  8. Discovery and Mash-up of Physical Resources through a Web of Things Architecture

    OpenAIRE

    Mainetti, Luca; Mighali, Vincenzo; Patrono, Luigi; Rametta, Piercosimo

    2014-01-01

    The Internet of Things has focused on new systems, the so-called smart things, to integrate the physical world with the virtual world by exploiting the network architecture of the Internet. However, defining applications on top of smart things is mainly reserved to system experts, since it requires a thorough knowledge of hardware platforms and some specific programming languages. Furthermore, a common infrastructure to publish and share resource information is also needed. In this paper, we ...

  9. The Evolution of DNA-Templated Synthesis as a Tool for Materials Discovery.

    Science.gov (United States)

    O'Reilly, Rachel K; Turberfield, Andrew J; Wilks, Thomas R

    2017-10-17

    system that can translate instructions coded as a sequence of DNA bases into a chemical structure, a process analogous to the action of the ribosome in living organisms but with the potential to create a much more chemically diverse set of products. It is also possible to ensure that each product molecule is tagged with its identifying DNA sequence. Compound libraries synthesized in this way can be exposed to selection against suitable targets, enriching successful molecules. The encoding DNA can then be amplified using the polymerase chain reaction and decoded by DNA sequencing. More importantly, the DNA instruction sequences can be mutated and reused during multiple rounds of amplification, translation, and selection. In other words, DTS could be used as the foundation for a system of synthetic molecular evolution, which could allow us to efficiently search a vast chemical space. This has huge potential to revolutionize materials discovery: imagine being able to evolve molecules for light harvesting, or catalysts for CO2 fixation. The field of DTS has developed to the point where a wide variety of reactions can be performed on a DNA template. Complex architectures and autonomous "DNA robots" have been implemented for the controlled assembly of BBs, and these mechanisms have in turn enabled the one-pot synthesis of large combinatorial libraries. Indeed, DTS libraries are being exploited by pharmaceutical companies and have already found their way into drug lead discovery programs. This Account explores the processes involved in DTS and highlights the challenges that remain in creating a general system for molecular discovery by evolution.

  10. Continuous flow chemistry: a discovery tool for new chemical reactivity patterns.

    Science.gov (United States)

    Hartwig, Jan; Metternich, Jan B; Nikbin, Nikzad; Kirschning, Andreas; Ley, Steven V

    2014-06-14

    Continuous flow chemistry as a process intensification tool is well known. However, its ability to enable chemists to perform reactions which are not possible in batch is less well studied or understood. Here we present an example, where a new reactivity pattern and extended reaction scope has been achieved by transferring a reaction from batch mode to flow. This new reactivity can be explained by suppressing back mixing and precise control of temperature in a flow reactor set up.

  11. Continuous flow chemistry: a discovery tool for new chemical reactivity patterns

    OpenAIRE

    Hartwig, Jan; Metternich, Jan B.; Nikbin, Nikzad; Kirschning, Andreas; Ley, Steven V.

    2014-01-01

    Continuous flow chemistry as a process intensification tool is well known. However, its ability to enable chemists to perform reactions which are not possible in batch is less well studied or understood. Here we present an example, where a new reactivity pattern and extended reaction scope has been achieved by transferring a reaction from batch mode to flow. This new reactivity can be explained by suppressing back mixing and precise control of temperature in a flow reactor set up.

  12. Virtual screening methods as tools for drug lead discovery from large chemical libraries.

    Science.gov (United States)

    Ma, X H; Zhu, F; Liu, X; Shi, Z; Zhang, J X; Yang, S Y; Wei, Y Q; Chen, Y Z

    2012-01-01

    Virtual screening methods have been developed and explored as useful tools for searching drug lead compounds from chemical libraries, including large libraries that have become publicly available. In this review, we discussed the new developments in exploring virtual screening methods for enhanced performance in searching large chemical libraries, their applications in screening libraries of ~ 1 million or more compounds in the last five years, the difficulties in their applications, and the strategies for further improving these methods.

  13. Distributed Information Search and Retrieval for Astronomical Resource Discovery and Data Mining

    Science.gov (United States)

    Murtagh, Fionn; Guillaume, Damien

    Information search and retrieval has become by nature a distributed task. We look at tools and techniques which are of importance in this area. Current technological evolution can be summarized as the growing stability and cohesiveness of distributed architectures of searchable objects. The objects themselves are more often than not multimedia, including published articles or grey literature reports, yellow page services, image data, catalogs, presentation and online display materials, and ``operations'' information such as scheduling and publicly accessible proposal information. The evolution towards distributed architectures, protocols and formats, and the direction of our own work, are focussed on in this paper.

  14. Resource Planning Model: An Integrated Resource Planning and Dispatch Tool for Regional Electric Systems

    Energy Technology Data Exchange (ETDEWEB)

    Mai, T.; Drury, E.; Eurek, K.; Bodington, N.; Lopez, A.; Perry, A.

    2013-01-01

    This report introduces a new capacity expansion model, the Resource Planning Model (RPM), with high spatial and temporal resolution that can be used for mid- and long-term scenario planning of regional power systems. Although RPM can be adapted to any geographic region, the report describes an initial version of the model adapted for the power system in Colorado. It presents examples of scenario results from the first version of the model, including an example of a 30%-by-2020 renewable electricity penetration scenario.

  15. Prototype Development of a Tradespace Analysis Tool for Spaceflight Medical Resources.

    Science.gov (United States)

    Antonsen, Erik L; Mulcahy, Robert A; Rubin, David; Blue, Rebecca S; Canga, Michael A; Shah, Ronak

    2018-02-01

    The provision of medical care in exploration-class spaceflight is limited by mass, volume, and power constraints, as well as limitations of available skillsets of crewmembers. A quantitative means of exploring the risks and benefits of inclusion or exclusion of onboard medical capabilities may help to inform the development of an appropriate medical system. A pilot project was designed to demonstrate the utility of an early tradespace analysis tool for identifying high-priority resources geared toward properly equipping an exploration mission medical system. Physician subject matter experts identified resources, tools, and skillsets required, as well as associated criticality scores of the same, to meet terrestrial, U.S.-specific ideal medical solutions for conditions of concern for exploration-class spaceflight. A database of diagnostic and treatment actions and resources was created based on this input and weighed against the probabilities of mission-specific medical events to help identify common and critical elements needed in a future exploration medical capability. Analysis of repository data demonstrates the utility of a quantitative method of comparing various medical resources and skillsets for future missions. Directed database queries can provide detailed comparative estimates concerning likelihood of resource utilization within a given mission and the weighted utility of tangible and intangible resources. This prototype tool demonstrates one quantitative approach to the complex needs and limitations of an exploration medical system. While this early version identified areas for refinement in future version development, more robust analysis tools may help to inform the development of a comprehensive medical system for future exploration missions. Antonsen EL, Mulcahy RA, Rubin D, Blue RS, Canga MA, Shah R. Prototype development of a tradespace analysis tool for spaceflight medical resources. Aerosp Med Hum Perform. 2018; 89(2):108-114.
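
    A toy version of the kind of weighting such a repository query can perform is sketched below (not the actual tool): each candidate resource is scored by summing, over the conditions it supports, its criticality score times the per-mission probability of that condition; all names and numbers are invented.

```python
# Hypothetical sketch (not the NASA tool): score candidate medical resources
# by summing criticality x per-mission probability over the conditions each
# resource supports. All condition names, probabilities and scores are made up.
conditions = {"kidney stone": 0.05, "skin laceration": 0.60}  # P(event per mission)

resources = {
    "ultrasound unit": {"kidney stone": 5},                   # criticality 1-5
    "suture kit":      {"skin laceration": 3},
    "analgesics":      {"kidney stone": 4, "skin laceration": 2},
}

def weighted_utility(uses):
    return sum(conditions[c] * score for c, score in uses.items())

for name in sorted(resources, key=lambda r: weighted_utility(resources[r]), reverse=True):
    print(f"{name}: {weighted_utility(resources[name]):.2f}")
# suture kit: 1.80, analgesics: 1.40, ultrasound unit: 0.25
```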

  16. Advances in Hydrogeochemical Indicators for the Discovery of New Geothermal Resources in the Great Basin, USA

    Energy Technology Data Exchange (ETDEWEB)

    Simmons, Stuart F. [Colorado School of Mines, Golden, CO (United States). Geology and Geological Engineering; Spycher, Nicolas [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Earth Sciences Division; Sonnenthal, Eric [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Earth Sciences Division; Dobson, Patrick [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Earth Sciences Division

    2013-05-20

    This report summarizes the results of Phase I work for a go/no go decision on Phase II funding. In the first objective, we assessed the extent to which fluid-mineral equilibria controlled deep water compositions in geothermal systems across the Great Basin. Six systems were evaluated: Beowawe; Desert Peak; Dixie Valley; Mammoth; Raft River; Roosevelt. These represent a geographic spread of geothermal resources, in different geological settings and with a wide range of fluid compositions. The results were used for calibration/reformulation of chemical geothermometers that reflect the reservoir temperatures in producing reservoirs. In the second objective, we developed a reactive -transport model of the Desert Peak hydrothermal system to evaluate the processes that affect reservoir fluid geochemistry and its effect on solute geothermometry. This included testing geothermometry on “reacted” thermal water originating from different lithologies and from near-surface locations where the temperature is known from the simulation. The integrated multi-component geothermometer (GeoT, relying on computed mineral saturation indices) was tested against the model results and also on the systems studied in the first objective.

  17. Rough Sets as a Knowledge Discovery and Classification Tool for the Diagnosis of Students with Learning Disabilities

    Directory of Open Access Journals (Sweden)

    Yu-Chi Lin

    2011-02-01

    Full Text Available Due to the implicit characteristics of learning disabilities (LDs), the diagnosis of students with learning disabilities has long been a difficult issue. Artificial intelligence techniques like artificial neural networks (ANN) and support vector machines (SVM) have been applied to the LD diagnosis problem with satisfactory outcomes. However, special education teachers or professionals tend to be skeptical of these kinds of black-box predictors. In this study, we apply the rough set theory (RST), which can not only perform as a classifier, but may also produce meaningful explanations or rules, to the LD diagnosis application. Our experiments indicate that the RST approach is competitive as a tool for feature selection, and it performs better in terms of prediction accuracy than other rule-based algorithms such as decision tree and ripper algorithms. We also propose to mix samples collected from sources with different LD diagnosis procedures and criteria. By pre-processing these mixed samples with simple and readily available clustering algorithms, we are able to improve the quality and support of rules generated by the RST. Overall, our study shows that the rough set approach, as a classification and knowledge discovery tool, may have great potential in playing an essential role in LD diagnosis.
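
    The core rough-set construct used by such a classifier can be shown in miniature: the lower and upper approximations of the "LD" class under the indiscernibility relation induced by the chosen condition attributes. The toy table below is invented and is not the study's data.

```python
# Minimal rough-set sketch (toy data, not the study's dataset): lower and
# upper approximations of the class "LD" under the indiscernibility relation
# induced by the chosen condition attributes.
from collections import defaultdict

# student -> (condition attribute values, decision)
table = {
    "s1": (("low", "poor"), "LD"),
    "s2": (("low", "poor"), "noLD"),   # indiscernible from s1 -> boundary case
    "s3": (("high", "good"), "noLD"),
    "s4": (("low", "good"), "LD"),
}

blocks = defaultdict(set)              # equivalence classes of indiscernible students
for student, (attrs, _) in table.items():
    blocks[attrs].add(student)

target = {s for s, (_, d) in table.items() if d == "LD"}
lower = set().union(*(b for b in blocks.values() if b <= target))   # certainly LD
upper = set().union(*(b for b in blocks.values() if b & target))    # possibly LD
print("lower:", lower)   # {'s4'}
print("upper:", upper)   # {'s1', 's2', 's4'}
```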

  18. Discovery and publishing of primary biodiversity data associated with multimedia resources: The Audubon Core strategies and approaches

    Directory of Open Access Journals (Sweden)

    Robert A Morris

    2013-07-01

    Full Text Available The Audubon Core Multimedia Resource Metadata Schema is a representation-free vocabulary for the description of biodiversity multimedia resources and collections, now in the final stages as a proposed Biodiversity Informatics Standards (TDWG) standard. By defining only six terms as mandatory, it seeks to lighten the burden for providing or using multimedia useful for biodiversity science. At the same time it offers rich optional metadata terms that can help curators of multimedia collections provide authoritative media that document species occurrence, ecosystems, identification tools, ontologies, and many other kinds of biodiversity documents or data. About half of the vocabulary is re-used from other relevant controlled vocabularies that are often already in use for multimedia metadata, thereby reducing the mapping burden on existing repositories. A central design goal is to allow consuming applications to have a high likelihood of discovering suitable resources, reducing the human examination effort that might be required to decide if the resource is fit for the purpose of the application.

  19. Forward-backward asymmetry as a discovery tool for Z′ bosons at the LHC

    International Nuclear Information System (INIS)

    Accomando, Elena; Belyaev, Alexander; Fiaschi, Juri; Mimasu, Ken; Moretti, Stefano; Shepherd-Themistocleous, Claire

    2016-01-01

    The Forward-Backward Asymmetry (AFB) in Z′ physics is commonly only perceived as the observable which possibly allows one to interpret a Z′ signal appearing in the Drell-Yan channel by distinguishing different models of such (heavy) spin-1 bosons. In this paper, we revisit this issue, showing that the absence of any di-lepton rapidity cut, which is commonly used in the literature, can enhance the potential of the observable at the LHC. We moreover examine the ability of AFB in setting bounds on or even discovering a Z′ at the Large Hadron Collider (LHC) concluding that it may be a powerful tool for this purpose. We analyse two different scenarios: Z′-bosons with a narrow and wide width, respectively. We find that, in the first case, the significance of the AFB search can be comparable with that of the ‘bump’ search usually adopted by the experimental collaborations; however, in being a ratio of (differential) cross sections, the AFB has the advantage of reducing experimental systematics as well as theoretical errors due to PDF uncertainties. In the second case, the AFB search can outperform the bump search in terms of differential shape, meaning the AFB distribution may be better suited for new broad resonances than the event counting strategy usually adopted in such cases.
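
    For reference, a standard definition of the observable (common convention; the paper is assumed to use the same one) in terms of forward and backward event counts, or equivalently differential cross sections, with respect to the di-lepton system:

```latex
% Forward-backward asymmetry (common convention; assumed to match the paper's usage)
A_{FB} \;=\; \frac{N_F - N_B}{N_F + N_B}
       \;=\; \frac{\sigma(\cos\theta^{*} > 0) \;-\; \sigma(\cos\theta^{*} < 0)}
                  {\sigma(\cos\theta^{*} > 0) \;+\; \sigma(\cos\theta^{*} < 0)},
```

    where theta* is the lepton angle relative to the quark direction in the di-lepton rest frame; being a ratio, many experimental systematics and PDF uncertainties partially cancel, which is the property the abstract exploits.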

  20. Forward-backward asymmetry as a discovery tool for Z' bosons at the LHC

    Science.gov (United States)

    Accomando, Elena; Belyaev, Alexander; Fiaschi, Juri; Mimasu, Ken; Moretti, Stefano; Shepherd-Themistocleous, Claire

    2016-01-01

    The Forward-Backward Asymmetry (AFB) in Z' physics is commonly only perceived as the observable which possibly allows one to interpret a Z' signal appearing in the Drell-Yan channel by distinguishing different models of such (heavy) spin-1 bosons. In this paper, we revisit this issue, showing that the absence of any di-lepton rapidity cut, which is commonly used in the literature, can enhance the potential of the observable at the LHC. We moreover examine the ability of AFB in setting bounds on or even discovering a Z' at the Large Hadron Collider (LHC) concluding that it may be a powerful tool for this purpose. We analyse two different scenarios: Z'-bosons with a narrow and wide width, respectively. We find that, in the first case, the significance of the AFB search can be comparable with that of the `bump' search usually adopted by the experimental collaborations; however, in being a ratio of (differential) cross sections, the AFB has the advantage of reducing experimental systematics as well as theoretical errors due to PDF uncertainties. In the second case, the AFB search can outperform the bump search in terms of differential shape, meaning the AFB distribution may be better suited for new broad resonances than the event counting strategy usually adopted in such cases.

  1. MobilomeFINDER: web-based tools for in silico and experimental discovery of bacterial genomic islands

    Science.gov (United States)

    Ou, Hong-Yu; He, Xinyi; Harrison, Ewan M.; Kulasekara, Bridget R.; Thani, Ali Bin; Kadioglu, Aras; Lory, Stephen; Hinton, Jay C. D.; Barer, Michael R.; Rajakumar, Kumar

    2007-01-01

    MobilomeFINDER (http://mml.sjtu.edu.cn/MobilomeFINDER) is an interactive online tool that facilitates bacterial genomic island or ‘mobile genome’ (mobilome) discovery; it integrates the ArrayOme and tRNAcc software packages. ArrayOme utilizes a microarray-derived comparative genomic hybridization input data set to generate ‘inferred contigs’ produced by merging adjacent genes classified as ‘present’. Collectively these ‘fragments’ represent a hypothetical ‘microarray-visualized genome (MVG)’. ArrayOme permits recognition of discordances between physical genome and MVG sizes, thereby enabling identification of strains rich in microarray-elusive novel genes. Individual tRNAcc tools facilitate automated identification of genomic islands by comparative analysis of the contents and contexts of tRNA sites and other integration hotspots in closely related sequenced genomes. Accessory tools facilitate design of hotspot-flanking primers for in silico and/or wet-science-based interrogation of cognate loci in unsequenced strains and analysis of islands for features suggestive of foreign origins; island-specific and genome-contextual features are tabulated and represented in schematic and graphical forms. To date we have used MobilomeFINDER to analyse several Enterobacteriaceae, Pseudomonas aeruginosa and Streptococcus suis genomes. MobilomeFINDER enables high-throughput island identification and characterization through increased exploitation of emerging sequence data and PCR-based profiling of unsequenced test strains; subsequent targeted yeast recombination-based capture permits full-length sequencing and detailed functional studies of novel genomic islands. PMID:17537813

  2. Analysis of cassava (Manihot esculenta) ESTs: A tool for the discovery of genes

    International Nuclear Information System (INIS)

    Zapata, Andres; Neme, Rafik; Sanabria, Carolina; Lopez, Camilo

    2011-01-01

    Cassava (Manihot esculenta) is the main source of calories for more than 1,000 millions of people around the world and has been consolidated as the fourth most important crop after rice, corn and wheat. Cassava is considered tolerant to abiotic and biotic stress conditions; nevertheless these characteristics are mainly present in non-commercial varieties. Genetic breeding strategies represent an alternative to introduce the desirable characteristics into commercial varieties. A fundamental step for accelerating the genetic breeding process in cassava requires the identification of genes associated to these characteristics. One rapid strategy for the identification of genes is the possibility to have a large collection of ESTs (expressed sequence tag). In this study, a complete analysis of cassava ESTs was done. The cassava ESTs represent 80,459 sequences which were assembled in a set of 29,231 unique genes (unigen), comprising 10,945 contigs and 18,286 singletones. These 29,231 unique genes represent about 80% of the genes of the cassava's genome. Between 5% and 10% of the unigenes of cassava not show similarity to any sequences present in the NCBI database and could be consider as cassava specific genes. a functional category was assigned to a group of sequences of the unigen set (29%) following the Gene Ontology Vocabulary. the molecular function component was the best represented with 43% of the sequences, followed by the biological process component (38%) and finally the cellular component with 19%. in the cassava ESTs collection, 3,709 microsatellites were identified and they could be used as molecular markers. this study represents an important contribution to the knowledge of the functional genomic structure of cassava and constitutes an important tool for the identification of genes associated to agricultural characteristics of interest that could be employed in cassava breeding programs.

  3. A smartphone-based ASR data collection tool for under-resourced languages

    CSIR Research Space (South Africa)

    De Vries, NJ

    2014-01-01

    Full Text Available collection strategies, highlighting some of the salient issues pertaining to collecting ASR data for under-resourced languages. We then describe the development of a smartphone-based data collection tool, Woefzela, which is designed to function in a...

  4. VITMO - A Powerful Tool to Improve Discovery in the Magnetospheric and Ionosphere-Thermosphere Domains

    Science.gov (United States)

    Schaefer, R. K.; Morrison, D.; Potter, M.; Stephens, G.; Barnes, R. J.; Talaat, E. R.; Sarris, T.

    2017-12-01

    With the advent of the NASA Magnetospheric Multiscale Mission and the Van Allen Probes we have space missions that probe the Earth's magnetosphere and radiation belts. These missions fly far from the Earth, in contrast to the larger number of near-Earth satellites, and both make in situ measurements. Energetic particles flow along magnetic field lines from these measurement locations down to the ionosphere/thermosphere region. Discovering other data that may be used with these satellites is a difficult and complicated process. To solve this problem, we have developed a series of light-weight web services that provide a new data search capability for the Virtual Ionosphere Thermosphere Mesosphere Observatory (VITMO). The services consist of a database of spacecraft ephemerides and instrument fields of view; an overlap calculator to find times when the fields of view of different instruments intersect; and a magnetic field line tracing service that maps in situ and ground-based measurements for a number of magnetic field models and geophysical conditions. These services run in real time when the user queries for data and allow the non-specialist user to select data that they were previously unable to locate, opening up analysis opportunities beyond the instrument teams and specialists and making it easier for future students who come into the field. Each service on its own provides a useful new capability for virtual observatories; operating together they provide a powerful new search tool. The ephemerides service was built using the Navigation and Ancillary Information Facility (NAIF) SPICE toolkit (http://naif.jpl.nasa.gov/naif/index.html), allowing it to be extended to support any Earth-orbiting satellite with the addition of the appropriate SPICE kernels. The overlap calculator uses techniques borrowed from computer graphics to identify overlapping measurements in space and time. The calculator will allow a user defined uncertainty
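
    The core of an overlap calculator of this kind is interval intersection in time (and analogously in space). The sketch below shows only the time-interval part, with a user-defined uncertainty used to pad each coverage window; the dates and padding are invented and this is not the VITMO implementation.

        # Sketch: find time windows where two instruments' coverage intervals intersect.
        from datetime import datetime, timedelta

        def overlaps(intervals_a, intervals_b, uncertainty_s=0):
            pad = timedelta(seconds=uncertainty_s)
            found = []
            for a_start, a_end in intervals_a:
                for b_start, b_end in intervals_b:
                    start = max(a_start - pad, b_start - pad)
                    end = min(a_end + pad, b_end + pad)
                    if start < end:
                        found.append((start, end))
            return found

        in_situ = [(datetime(2017, 1, 1, 10), datetime(2017, 1, 1, 12))]
        ground  = [(datetime(2017, 1, 1, 11, 30), datetime(2017, 1, 1, 13))]
        print(overlaps(in_situ, ground, uncertainty_s=600))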

  5. An Open-Source Web-Based Tool for Resource-Agnostic Interactive Translation Prediction

    Directory of Open Access Journals (Sweden)

    Daniel Torregrosa

    2014-09-01

    Full Text Available We present a web-based open-source tool for interactive translation prediction (ITP) and describe its underlying architecture. ITP systems assist human translators by making context-based computer-generated suggestions as they type. Most of the ITP systems in the literature are strongly coupled with a statistical machine translation system that is conveniently adapted to provide the suggestions. Our system, however, follows a resource-agnostic approach and suggestions are obtained from any unmodified black-box bilingual resource. This paper reviews our ITP method and describes the architecture of Forecat, a web tool, partly based on the recent technology of web components, that eases the use of our ITP approach in any web application requiring this kind of translation assistance. We also evaluate the performance of our method when using an unmodified Moses-based statistical machine translation system as the bilingual resource.
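
    The resource-agnostic idea can be sketched very simply: suggestions come from an unmodified bilingual resource (here a plain phrase dictionary standing in for the black box) and are filtered against the prefix the translator has already typed. This is a hedged illustration of the concept, not the Forecat code, and the phrases are invented.

        # Sketch: prefix-compatible suggestions from an unmodified bilingual resource.
        def suggest(source_segment, typed_prefix, bilingual_resource):
            """Return target-language suggestions consistent with the typed prefix."""
            candidates = []
            for src_phrase, tgt_phrases in bilingual_resource.items():
                if src_phrase in source_segment:
                    candidates.extend(tgt_phrases)
            return [c for c in candidates if c.startswith(typed_prefix) and c != typed_prefix]

        resource = {"free software": ["software libre", "programas libres"],
                    "user manual": ["manual de usuario"]}
        print(suggest("Download the free software user manual", "manual", resource))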

  6. Maximizing Academic Library Collections: Measuring Changes in Use Patterns Owing to EBSCO Discovery Service

    Science.gov (United States)

    Calvert, Kristin

    2015-01-01

    Despite the prevalence of academic libraries adopting web-scale discovery tools, few studies have quantified their effect on the use of library collections. This study measures the impact that EBSCO Discovery Service has had on use of library resources through circulation statistics, use of electronic resources, and interlibrary loan requests.…

  7. Developing a Data Discovery Tool for Interdisciplinary Science: Leveraging a Web-based Mapping Application and Geosemantic Searching

    Science.gov (United States)

    Albeke, S. E.; Perkins, D. G.; Ewers, S. L.; Ewers, B. E.; Holbrook, W. S.; Miller, S. N.

    2015-12-01

    The sharing of data and results is paramount for advancing scientific research. The Wyoming Center for Environmental Hydrology and Geophysics (WyCEHG) is a multidisciplinary group that is driving scientific breakthroughs to help manage water resources in the Western United States. WyCEHG is mandated by the National Science Foundation (NSF) to share its data. However, the infrastructure from which to share such diverse, complex and massive amounts of data did not exist within the University of Wyoming. We developed an innovative framework to meet the data organization, sharing, and discovery requirements of WyCEHG by integrating both open and closed source software, embedded metadata tags, semantic web technologies, and a web-mapping application. The infrastructure uses a Relational Database Management System as the foundation, providing a versatile platform to store, organize, and query myriad datasets, taking advantage of both structured and unstructured formats. Detailed metadata are fundamental to the utility of datasets. We tag data with Uniform Resource Identifiers (URIs) to specify concepts with formal descriptions (i.e. semantic ontologies), thus allowing users to search metadata based on the intended context rather than conventional keyword searches. Additionally, WyCEHG data are geographically referenced. Using the ArcGIS API for JavaScript, we developed a web mapping application leveraging database-linked spatial data services, providing a means to visualize and spatially query available data in an intuitive map environment. Using server-side scripting (PHP), the mapping application, in conjunction with semantic search modules, dynamically communicates with the database and file system, providing access to available datasets. Our approach provides a flexible, comprehensive infrastructure from which to store and serve WyCEHG's highly diverse research-based data. This framework has not only allowed WyCEHG to meet its data stewardship
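
    The difference between keyword search and the URI-tagged, concept-based search described above can be shown with a minimal sketch: records carry ontology URIs, and a query for a broad concept also matches records tagged with narrower concepts. The ontology, URIs and records below are hypothetical and stand in for whatever vocabulary a project actually uses.

        # Sketch: concept-based search over URI-tagged metadata (hypothetical ontology).
        ontology = {  # concept URI -> narrower concept URIs
            "http://example.org/concept/soil_moisture": [
                "http://example.org/concept/volumetric_water_content",
            ],
        }

        records = [
            {"title": "Hillslope VWC sensors 2014",
             "tags": ["http://example.org/concept/volumetric_water_content"]},
            {"title": "Snow depth survey",
             "tags": ["http://example.org/concept/snow_depth"]},
        ]

        def semantic_search(concept_uri):
            wanted = {concept_uri, *ontology.get(concept_uri, [])}
            return [r["title"] for r in records if wanted.intersection(r["tags"])]

        print(semantic_search("http://example.org/concept/soil_moisture"))

    A plain keyword search for "soil moisture" would miss the first record; the concept expansion finds it, which is the point of tagging with formal descriptions rather than free text.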

  8. Thoughtflow: Standards and Tools for Provenance Capture and Workflow Definition to Support Model-Informed Drug Discovery and Development.

    Science.gov (United States)

    Wilkins, J J; Chan, Pls; Chard, J; Smith, G; Smith, M K; Beer, M; Dunn, A; Flandorfer, C; Franklin, C; Gomeni, R; Harnisch, L; Kaye, R; Moodie, S; Sardu, M L; Wang, E; Watson, E; Wolstencroft, K; Cheung, Sya

    2017-05-01

    Pharmacometric analyses are complex and multifactorial. It is essential to check, track, and document the vast amounts of data and metadata that are generated during these analyses (and the relationships between them) in order to comply with regulations, support quality control, auditing, and reporting. It is, however, challenging, tedious, error-prone, and time-consuming, and diverts pharmacometricians from the more useful business of doing science. Automating this process would save time, reduce transcriptional errors, support the retention and transfer of knowledge, encourage good practice, and help ensure that pharmacometric analyses appropriately impact decisions. The ability to document, communicate, and reconstruct a complete pharmacometric analysis using an open standard would have considerable benefits. In this article, the Innovative Medicines Initiative (IMI) Drug Disease Model Resources (DDMoRe) consortium proposes a set of standards to facilitate the capture, storage, and reporting of knowledge (including assumptions and decisions) in the context of model-informed drug discovery and development (MID3), as well as to support reproducibility: "Thoughtflow." A prototype software implementation is provided. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.

  9. FAF-Drugs2: free ADME/tox filtering tool to assist drug discovery and chemical biology projects.

    Science.gov (United States)

    Lagorce, David; Sperandio, Olivier; Galons, Hervé; Miteva, Maria A; Villoutreix, Bruno O

    2008-09-24

    Drug discovery and chemical biology are exceedingly complex and demanding enterprises. In recent years there has been increasing awareness of the importance of predicting/optimizing the absorption, distribution, metabolism, excretion and toxicity (ADMET) properties of small chemical compounds along the search process rather than at the final stages. Fast methods for evaluating ADMET properties of small molecules often involve applying a set of simple empirical rules (educated guesses) and, as such, property profiling of compound collections can be performed in silico. Clearly, these rules cannot assess the full complexity of the human body but can provide valuable information and assist decision-making. This paper presents FAF-Drugs2, a free adaptable tool for ADMET filtering of electronic compound collections. FAF-Drugs2 is a command line utility program (written in Python) based on the open source chemistry toolkit OpenBabel, which performs various physicochemical calculations and identifies key functional groups as well as some toxic and unstable molecules/functional groups. In addition to filtered collections, FAF-Drugs2 can provide, via Gnuplot, several distribution diagrams of major physicochemical properties of the screened compound libraries. We have developed FAF-Drugs2 to facilitate compound collection preparation, prior to (or after) experimental screening or virtual screening computations. Users can select to apply various filtering thresholds and add rules as needed for a given project. As it stands, FAF-Drugs2 implements numerous filtering rules (23 physicochemical rules and 204 substructure searching rules) that can be easily tuned.
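
    Rule-based physicochemical filtering of the kind described can be sketched in a few lines once properties have been computed: each rule is a threshold test, and a compound is rejected if any rule fails. The thresholds below are illustrative Lipinski-style values, not the FAF-Drugs2 rule set (which also includes substructure searches), and the compounds are invented.

        # Sketch: threshold-based ADMET-style filtering on precomputed properties.
        RULES = {
            "molecular_weight": lambda v: v <= 500,
            "logP":             lambda v: -2 <= v <= 5,
            "hbd":              lambda v: v <= 5,    # hydrogen-bond donors
            "hba":              lambda v: v <= 10,   # hydrogen-bond acceptors
        }

        def failed_rules(compound):
            return [name for name, rule in RULES.items() if not rule(compound[name])]

        library = [
            {"id": "cpd-1", "molecular_weight": 342.4, "logP": 2.1, "hbd": 2, "hba": 5},
            {"id": "cpd-2", "molecular_weight": 712.9, "logP": 6.3, "hbd": 6, "hba": 12},
        ]

        for cpd in library:
            failures = failed_rules(cpd)
            print(cpd["id"], "accepted" if not failures else "rejected (" + ", ".join(failures) + ")")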

  10. Validating the WHO maternal near miss tool: comparing high- and low-resource settings.

    Science.gov (United States)

    Witteveen, Tom; Bezstarosti, Hans; de Koning, Ilona; Nelissen, Ellen; Bloemenkamp, Kitty W; van Roosmalen, Jos; van den Akker, Thomas

    2017-06-19

    WHO proposed the WHO Maternal Near Miss (MNM) tool, classifying women according to several (potentially) life-threatening conditions, to monitor and improve quality of obstetric care. The objective of this study is to analyse merged data from one high- and two low-resource settings where this tool was applied, and to test whether the tool may be suitable for comparing severe maternal outcome (SMO) between these settings. Using three cohort studies that included SMO cases during two-year time frames in the Netherlands, Tanzania and Malawi, we reassessed all SMO cases (as defined by the original studies) with the WHO MNM tool (five disease-, four intervention- and seven organ dysfunction-based criteria). Main outcome measures were prevalence of MNM criteria and case fatality rates (CFR). A total of 3172 women were studied; 2538 (80.0%) from the Netherlands, 248 (7.8%) from Tanzania and 386 (12.2%) from Malawi. Total SMO detection was 2767 (87.2%) for disease-based criteria, 2504 (78.9%) for intervention-based criteria and 1211 (38.2%) for organ dysfunction-based criteria. Including every woman who received ≥1 unit of blood in low-resource settings as life-threatening, as defined by organ dysfunction criteria, led to more equally distributed populations. In one third of all Dutch and Malawian maternal death cases, organ dysfunction criteria could not be identified from medical records. Applying solely organ dysfunction-based criteria may lead to underreporting of SMO. Therefore, a tool that defines MNM only upon establishing organ failure is of limited use for comparing settings with varying resources. In low-resource settings, lowering the threshold of transfused units of blood leads to a higher detection rate of MNM. We recommend refined disease-based criteria, accompanied by a limited set of intervention- and organ dysfunction-based criteria, to set a measure of severity.
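
    The comparison made in the study amounts to applying alternative criteria sets to the same case records and counting detections under each. The sketch below illustrates that logic only; the criteria are drastically simplified and the two example cases are invented, so it should not be read as the WHO MNM definitions.

        # Sketch: apply alternative (simplified, hypothetical) near-miss criteria sets to cases.
        CRITERIA = {
            "disease_based":      lambda c: bool(c.get("eclampsia") or c.get("severe_pph")),
            "intervention_based": lambda c: bool(c.get("hysterectomy")
                                                 or c.get("blood_units", 0) >= 1),  # lowered threshold
            "organ_dysfunction":  lambda c: bool(c.get("organ_failure")),
        }

        cases = [
            {"id": 1, "severe_pph": True, "blood_units": 2, "organ_failure": False},
            {"id": 2, "eclampsia": True,  "blood_units": 0, "organ_failure": True},
        ]

        for name, rule in CRITERIA.items():
            detected = sum(rule(c) for c in cases)
            print(name, ":", detected, "of", len(cases), "cases detected")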

  11. Models, methods and software tools to evaluate the quality of informational and educational resources

    International Nuclear Information System (INIS)

    Gavrilov, S.I.

    2011-01-01

    The paper studies modern methods and tools for evaluating the quality of data systems, which make it possible to determine the specific features of informational and educational resources (IER). The author has developed a model of IER quality management at all stages of the life cycle and an integrated multi-level hierarchical system of IER quality assessment, taking into account both information properties and targeted resource assignment. The author presents a mathematical and algorithmic justification for solving the problem of IER quality management, and offers a data system to assess IER quality.

  12. A procedure to improve the information flow in the assessment of discoveries of oil and gas resources in the Brazilian context

    Energy Technology Data Exchange (ETDEWEB)

    Rosa, Henrique; Suslick, Saul B.; Sousa, Sergio H.G. de [Universidade Estadual de Campinas, SP (Brazil). Inst. of Geosciences; Castro, Jonas Q. [ANP - Brazilian National Petroleum Agency, Rio de Janeiro, RJ (Brazil)

    2004-07-01

    This paper focuses on the elaboration of a standardization model for the existing flow of information between the National Petroleum Agency (ANP) and the concessionaire companies in the event of the discovery of any potentially commercial hydrocarbon resources inside their concession areas. The method proposed by Rosa (2003) included the analysis of a small sample of Oil and Gas Discovery Assessment Plans (PADs), elaborated by companies that operate in exploratory blocks in Brazil under the regulatory context introduced by the Petroleum Law (Law 9478 of August 6, 1997). The analysis of these documents made it possible to identify and target the problems originating from the lack of standardization. The results obtained facilitated the development of a model that helps the creation process of Oil and Gas Discovery Assessment Plans. It turns out that the standardization procedures suggested provide considerable advantages while speeding up several technical and regulatory steps. A software tool called 'ePADs' was developed to consolidate the automation of the several steps in the model for the standardization of the Oil and Gas Discovery Assessment Plans. A preliminary version has been tested with several different types of discoveries, indicating a good performance by complying with all regulatory aspects and operational requirements. (author)

  13. A Case Study Optimizing Human Resources in Rwanda's First Dental School: Three Innovative Management Tools.

    Science.gov (United States)

    Hackley, Donna M; Mumena, Chrispinus H; Gatarayiha, Agnes; Cancedda, Corrado; Barrow, Jane R

    2018-06-01

    Harvard School of Dental Medicine, University of Maryland School of Dentistry, and the University of Rwanda (UR) are collaborating to create Rwanda's first School of Dentistry as part of the Human Resources for Health (HRH) Rwanda initiative that aims to strengthen the health care system of Rwanda. The HRH oral health team developed three management tools to measure progress in systems-strengthening efforts: 1) the road map is an operations plan for the entire dental school and facilitates delivery of the curriculum and management of human and material resources; 2) each HRH U.S. faculty member develops a work plan with targeted deliverables for his or her rotation, which is facilitated with biweekly flash reports that measure progress and keep the faculty member focused on his or her specific deliverables; and 3) the redesigned HRH twinning model, changed from twinning of an HRH faculty member with a single Rwandan faculty member to twinning with multiple Rwandan faculty members based on shared academic interests and goals, has improved efficiency, heightened engagement of the UR dental faculty, and increased the impact of HRH U.S. faculty members. These new tools enable the team to measure its progress toward the collaborative's goals and understand the successes and challenges in moving toward the planned targets. The tools have been valuable instruments in fostering discussion around priorities and deployment of resources as well as in developing strong relationships, enabling two-way exchange of knowledge, and promoting sustainability.

  14. A qualitative study of shopper experiences at an urban farmers' market using the Stanford Healthy Neighborhood Discovery Tool.

    Science.gov (United States)

    Buman, Matthew P; Bertmann, Farryl; Hekler, Eric B; Winter, Sandra J; Sheats, Jylana L; King, Abby C; Wharton, Christopher M

    2015-04-01

    To understand factors which enhance or detract from farmers' market shopper experiences to inform targeted interventions to increase farmers' market utilization, community-building and social marketing strategies. A consumer-intercept study using the Stanford Healthy Neighborhood Discovery Tool to capture real-time perceptions via photographs and audio narratives. An urban farmers' market in a large metropolitan US city. Thirty-eight farmers' market shoppers, who recorded 748 unique coded elements through community-based participatory research methods. Shoppers were primarily women (65 %), 18-35 years of age (54 %), non-Hispanic (81 %) and white (73 %). Shoppers captured 291 photographs (7·9 (sd 6·3) per shopper), 171 audio narratives (5·3 (sd 4·7) per shopper), and ninety-one linked photograph + audio narrative pairs (3·8 (sd 2·8) per shopper). A systematic content analysis of the photographs and audio narratives was conducted by eight independent coders. In total, nine common elements emerged from the data that enhanced the farmers' market experience (61·8 %), detracted from the experience (5·7 %) or were neutral (32·4 %). The most frequently noted elements were freshness/abundance of produce (23·3 %), product presentation (12·8 %), social interactions (12·4 %) and farmers' market attractions (e.g. live entertainment, dining offerings; 10·3 %). While produce quality (i.e. freshness/abundance) was of primary importance, other contextual factors also appeared important to the shoppers' experiences. These results may inform social marketing strategies to increase farmers' market utilization and community-building efforts that target market venues.

  15. FAF-Drugs2: Free ADME/tox filtering tool to assist drug discovery and chemical biology projects

    Directory of Open Access Journals (Sweden)

    Miteva Maria A

    2008-09-01

    Full Text Available Abstract Background: Drug discovery and chemical biology are exceedingly complex and demanding enterprises. In recent years there has been increasing awareness of the importance of predicting/optimizing the absorption, distribution, metabolism, excretion and toxicity (ADMET) properties of small chemical compounds along the search process rather than at the final stages. Fast methods for evaluating ADMET properties of small molecules often involve applying a set of simple empirical rules (educated guesses) and, as such, property profiling of compound collections can be performed in silico. Clearly, these rules cannot assess the full complexity of the human body but can provide valuable information and assist decision-making. Results: This paper presents FAF-Drugs2, a free adaptable tool for ADMET filtering of electronic compound collections. FAF-Drugs2 is a command line utility program (written in Python) based on the open source chemistry toolkit OpenBabel, which performs various physicochemical calculations and identifies key functional groups as well as some toxic and unstable molecules/functional groups. In addition to filtered collections, FAF-Drugs2 can provide, via Gnuplot, several distribution diagrams of major physicochemical properties of the screened compound libraries. Conclusion: We have developed FAF-Drugs2 to facilitate compound collection preparation, prior to (or after) experimental screening or virtual screening computations. Users can select to apply various filtering thresholds and add rules as needed for a given project. As it stands, FAF-Drugs2 implements numerous filtering rules (23 physicochemical rules and 204 substructure searching rules) that can be easily tuned.

  16. SU-E-T-191: PITSTOP: Process Improvement Techniques, Software Tools, and Operating Principles for a Quality Initiative Discovery Framework.

    Science.gov (United States)

    Siochi, R

    2012-06-01

    To develop a quality initiative discovery framework using process improvement techniques, software tools and operating principles. Process deviations are entered into a radiotherapy incident reporting database. Supervisors use an in-house Event Analysis System (EASy) to discuss incidents with staff. Major incidents are analyzed with an in-house Fault Tree Analysis (FTA). A meta-analysis is performed using association analysis, text mining, keyword clustering, and differential frequency analysis. A key operating principle encourages the creation of forcing functions via rapid application development. 504 events have been logged this past year. The results of the keyword analysis indicate that the root cause for the top-ranked keywords was miscommunication. This was also the root cause found from association analysis, where 24% of the time that an event involved a physician it also involved a nurse. Differential frequency analysis revealed that sharp peaks at week 27 were followed by 3 major incidents, two of which were dose related. The peak was largely due to the front desk, which caused distractions in other areas. The analysis led to many PI projects, but there is still a major systematic issue with the use of forms. The solution we identified is to implement Smart Forms to perform error checking and interlocking. Our first initiative replaced our daily QA checklist with a form that uses custom validation routines, preventing therapists from proceeding with treatments until out-of-tolerance conditions are corrected. PITSTOP has increased the number of quality initiatives in our department, and we have discovered or confirmed common underlying causes of a variety of seemingly unrelated errors. It has motivated the replacement of all forms with smart forms. © 2012 American Association of Physicists in Medicine.
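
    The association analysis quoted above (an event involving a physician also involving a nurse 24% of the time) is a simple conditional co-occurrence over tagged event records. The sketch below shows that calculation on invented records; it is illustrative of the technique, not the in-house software.

        # Sketch: conditional co-occurrence of roles in incident records (hypothetical data).
        events = [
            {"physician", "nurse"},
            {"physician"},
            {"therapist", "front_desk"},
            {"physician", "nurse", "therapist"},
        ]

        def conditional_involvement(events, given, target):
            """Fraction of events involving `given` that also involve `target`."""
            with_given = [e for e in events if given in e]
            if not with_given:
                return 0.0
            return sum(target in e for e in with_given) / len(with_given)

        print(conditional_involvement(events, "physician", "nurse"))  # 2/3 for this toy data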

  17. Blood-brain barrier in vitro models as tools in drug discovery: assessment of the transport ranking of antihistaminic drugs.

    Science.gov (United States)

    Neuhaus, W; Mandikova, J; Pawlowitsch, R; Linz, B; Bennani-Baiti, B; Lauer, R; Lachmann, B; Noe, C R

    2012-05-01

    In the course of our validation program testing blood-brain barrier (BBB) in vitro models for their usability as tools in drug discovery, we evaluated whether an established Transwell model based on the porcine cell line PBMEC/C1-2 was able to differentiate between the transport properties of first- and second-generation antihistaminic drugs. First-generation antihistamines can permeate the BBB and act in the central nervous system (CNS), whereas entry to the CNS of second-generation antihistamines is restricted by efflux pumps such as P-glycoprotein (P-gP) located in brain endothelial cells. P-gP functionality of PBMEC/C1-2 cells grown on Transwell filter inserts was proven by transport studies with the P-gP substrate rhodamine 123 and the P-gP blocker verapamil. Subsequent drug transport studies with the first-generation antihistamines promethazine, diphenhydramine and pheniramine and the second-generation antihistamines astemizole, cetirizine, fexofenadine and loratadine were accomplished in single-substance as well as in group studies. Results were normalised to diazepam, an internal standard for the transcellular transport route. Moreover, effects after addition of the P-gP inhibitor verapamil were investigated. The first-generation antihistamine pheniramine permeated fastest, followed by diphenhydramine, diazepam, promethazine and the second-generation antihistaminic drugs cetirizine, fexofenadine, astemizole and loratadine, reflecting the BBB in vivo permeability ranking well. Verapamil increased the transport rates of all second-generation antihistamines, which suggested involvement of P-gP during their permeation across the BBB model. The ranking after addition of verapamil was significantly changed; only fexofenadine and cetirizine penetrated more slowly than the internal standard diazepam in the presence of verapamil. In summary, permeability data showed that the BBB model based on the porcine cell line PBMEC/C1-2 was able to reflect the BBB in vivo situation for the transport of

  18. Intra-annual wave resource characterization for energy exploitation: A new decision-aid tool

    International Nuclear Information System (INIS)

    Carballo, R.; Sánchez, M.; Ramos, V.; Fraguela, J.A.; Iglesias, G.

    2015-01-01

    Highlights: • A decision-aid tool is developed for computing the monthly performance of WECs. • It allows the generation of high-resolution monthly characterization matrices. • The decision-aid tool is applied to the Death Coast (N Spain). • The monthly matrices can be obtained at any coastal location within the Death Coast. • The tool is applied to a coastal location of a proposed wave farm. - Abstract: The wave energy resource is usually characterized by significant variability throughout the year. In estimating the power performance of a Wave Energy Converter (WEC) it is fundamental to take this variability into account; indeed, an estimate based on mean annual values may well result in wrong decision-making. In this work, a novel decision-aid tool, iWEDGE (intra-annual Wave Energy Diagram GEnerator), is developed and applied to a coastal region of interest, the Death Coast (Spain), one of the regions in Europe with the largest wave resource. Following a comprehensive procedure, and based on deep-water wave data and high-resolution numerical modelling, this tool provides the monthly high-resolution characterization matrices (or energy diagrams) for any location of interest. In other words, the information required for the accurate computation of the intra-annual performance of any WEC at any location within the region covered is made available. Finally, an application of iWEDGE to the site of a proposed wave farm is presented. The results obtained highlight the importance of the decision-aid tool herein provided for wave energy exploitation
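
    A monthly characterization matrix of the kind described is typically combined with a converter's power matrix by multiplying, bin by bin, the hours of occurrence of each sea state by the device output in that sea state and summing. The sketch below shows that arithmetic only; the bin values, hours and power figures are invented for illustration and do not come from the paper.

        # Sketch: monthly WEC energy from an occurrence matrix and a power matrix.
        import numpy as np

        # Rows: significant wave height bins; columns: energy period bins.
        occurrence_hours = np.array([[120.0,  60.0],    # hours per month in each sea state
                                     [ 40.0,  15.0]])
        power_matrix_kw  = np.array([[ 50.0,  80.0],    # WEC output (kW) in each sea state
                                     [150.0, 220.0]])

        monthly_energy_kwh = float((occurrence_hours * power_matrix_kw).sum())
        print("Estimated monthly production:", round(monthly_energy_kwh), "kWh")

    Repeating this per month, rather than once with an annual-mean matrix, is what captures the intra-annual variability the abstract emphasises.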

  19. A Tool and Process that Facilitate Community Capacity Building and Social Learning for Natural Resource Management

    Directory of Open Access Journals (Sweden)

    Christopher M. Raymond

    2013-03-01

    Full Text Available This study presents a self-assessment tool and process that facilitate community capacity building and social learning for natural resource management. The tool and process provide opportunities for rural landholders and project teams both to self-assess their capacity to plan and deliver natural resource management (NRM) programs and to reflect on their capacities relative to other organizations and institutions that operate in their region. We first outline the tool and process and then present a critical review of the pilot in the South Australian Arid Lands NRM region, South Australia. Results indicate that participants representing local, organizational, and institutional tiers of government were able to arrive at a group consensus position on the strength, importance, and confidence of a variety of capacities for NRM, categorized broadly as human, social, physical, and financial. During the process, participants learned a great deal about their current capacity as well as their capacity needs. Broad conclusions are discussed with reference to the iterative process for assessing and reflecting on community capacity.

  20. Tools and measures for stimulation the efficient energy consumption. Integrated resource planning in Romania

    International Nuclear Information System (INIS)

    Scripcariu, Daniela; Scripcariu, Mircea; Leca, Aureliu

    1996-01-01

    Integrated resource planning is based on analysing energy generation and energy consumption as a whole. Thus, increasing energy efficiency appears to be the cheapest, most available and most cost-effective energy resource. In order to stimulate increased efficiency of energy consumption, additional tools and measures are necessary besides economic efficiency criteria for selecting technical solutions. The paper presents the main tools and measures needed to foster efficient energy consumption. Actions meant to stimulate DSM (Demand-Side Management) implementation in Romania are proposed. The paper contains 5 sections. In the introduction, the main aspects of DSM are considered, namely where the programs are implemented, who is responsible, what the objectives are and, finally, how DSM programs are implemented. The following tools for managing energy use are examined: energy prices, regulation in the field of energy efficiency, standards and norms, energy labelling of products and energy education. Among the measures for managing energy use, the paper takes into consideration the institutions responsible for DSM, for instance the Romanian Agency for Energy Conservation (ARCE), decentralization of decision-making, program approaches and the financing of actions aimed at improving energy efficiency. Finally, the paper analyses the criteria for choosing adequate solutions for improving energy efficiency

  1. Retrieval of Legal Information Through Discovery Layers: A Case Study Related to Indian Law Libraries

    Directory of Open Access Journals (Sweden)

    Kushwah, Shivpal Singh

    2016-09-01

    Full Text Available Purpose. The purpose of this paper is to analyze and evaluate discovery layer search tools for retrieval of legal information in Indian law libraries. This paper covers current practices in legal information retrieval with special reference to Indian academic law libraries, and analyses its importance in the domain of law. Design/Methodology/Approach. A web survey and observational study method are used to collect the data. Data related to the discovery tools were collected using email, and further discussions were held with the discovery layer/tool/product developers and their representatives. Findings. Results show that most Indian law libraries subscribe to bundles of legal information resources such as Hein Online, JSTOR, LexisNexis Academic, Manupatra, Westlaw India, SCC web, AIR Online (CD-ROM), and so on. International legal and academic resources are compatible with discovery tools because they support various standards related to online publishing and dissemination such as OAI-PMH, OpenURL, MARC21, and Z39.50, but Indian legal resources such as Manupatra, AIR, and SCC are not compatible with the discovery layers. The central index is one of the important components in a discovery search interface, and discovery layer services/tools could be useful for Indian law libraries as well if they could include multiple legal and academic resources in their central index. But present practices and observations reveal that discovery layers do not provide the facility to cover legal information resources. Therefore, in their present form, discovery tools are an incomplete, partial solution for Indian libraries because the Indian legal resources available in law libraries are not covered. Originality/Value. Very limited research or published literature is available in the area of discovery layers and their compatibility with legal information resources.

  2. Tools for Engaging Scientists in Education and Public Outreach: Resources from NASA's Science Mission Directorate Forums

    Science.gov (United States)

    Buxner, S.; Grier, J.; Meinke, B. K.; Gross, N. A.; Woroner, M.

    2014-12-01

    The NASA Science Education and Public Outreach (E/PO) Forums support the NASA Science Mission Directorate (SMD) and its E/PO community by enhancing the coherency and efficiency of SMD-funded E/PO programs. The Forums foster collaboration and partnerships between scientists with content expertise and educators with pedagogy expertise. We will present tools and resources that support scientists' engagement in E/PO efforts. Scientists can get connected to educators and find support materials and links to resources to support their E/PO work through the online SMD E/PO community workspace (http://smdepo.org). The site includes resources for scientists interested in E/PO, including one-page guides about "How to Get Involved" and "How to Increase Your Impact," as well as the NASA SMD Scientist Speaker's Bureau to connect scientists to audiences across the country. Additionally, there is a set of online clearinghouses that provide ready-made lessons and activities for use by scientists and educators: NASA Wavelength (http://nasawavelength.org/) and EarthSpace (http://www.lpi.usra.edu/earthspace/). The NASA Forums create and partner with organizations to provide resources specifically for undergraduate science instructors, including slide sets for Earth and Space Science classes on current topics in astronomy and planetary science. The Forums also provide professional development opportunities at professional science conferences each year, including AGU, LPSC, AAS, and DPS, to support higher education faculty who are teaching undergraduate courses. These offerings include best practices in instruction, resources for teaching planetary science and astronomy topics, and other special topics such as working with diverse students and the use of social media in the classroom. We are continually soliciting ways that we can better support scientists' efforts in effectively engaging in E/PO. Please contact Sanlyn Buxner (buxner@psi.edu) or Jennifer Grier (jgrier@psi.edu) to

  3. An online knowledge resource and questionnaires as a continuing pharmacy education tool to document reflective learning.

    Science.gov (United States)

    Budzinski, Jason W; Farrell, Barbara; Pluye, Pierre; Grad, Roland M; Repchinsky, Carol; Jovaisas, Barbara; Johnson-Lafleur, Janique

    2012-06-18

    To assess the use of an electronic knowledge resource to document continuing education activities and reveal educational needs of practicing pharmacists. Over a 38-week period, 67 e-mails were sent to 6,500 Canadian Pharmacists Association (CPhA) members. Each e-mail contained a link to an e-Therapeutics+ Highlight, a factual excerpt of selected content from an online drug and therapeutic knowledge resource. Participants were then prompted to complete a pop-up questionnaire. Members completed 4,140 questionnaires. Participants attributed the information they learned in the Highlights to practice improvements (50.4%), learning (57.0%), and motivation to learn more (57.4%). Reading Highlight excerpts and completing Web-based questionnaires is an effective method of continuing education that could be easily documented and tracked, making it an effective tool for use with e-portfolios.

  4. Developing a planning tool for South African prosecution resources: challenges and approach

    Directory of Open Access Journals (Sweden)

    R Koen

    2012-12-01

    Full Text Available In every country the prosecution of criminal cases is governed by different laws, policies and processes. In South Africa, the National Prosecuting Authority (NPA) has the responsibility of planning and managing all prosecution functions. The NPA has certain unique characteristics that make it different from other similar organisations internationally. The development of a planning tool that the NPA could use to plan its future resource requirements over the short to medium term required extensive modelling, and its final form included features which, to the best knowledge of the development team, make it unique both locally and internationally. Model design was largely influenced by the challenges emanating from the special requirements and context of the problem. Resources were not forecast directly, but were derived with the help of simulation models that traced docket flows through various resource-driven processes. Docket flows were derived as a proportion of reported crimes, and these were forecast using a multivariate statistical model which could take into account explanatory variables as well as the correlations between the patterns observed within different crime categories. The simulation consisted of a number of smaller models which could be run independently, rather than one overarching model. This approach was found to make the best use of available data, and compensated for the fact that certain parameters linking different courts and court types were not available. In addition, it simplified scenario testing and sensitivity analysis. The various components of the planning tool, including inputs and outputs of the simulation models and the linkages between the forecasts and the simulation models, were implemented in a set of spreadsheets. By using spreadsheets as a common user interface, the planning tool can be used by prosecutors and managers who may not have extensive mathematical or modelling experience.
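
    The layered logic described above (crime forecasts feed docket volumes, which feed resource requirements) can be illustrated with a deliberately simple sketch. All rates, workloads and crime categories below are invented for illustration; the actual tool uses multivariate forecasts and process simulation rather than the single multiplication shown here.

        # Sketch: crimes -> dockets -> prosecutor requirement (hypothetical parameters).
        crime_forecast = {"assault": 120_000, "fraud": 30_000}   # forecast cases per year
        prosecution_rate = {"assault": 0.45, "fraud": 0.30}       # share of crimes becoming dockets
        hours_per_docket = {"assault": 6.0, "fraud": 20.0}        # prosecutor effort per docket
        available_hours_per_prosecutor = 1_600                    # per year

        dockets = {c: n * prosecution_rate[c] for c, n in crime_forecast.items()}
        total_hours = sum(dockets[c] * hours_per_docket[c] for c in dockets)
        prosecutors_needed = total_hours / available_hours_per_prosecutor
        print("Estimated prosecutors required:", round(prosecutors_needed))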

  5. KML (Keyhole Markup Language) : a key tool in the education of geo-resources.

    Science.gov (United States)

    Veltz, Isabelle

    2015-04-01

    Although going into the field with pupils remains the best way to understand the geological structure of a deposit, it is very difficult to bring them to a mining extraction site and it is impossible to explore whole regions in search of these resources. For those reasons, KML (with the Google Earth interface) is a very complete tool for teaching geosciences. Simple and intuitive, its handling is quickly mastered by pupils, and it also allows teachers to validate skills for IT certificates. It allows the use of KML files stemming from online repositories, from the teacher's own productions or from pupils' work. These tools offer a global 3D approach as well as geolocation-based access to any type of geological data. The resource on which I built this KML is taught in the curriculum of all three years of French high school: methane hydrate. This non-conventional hydrocarbon molecule sits on the vague border between mineral and organic matter (as phosphate deposits do). For over ten years it has been the subject of a race to exploit gas hydrate fields in order to meet world demand. Methane hydrate fields are very useful and interesting for studying the three major themes of geological resources: exploration, exploitation and risks, especially for the environment and populations. The KML which I propose allows pupils to put themselves in the shoes of a geologist searching for deposits or of the technician who is going to extract the resource. It also allows them to evaluate the risks connected to the effect of tectonic activity or climate change on the natural or catastrophic release of methane and its role in the increase of the greenhouse effect. This KML, together with a set of pedagogic activities, can be downloaded directly by teachers at http://eduterre.ens-lyon.fr/eduterre-usages/actualites/methane/.
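
    For readers unfamiliar with the format, a KML layer of the kind pupils might inspect or generate is just a small XML file. The sketch below writes a minimal, hand-rolled placemark from Python; the site name and coordinates are purely illustrative and unrelated to the author's actual teaching file.

        # Sketch: write a minimal KML placemark (hypothetical site and coordinates).
        kml = """<?xml version="1.0" encoding="UTF-8"?>
        <kml xmlns="http://www.opengis.net/kml/2.2">
          <Placemark>
            <name>Hypothetical methane hydrate site</name>
            <description>Example placemark for a geo-resources lesson.</description>
            <Point>
              <coordinates>-91.0,27.5,0</coordinates>
            </Point>
          </Placemark>
        </kml>
        """

        with open("methane_hydrate_example.kml", "w", encoding="utf-8") as f:
            f.write(kml)

    Opening the resulting file in Google Earth places a clickable marker at the given longitude and latitude, which is the basic building block of the lesson layers described above.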

  6. A risk assessment tool applied to the study of shale gas resources

    Energy Technology Data Exchange (ETDEWEB)

    Veiguela, Miguel [Mining, Energy and Materials Engineering School, University of Oviedo (Spain); Hurtado, Antonio; Eguilior, Sonsoles; Recreo, Fernando [Environment Department, CIEMAT, Madrid (Spain); Roqueñi, Nieves [Mining, Energy and Materials Engineering School, University of Oviedo (Spain); Loredo, Jorge, E-mail: jloredo@uniovi.es [Mining, Energy and Materials Engineering School, University of Oviedo (Spain)

    2016-11-15

    The implementation of a risk assessment tool with the capacity to evaluate the risks for health, safety and the environment (HSE) arising from the extraction of non-conventional fossil fuel resources by the hydraulic fracturing (fracking) technique can be useful for boosting the development and progress of the technology and for winning public trust and acceptance of it. At the early project stages, the lack of data related to the selection of non-conventional gas deposits makes it difficult to use existing approaches to risk assessment of fluids injected into geologic formations. The qualitative risk assessment tool developed in this work is based on the approach that shale gas exploitation risk depends on both the geologic site and the technological aspects. It follows Oldenburg's 'Screening and Ranking Framework' (SRF), developed to evaluate potential geologic carbon dioxide (CO2) storage sites. These two global characteristics, (1) characteristics centered on the natural aspects of the site and (2) characteristics centered on the technological aspects of the project, have been evaluated through user input of Property values, which define Attributes, which in turn define the Characteristics. In order to carry out an individual evaluation of each of the characteristics and the elements of the model, the tool has been implemented in a spreadsheet. The proposed model has been applied to a site with potential for the exploitation of shale gas in Asturias (northwestern Spain) with three different technological options to test the approach. - Highlights: • The proposed methodology is a risk assessment tool useful for shale gas projects. • The tool is addressed to the early stages of decision-making processes. • The risk assessment of a site is made through a qualitative estimation. • Different weights are assigned to each specific natural and technological property. • The uncertainty associated with current knowledge is considered.
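
    The Property-to-Attribute-to-Characteristic roll-up with user-assigned weights can be sketched as a weighted average computed per characteristic. The structure below mirrors that idea only; the attribute names, scores and weights are hypothetical and do not reproduce the SRF-based scheme of the paper.

        # Sketch: weighted qualitative roll-up of attributes into characteristic scores.
        site = {
            "natural": {                       # characteristic
                "depth_to_target":  {"score": 0.8, "weight": 0.6},   # attributes
                "fault_density":    {"score": 0.4, "weight": 0.4},
            },
            "technological": {
                "well_design":      {"score": 0.7, "weight": 0.5},
                "water_management": {"score": 0.5, "weight": 0.5},
            },
        }

        def characteristic_score(attributes):
            total_weight = sum(a["weight"] for a in attributes.values())
            return sum(a["score"] * a["weight"] for a in attributes.values()) / total_weight

        for name, attributes in site.items():
            print(name, ":", round(characteristic_score(attributes), 2))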

  7. A risk assessment tool applied to the study of shale gas resources

    International Nuclear Information System (INIS)

    Veiguela, Miguel; Hurtado, Antonio; Eguilior, Sonsoles; Recreo, Fernando; Roqueñi, Nieves; Loredo, Jorge

    2016-01-01

    The implementation of a risk assessment tool with the capacity to evaluate the risks for health, safety and the environment (HSE) arising from the extraction of non-conventional fossil fuel resources by the hydraulic fracturing (fracking) technique can be useful for boosting the development and progress of the technology and for winning public trust and acceptance of it. At the early project stages, the lack of data related to the selection of non-conventional gas deposits makes it difficult to use existing approaches to risk assessment of fluids injected into geologic formations. The qualitative risk assessment tool developed in this work is based on the approach that shale gas exploitation risk depends on both the geologic site and the technological aspects. It follows Oldenburg's 'Screening and Ranking Framework' (SRF), developed to evaluate potential geologic carbon dioxide (CO_2) storage sites. These two global characteristics, (1) characteristics centered on the natural aspects of the site and (2) characteristics centered on the technological aspects of the project, have been evaluated through user input of Property values, which define Attributes, which in turn define the Characteristics. In order to carry out an individual evaluation of each of the characteristics and the elements of the model, the tool has been implemented in a spreadsheet. The proposed model has been applied to a site with potential for the exploitation of shale gas in Asturias (northwestern Spain) with three different technological options to test the approach. - Highlights: • The proposed methodology is a risk assessment tool useful for shale gas projects. • The tool is addressed to the early stages of decision-making processes. • The risk assessment of a site is made through a qualitative estimation. • Different weights are assigned to each specific natural and technological property. • The uncertainty associated with current knowledge is considered.

  8. Applications and methods utilizing the Simple Semantic Web Architecture and Protocol (SSWAP for bioinformatics resource discovery and disparate data and service integration

    Directory of Open Access Journals (Sweden)

    Nelson Rex T

    2010-06-01

    Full Text Available Abstract Background: Scientific data integration and computational service discovery are challenges for the bioinformatic community. This process is made more difficult by the separate and independent construction of biological databases, which makes the exchange of data between information resources difficult and labor intensive. A recently described semantic web protocol, the Simple Semantic Web Architecture and Protocol (SSWAP; pronounced "swap"), offers the ability to describe data and services in a semantically meaningful way. We report how three major information resources (Gramene, SoyBase and the Legume Information System [LIS]) used SSWAP to semantically describe selected data and web services. Methods: We selected high-priority Quantitative Trait Locus (QTL), genomic mapping, trait, phenotypic, and sequence data and associated services such as BLAST for publication, data retrieval, and service invocation via semantic web services. Data and services were mapped to concepts and categories as implemented in legacy and de novo community ontologies. We used SSWAP to express these offerings in OWL Web Ontology Language (OWL), Resource Description Framework (RDF) and eXtensible Markup Language (XML) documents, which are appropriate for their semantic discovery and retrieval. We implemented SSWAP services to respond to web queries and return data. These services are registered with the SSWAP Discovery Server and are available for semantic discovery at http://sswap.info. Results: A total of ten services delivering QTL information from Gramene were created. From SoyBase, we created six services delivering information about soybean QTLs, and seven services delivering genetic locus information. For LIS we constructed three services, two of which allow the retrieval of DNA and RNA FASTA sequences, with the third service providing nucleic acid sequence comparison capability (BLAST). Conclusions: The need for semantic integration technologies has preceded
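
    To give a feel for what a semantic (RDF/OWL-style) service description looks like, the sketch below builds a tiny RDF graph with the rdflib Python library and serializes it as Turtle. The namespace, class and property names are hypothetical placeholders and are not SSWAP's actual vocabulary; in recent rdflib versions serialize() returns a string.

        # Sketch: a minimal RDF description of a data service (hypothetical vocabulary).
        from rdflib import Graph, Literal, Namespace
        from rdflib.namespace import RDF, RDFS

        EX = Namespace("http://example.org/demo-services#")

        g = Graph()
        g.bind("ex", EX)
        g.add((EX.qtlLookup, RDF.type, EX.SemanticWebService))
        g.add((EX.qtlLookup, RDFS.label, Literal("QTL lookup service")))
        g.add((EX.qtlLookup, EX.returnsDataType, EX.QTL))

        print(g.serialize(format="turtle"))

    A discovery server can then answer queries such as "which services return QTL data" by matching on the typed graph rather than on free-text descriptions.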

  9. Emerging tools for continuous nutrient monitoring networks: Sensors advancing science and water resources protection

    Science.gov (United States)

    Pellerin, Brian; Stauffer, Beth A; Young, Dwane A; Sullivan, Daniel J.; Bricker, Suzanne B.; Walbridge, Mark R; Clyde, Gerard A; Shaw, Denice M

    2016-01-01

    Sensors and enabling technologies are becoming increasingly important tools for water quality monitoring and associated water resource management decisions. In particular, nutrient sensors are of interest because of the well-known adverse effects of nutrient enrichment on coastal hypoxia, harmful algal blooms, and impacts to human health. Accurate and timely information on nutrient concentrations and loads is integral to strategies designed to minimize risk to humans and manage the underlying drivers of water quality impairment. Using nitrate sensors as an example, we highlight the types of applications in freshwater and coastal environments that are likely to benefit from continuous, real-time nutrient data. The concurrent emergence of new tools to integrate, manage and share large data sets is critical to the successful use of nutrient sensors and has made it possible for the field of continuous nutrient monitoring to rapidly move forward. We highlight several near-term opportunities for Federal agencies, as well as the broader scientific and management community, that will help accelerate sensor development, build and leverage sites within a national network, and develop open data standards and data management protocols that are key to realizing the benefits of a large-scale, integrated monitoring network. Investing in these opportunities will provide new information to guide management and policies designed to protect and restore our nation’s water resources.

  10. Master Middle Ware: A Tool to Integrate Water Resources and Fish Population Dynamics Models

    Science.gov (United States)

    Yi, S.; Sandoval Solis, S.; Thompson, L. C.; Kilduff, D. P.

    2017-12-01

    Linking models that investigate separate components of ecosystem processes has the potential to unify messages regarding management decisions by evaluating potential trade-offs in a cohesive framework. This project aimed to improve the ability of riparian resource managers to forecast future water availability conditions and resultant fish habitat suitability, in order to better inform their management decisions. To accomplish this goal, we developed a middleware tool that is capable of linking and overseeing the operations of two existing models: a water resource planning tool, the Water Evaluation and Planning (WEAP) model, and a habitat-based fish population dynamics model (WEAPhish). First, we designed the Master Middle Ware (MMW) software in Visual Basic for Applications® in one Excel® file that provided a familiar framework for both data input and output. Second, MMW was used to link and jointly operate WEAP and WEAPhish, using Visual Basic for Applications (VBA) macros to implement system-level calls that run the models. To demonstrate the utility of this approach, hydrological, biological, and middleware model components were developed for the Butte Creek basin. This tributary of the Sacramento River, California, is managed for both hydropower and the persistence of a threatened population of spring-run Chinook salmon (Oncorhynchus tshawytscha). While we have demonstrated the use of MMW for a particular watershed and fish population, MMW can be customized for use with different rivers and fish populations, assuming basic data requirements are met. This model integration improves on ad hoc linkages for managing data transfer between software programs by providing a consistent, user-friendly, and familiar interface across different model implementations. Furthermore, the data-viewing capabilities of MMW facilitate the rapid interpretation of model results by hydrologists, fisheries biologists, and resource managers, in order to accelerate learning and management decision
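
    The middleware pattern described above amounts to running the hydrology step, handing its outputs to the habitat step, and collecting results for display. The sketch below shows only that control flow; the real MMW does this with VBA macros around WEAP and WEAPhish, and the stand-in functions and numbers here are placeholders.

        # Sketch: middleware chaining a hydrology run into a fish-habitat run (placeholders).
        def run_hydrology_model(year):
            """Stand-in for a WEAP-style run returning monthly flows (m3/s)."""
            return [12.0, 10.5, 9.8, 7.2, 5.1, 3.9, 3.0, 2.8, 3.5, 5.0, 8.2, 11.0]

        def run_fish_model(monthly_flows):
            """Stand-in for a WEAPhish-style run: habitat suitability from flows."""
            return [min(1.0, q / 10.0) for q in monthly_flows]

        def middleware(years):
            results = {}
            for year in years:
                flows = run_hydrology_model(year)     # step 1: water availability
                results[year] = run_fish_model(flows)  # step 2: habitat suitability
            return results

        print(middleware([2020])[2020][:3])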

  11. Human Ageing Genomic Resources: Integrated databases and tools for the biology and genetics of ageing

    Science.gov (United States)

    Tacutu, Robi; Craig, Thomas; Budovsky, Arie; Wuttke, Daniel; Lehmann, Gilad; Taranukha, Dmitri; Costa, Joana; Fraifeld, Vadim E.; de Magalhães, João Pedro

    2013-01-01

    The Human Ageing Genomic Resources (HAGR, http://genomics.senescence.info) is a freely available online collection of research databases and tools for the biology and genetics of ageing. HAGR features now several databases with high-quality manually curated data: (i) GenAge, a database of genes associated with ageing in humans and model organisms; (ii) AnAge, an extensive collection of longevity records and complementary traits for >4000 vertebrate species; and (iii) GenDR, a newly incorporated database, containing both gene mutations that interfere with dietary restriction-mediated lifespan extension and consistent gene expression changes induced by dietary restriction. Since its creation about 10 years ago, major efforts have been undertaken to maintain the quality of data in HAGR, while further continuing to develop, improve and extend it. This article briefly describes the content of HAGR and details the major updates since its previous publications, in terms of both structure and content. The completely redesigned interface, more intuitive and more integrative of HAGR resources, is also presented. Altogether, we hope that through its improvements, the current version of HAGR will continue to provide users with the most comprehensive and accessible resources available today in the field of biogerontology. PMID:23193293

  12. Technical tools and of information for the quality administration of the water resource

    International Nuclear Information System (INIS)

    Escobar Martinez, John Fernando; Sierra Ramirez, Carlos; Molina Perez, Francisco

    2000-01-01

    Given the complexity of an aquatic ecosystem and the impossibility of carrying out experiments at real scale, the water quality engineer represents the different reactions and interactions that occur in these ecosystems using mathematical models. This constitutes a powerful tool that allows prospective studies and helps in decision-making. On the other hand, the huge volumes of information produced by geographical space analysis and the large number of variables involved make Geographical Information Systems (GIS) a powerful tool for analysis, modeling and simulation tasks over a defined area and the processes related to it. This article presents an integration proposal that associates the spatial analysis and visual representation capabilities of a GIS application with the water quality results obtained from a mathematical model, in such a way that user interaction with the information is increased and the development of new tools supports decision-making and administrative processes in the management of water resources.

  13. Scientific and practical tools for dealing with water resource estimations for the future

    Directory of Open Access Journals (Sweden)

    D. A. Hughes

    2015-06-01

    Full Text Available Future flow regimes will be different from today's, and imperfect knowledge of present and future climate variations, rainfall–runoff processes and anthropogenic impacts makes them highly uncertain. Future water resources decisions will rely on practical and appropriate simulation tools that are sensitive to changes, can assimilate different types of change information and are flexible enough to accommodate improvements in the understanding of change. They need to include representations of uncertainty and generate information appropriate for uncertain decision-making. This paper presents some examples of the tools that have been developed to address these issues in the southern Africa region. The examples include uncertainty in present-day simulations due to lack of understanding and data, the use of climate change projection data from multiple climate models, and future catchment responses due to both climate and development effects. The conclusions are that the tools and models are largely available and that what we need is more reliable forcing and model evaluation information, as well as methods of making decisions with such inevitably uncertain information.

  14. Web-Based Geospatial Tools to Address Hazard Mitigation, Natural Resource Management, and Other Societal Issues

    Science.gov (United States)

    Hearn, Paul P.

    2009-01-01

    Federal, State, and local government agencies in the United States face a broad range of issues on a daily basis. Among these are natural hazard mitigation, homeland security, emergency response, economic and community development, water supply, and health and safety services. The U.S. Geological Survey (USGS) helps decision makers address these issues by providing natural hazard assessments, information on energy, mineral, water and biological resources, maps, and other geospatial information. Increasingly, decision makers at all levels are challenged not by the lack of information, but by the absence of effective tools to synthesize the large volume of data available, and to utilize the data to frame policy options in a straightforward and understandable manner. While geographic information system (GIS) technology has been widely applied to this end, systems with the necessary analytical power have been usable only by trained operators. The USGS is addressing the need for more accessible, manageable data tools by developing a suite of Web-based geospatial applications that will incorporate USGS and cooperating partner data into the decision making process for a variety of critical issues. Examples of Web-based geospatial tools being used to address societal issues follow.

  15. RASOnD - A comprehensive resource and search tool for RAS superfamily oncogenes from various species

    Directory of Open Access Journals (Sweden)

    Singh Tej P

    2011-07-01

    Full Text Available Abstract Background The Ras superfamily plays an important role in the control of cell signalling and division. Mutations in the Ras genes convert them into active oncogenes. The Ras oncogenes form a major thrust of global cancer research as they are involved in the development and progression of tumors. This has resulted in the exponential growth of data on the Ras superfamily across different public databases and in the literature. However, no dedicated public resource is currently available for data mining and analysis on this family. The present database was developed to facilitate straightforward accession, retrieval and analysis of information available on Ras oncogenes from one particular site. Description We have developed the RAS Oncogene Database (RASOnD) as a comprehensive knowledgebase that provides integrated and curated information on a single platform for oncogenes of the Ras superfamily. RASOnD encompasses exhaustive genomics and proteomics data existing across diverse publicly accessible databases. This resource presently includes overall 199,046 entries from 101 different species. It provides a search tool to generate information about their nucleotide and amino acid sequences, single nucleotide polymorphisms, chromosome positions, orthologies, motifs, structures, related pathways and associated diseases. We have implemented a number of user-friendly search interfaces and sequence analysis tools. At present the user can (i) browse the data, (ii) search any field through a simple or advanced search interface and (iii) perform a BLAST search and subsequently CLUSTALW multiple sequence alignment by selecting sequences of Ras oncogenes. The generic genome browser GBrowse, JMOL for structural visualization and TREEVIEW for phylograms have been integrated for clear perception of retrieved data. External links to related databases have been included in RASOnD. Conclusions This database is a resource and search tool dedicated to Ras oncogenes. It has

  16. Extended statistical entropy analysis as a quantitative management tool for water resource systems

    Science.gov (United States)

    Sobantka, Alicja; Rechberger, Helmut

    2010-05-01

    Entropy has been applied to a variety of problems in hydrology and water resources. As water resource systems are inherently spatial and complex, a stochastic description of these systems is needed, and entropy theory enables the development of such a description by providing determination of the least-biased probability distributions with limited knowledge and data. Entropy can also serve as a basis for risk and reliability analysis. The relative entropy has been variously interpreted as a measure of freedom of choice, uncertainty and disorder, information content, missing information, or information gain or loss. In the analysis of empirical data, entropy is another measure of dispersion, an alternative to the variance. As an evaluation tool, statistical entropy analysis (SEA) has also been developed by previous workers to quantify the power of a process to concentrate chemical elements. Within this research programme, SEA is to be extended to chemical compounds and tested for its deficits and potentials in systems where water resources play an important role. The extended SEA (eSEA) will be developed first for the nitrogen balance in waste water treatment plants (WWTPs). Later applications to the emission of substances to water bodies such as groundwater (e.g. leachate from landfills) will also be possible. By applying eSEA to the nitrogen balance in a WWTP, all nitrogen compounds that may occur during the water treatment process are taken into account and their impact on the environment and human health is quantified. It has been shown that entropy-reducing processes are part of modern waste management. Generally, materials management should be performed in a way that avoids a significant rise in entropy. The entropy metric might also be used to perform benchmarking on WWTPs. The result of this management tool would be a determination of the efficiency of WWTPs. By improving and optimizing the efficiency
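
    As a rough illustration of the statistical-entropy idea behind SEA/eSEA, the sketch below computes a relative (normalized) Shannon entropy for the distribution of a substance over a system's output flows; the nitrogen loads and the three-output split are hypothetical, and the full eSEA formulation for chemical compounds is more involved than this.

```python
import math

def relative_statistical_entropy(loads):
    """Relative Shannon entropy of a substance's distribution over output flows.

    `loads` is a list of substance loads (e.g. kg N per day) leaving a system in
    its different output flows.  0 means the substance is concentrated in a single
    flow; 1 means it is spread evenly over all flows.  A simplified illustration
    of the SEA concept, not the full eSEA formulation.
    """
    total = sum(loads)
    shares = [x / total for x in loads if x > 0]
    h = -sum(p * math.log2(p) for p in shares)
    h_max = math.log2(len(loads))
    return h / h_max if h_max > 0 else 0.0

# Hypothetical nitrogen loads (kg N/d) in a WWTP's outputs: effluent, sludge, off-gas (N2)
before_treatment = [100.0, 0.0, 0.0]   # all nitrogen still in the raw wastewater
after_treatment = [12.0, 18.0, 70.0]   # nitrogen split over effluent, sludge and N2

print(relative_statistical_entropy(before_treatment))  # 0.0 -> fully concentrated
print(relative_statistical_entropy(after_treatment))   # closer to 1 -> more dispersed
```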

  17. Cancer in silico drug discovery: a systems biology tool for identifying candidate drugs to target specific molecular tumor subtypes.

    Science.gov (United States)

    San Lucas, F Anthony; Fowler, Jerry; Chang, Kyle; Kopetz, Scott; Vilar, Eduardo; Scheet, Paul

    2014-12-01

    Large-scale cancer datasets such as The Cancer Genome Atlas (TCGA) allow researchers to profile tumors based on a wide range of clinical and molecular characteristics. Subsequently, TCGA-derived gene expression profiles can be analyzed with the Connectivity Map (CMap) to find candidate drugs to target tumors with specific clinical phenotypes or molecular characteristics. This represents a powerful computational approach for candidate drug identification, but due to the complexity of TCGA and technology differences between CMap and TCGA experiments, such analyses are challenging to conduct and reproduce. We present Cancer in silico Drug Discovery (CiDD; scheet.org/software), a computational drug discovery platform that addresses these challenges. CiDD integrates data from TCGA, CMap, and Cancer Cell Line Encyclopedia (CCLE) to perform computational drug discovery experiments, generating hypotheses for the following three general problems: (i) determining whether specific clinical phenotypes or molecular characteristics are associated with unique gene expression signatures; (ii) finding candidate drugs to repress these expression signatures; and (iii) identifying cell lines that resemble the tumors being studied for subsequent in vitro experiments. The primary input to CiDD is a clinical or molecular characteristic. The output is a biologically annotated list of candidate drugs and a list of cell lines for in vitro experimentation. We applied CiDD to identify candidate drugs to treat colorectal cancers harboring mutations in BRAF. CiDD identified EGFR and proteasome inhibitors, while proposing five cell lines for in vitro testing. CiDD facilitates phenotype-driven, systematic drug discovery based on clinical and molecular data from TCGA. ©2014 American Association for Cancer Research.
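
    As a hedged illustration of the kind of signature-matching that links TCGA-derived expression profiles to CMap drug profiles, the sketch below scores hypothetical drugs by how strongly they reverse an up/down gene signature; the gene and drug names are invented and the scoring is a simplified stand-in, not CiDD's actual CMap-based analysis.

```python
def connectivity_score(up_genes, down_genes, drug_ranking):
    """Crude connectivity-style score between a disease signature and a drug profile.

    `drug_ranking` maps gene -> rank in the drug's expression-change profile
    (rank 0 = gene most up-regulated by the drug).  Negative scores mean the drug
    tends to push the signature's up-genes down and its down-genes up, i.e. it may
    reverse the signature.  A simplified stand-in for CMap scoring, for illustration.
    """
    n = len(drug_ranking)

    def mean_rank(genes):
        ranks = [drug_ranking[g] for g in genes if g in drug_ranking]
        return sum(ranks) / len(ranks) if ranks else n / 2

    return (mean_rank(down_genes) - mean_rank(up_genes)) / n

# Hypothetical tumour signature and two drug perturbation profiles (gene -> rank)
signature_up = ["GENE_A", "GENE_B"]
signature_down = ["GENE_C", "GENE_D"]
drug_profiles = {
    "drug_1": {"GENE_C": 0, "GENE_D": 1, "GENE_A": 2, "GENE_B": 3},  # reverses the signature
    "drug_2": {"GENE_A": 0, "GENE_B": 1, "GENE_C": 2, "GENE_D": 3},  # mimics the signature
}
for drug, ranking in drug_profiles.items():
    print(drug, connectivity_score(signature_up, signature_down, ranking))
```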

  18. Challenges in the development of an M4 PAM in vivo tool compound: The discovery of VU0467154 and unexpected DMPK profiles of close analogs.

    Science.gov (United States)

    Wood, Michael R; Noetzel, Meredith J; Poslusney, Michael S; Melancon, Bruce J; Tarr, James C; Lamsal, Atin; Chang, Sichen; Luscombe, Vincent B; Weiner, Rebecca L; Cho, Hyekyung P; Bubser, Michael; Jones, Carrie K; Niswender, Colleen M; Wood, Michael W; Engers, Darren W; Brandon, Nicholas J; Duggan, Mark E; Conn, P Jeffrey; Bridges, Thomas M; Lindsley, Craig W

    2017-01-15

    This letter describes the chemical optimization of a novel series of M4 positive allosteric modulators (PAMs) based on a 5-amino-thieno[2,3-c]pyridazine core, developed via iterative parallel synthesis and culminating in the highly utilized rodent in vivo tool compound, VU0467154 (5). This is the first report of the optimization campaign (SAR and DMPK profiling) that led to the discovery of VU0467154, and details all of the challenges faced in allosteric modulator programs (steep SAR, species differences in PAM pharmacology and subtle structural changes affecting CNS penetration). Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. SAGES: a suite of freely-available software tools for electronic disease surveillance in resource-limited settings.

    Directory of Open Access Journals (Sweden)

    Sheri L Lewis

    Full Text Available Public health surveillance is undergoing a revolution driven by advances in the field of information technology. Many countries have experienced vast improvements in the collection, ingestion, analysis, visualization, and dissemination of public health data. Resource-limited countries have lagged behind due to challenges in information technology infrastructure, public health resources, and the costs of proprietary software. The Suite for Automated Global Electronic bioSurveillance (SAGES) is a collection of modular, flexible, freely-available software tools for electronic disease surveillance in resource-limited settings. One or more SAGES tools may be used in concert with existing surveillance applications or the SAGES tools may be used en masse for an end-to-end biosurveillance capability. This flexibility allows for the development of an inexpensive, customized, and sustainable disease surveillance system. The ability to rapidly assess anomalous disease activity may lead to more efficient use of limited resources and better compliance with World Health Organization International Health Regulations.

  20. Open Educational Resources as a Tool to Improve Language Education Effectiveness in the Russian Higher Institutions

    Directory of Open Access Journals (Sweden)

    Tatiana Sidorenko

    2014-09-01

    Full Text Available Russian universities' attempts to move towards leading positions in the world rankings have resulted in a number of initiatives to enhance their activities on the education services market. Under these conditions, foreign language proficiency is no longer a luxury; it is becoming an important tool for implementing university development goals. In this regard, new methods and techniques of foreign language teaching that would significantly improve the language competency of both students and faculty members are in high demand. The search for effective methods to enhance foreign language teaching prompts an analysis of Massive Open Online Course (MOOC) open educational platforms and a consideration of whether these platforms can be integrated into the existing system of foreign language teaching in Russian higher education institutions. Based on the research findings, the author concludes that it is irrational to use these resources as embedded components without significant adjustment to the conditions existing in the current higher education system.

  1. Harmonization and development of resources and tools for Italian natural language processing within the PARLI project

    CERN Document Server

    Bosco, Cristina; Delmonte, Rodolfo; Moschitti, Alessandro; Simi, Maria

    2015-01-01

    The papers collected in this volume are selected as a sample of the progress in Natural Language Processing (NLP) performed within the Italian NLP community and especially attested by the PARLI project. PARLI (Portale per l’Accesso alle Risorse in Lingua Italiana) is a project partially funded by the Ministero Italiano per l’Università e la Ricerca (PRIN 2008) from 2008 to 2012 for monitoring and fostering the harmonic growth and coordination of the activities of Italian NLP. It was proposed by various teams of researchers working in Italian universities and research institutions. In the spirit of the PARLI project, most of the resources and tools created within the project and described here are freely distributed, and their life did not end with the project itself, in the hope that they will be a key factor in the future development of computational linguistics.

  2. Impact of Drought on Groundwater and Soil Moisture - A Geospatial Tool for Water Resource Management

    Science.gov (United States)

    Ziolkowska, J. R.; Reyes, R.

    2016-12-01

    For many decades, recurring droughts in different regions of the US have been negatively impacting ecosystems and economic sectors. Oklahoma and Texas suffered exceptional and extreme droughts in 2011-2014, with almost 95% of the state areas affected (Drought Monitor, 2015). In 2011 alone, drought losses amounted to around $1.6 billion in Oklahoma agriculture (Stotts 2011) and $7.6 billion in Texas agriculture (Fannin 2012). While surface water is among the most immediate indicators of drought conditions, it does not translate directly to groundwater resources, which are the main source of irrigation water. Both surface water and groundwater are susceptible to drought, but groundwater depletion is a long-term process and might not show immediately. However, understanding groundwater availability is crucial for designing water management strategies and sustainable water use in agriculture and other economic sectors. This paper presents an interactive geospatially weighted evaluation model, which is at the same time a tool for analyzing groundwater resources, that can be used for decision support in water management. The tool combines groundwater and soil moisture changes in Oklahoma and Texas in 2003-2014, thus representing the most important indicators of agricultural and hydrological drought. The model allows for analyzing temporal and geospatial long-term drought at the county level. It can be expanded to other regions in the US and the world. The model has been validated against the Palmer Drought Severity Index to account for other indicators of meteorological drought. It can serve as a basis for an upcoming socio-economic and environmental analysis of drought events in the short and long term in different geographic regions.
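
    A minimal sketch of how groundwater and soil-moisture series might be standardized and combined into a single county-level drought indicator is shown below; the weights, series and sign conventions are hypothetical and do not reproduce the study's geospatially weighted model.

```python
def standardize(values):
    """Convert a time series to z-scores (anomalies relative to its own record)."""
    mean = sum(values) / len(values)
    sd = (sum((v - mean) ** 2 for v in values) / (len(values) - 1)) ** 0.5
    return [(v - mean) / sd for v in values]

def combined_drought_index(groundwater_storage, soil_moisture, w_gw=0.5, w_sm=0.5):
    """Weighted combination of groundwater and soil-moisture anomalies for one county.

    Negative values indicate drier-than-normal conditions in both stores.  The
    weighting scheme and numbers are hypothetical and only illustrate the kind of
    combined indicator described in the record.
    """
    gw_z = standardize(groundwater_storage)
    sm_z = standardize(soil_moisture)
    return [w_gw * g + w_sm * s for g, s in zip(gw_z, sm_z)]

# Hypothetical annual county series (groundwater storage index and volumetric soil moisture)
gw = [12.1, 11.8, 11.5, 10.9, 10.2, 9.8]
sm = [0.31, 0.30, 0.28, 0.25, 0.22, 0.20]
print([round(v, 2) for v in combined_drought_index(gw, sm)])
```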

  3. Protein Data Bank Japan (PDBj): updated user interfaces, resource description framework, analysis tools for large structures.

    Science.gov (United States)

    Kinjo, Akira R; Bekker, Gert-Jan; Suzuki, Hirofumi; Tsuchiya, Yuko; Kawabata, Takeshi; Ikegawa, Yasuyo; Nakamura, Haruki

    2017-01-04

    The Protein Data Bank Japan (PDBj, http://pdbj.org), a member of the worldwide Protein Data Bank (wwPDB), accepts and processes the deposited data of experimentally determined macromolecular structures. While maintaining the archive in collaboration with other wwPDB partners, PDBj also provides a wide range of services and tools for analyzing structures and functions of proteins. We herein outline the updated web user interfaces together with RESTful web services and the backend relational database that support the former. To enhance the interoperability of the PDB data, we have previously developed PDB/RDF, PDB data in the Resource Description Framework (RDF) format, which is now a wwPDB standard called wwPDB/RDF. We have enhanced the connectivity of the wwPDB/RDF data by incorporating various external data resources. Services for searching, comparing and analyzing the ever-increasing large structures determined by hybrid methods are also described. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  4. Information contracting tools in a cancer specialist unit: the role of Healthcare Resource Groups (HRGs)

    Directory of Open Access Journals (Sweden)

    Carol Marlow

    1998-01-01

    Full Text Available The need for high quality management information within the contracting process has driven many of the major developments in health service computing. These have often merged clinical and financial requirements, usually along patient-centred lines. In order to identify a common currency for a range of clinical activities that are inherently variable, price tariffs have been drawn up on the basis of 'episodes of care' within specialties. Healthcare Resource Groups (HRGs) were designed to meet the need for a common information currency. However, they were designed for acute care. The study on which this paper is based aims to examine their applicability to chronic care in a cancer specialist unit. The data were drawn from the patient information system within a major cancer unit. The focus of the investigation is encapsulated in the following questions: (a) Do HRGs really work as a grouping and costing methodology? (b) How relevant are HRG classifications for long-term patient care? The investigation demonstrated that not all HRGs are iso-resource within this environment. The findings from the data analysis are echoed by the NHS Executive's own evaluation. This does not negate the advantages of their use. Furthermore, the development of Health Benefit Groups as information management tools, through a focus on health conditions and interventions rather than purely on treatments, offers potential for greater validity within a chronic care situation.

  5. Novel Tools for Conservation Genomics: Comparing Two High-Throughput Approaches for SNP Discovery in the Transcriptome of the European Hake

    DEFF Research Database (Denmark)

    Milano, Ilaria; Babbucci, Massimiliano; Panitz, Frank

    2011-01-01

    The growing accessibility to genomic resources using next-generation sequencing (NGS) technologies has revolutionized the application of molecular genetic tools to ecology and evolutionary studies in non-model organisms. Here we present the case study of the European hake (Merluccius merluccius),...

  6. The regional climate model as a tool for long-term planning of Quebec water resources

    International Nuclear Information System (INIS)

    Frigon, A.

    2008-01-01

    'Full text': In recent years, important progress has been made in downscaling GCM (Global Climate Model) projections to a resolution where hydrological studies become feasible. Climate change simulations performed with RCMs (Regional Climate Models) have reached a level of confidence that allows us to take advantage of this information in the long-term planning of water resources. The RCMs' main advantage consists in their construction on balanced water and energy budgets for both land and atmosphere, and in their inclusion of feedbacks between the surface and the atmosphere. Such models therefore generate sequences of weather events, providing long, internally consistent time series of hydro-climatic variables and allowing the analysis of hydrologic regimes. At OURANOS, special attention is placed on the hydrological cycle, given its key role in socioeconomic activities. The Canadian Regional Climate Model (CRCM) was developed as a potential tool to provide climate projections at the watershed scale. Various analyses performed over small basins in Quebec provide information on the level of confidence we have in the CRCM for use in hydrological studies. Even though this approach is not free of uncertainty, it was found useful by some water resource managers and hence this information should be considered. One of the keys to retaining usefulness, despite the associated uncertainties, is to make use of more than a single regional climate projection. This approach allows for the evaluation of the climate change signal and its associated level of confidence. Such a methodology is already applied by Hydro-Quebec in the long-term planning of its water resources for hydroelectric generation over the Quebec territory. (author)

  7. Integrated groundwater resource management in Indus Basin using satellite gravimetry and physical modeling tools.

    Science.gov (United States)

    Iqbal, Naveed; Hossain, Faisal; Lee, Hyongki; Akhter, Gulraiz

    2017-03-01

    Reliable and frequent information on groundwater behavior and dynamics is very important for effective groundwater resource management at appropriate spatial scales. This information is rarely available in developing countries and thus poses a challenge for groundwater managers. In situ data and groundwater modeling tools are limited in their ability to cover large domains. Remote sensing technology can now be used to continuously collect information on the hydrological cycle in a cost-effective way. This study evaluates the effectiveness of a remote sensing integrated physical modeling approach for groundwater management in the Indus Basin. Gravity Recovery and Climate Experiment (GRACE) satellite-based gravity anomalies from 2003 to 2010 were processed to generate monthly groundwater storage changes using the Variable Infiltration Capacity (VIC) hydrologic model. Groundwater storage is the key parameter of interest for groundwater resource management. The spatial and temporal patterns in groundwater storage (GWS) are useful for devising appropriate groundwater management strategies. GRACE-estimated GWS information with large-scale coverage is valuable for basin-scale monitoring and decision making. This frequently available information is found useful for the identification of groundwater recharge areas, groundwater storage depletion, and pinpointing of the areas where groundwater sustainability is at risk. The GWS anomalies were found to agree favorably with groundwater model simulations from Visual MODFLOW and with in situ data. Mostly, moderate to severe GWS depletion is observed, putting the sustainability of this groundwater resource at risk. For sustainable groundwater management, the region needs to implement groundwater policies and adopt water conservation techniques.
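
    The usual GRACE disaggregation step alluded to here subtracts model-simulated surface stores from the total water storage anomaly to isolate groundwater storage; the sketch below shows that arithmetic with hypothetical monthly values (the study's exact partitioning with VIC may include further components).

```python
# Monthly anomalies in equivalent water height (cm); all values are hypothetical.
grace_tws_anomaly = [-1.2, -0.8, 0.5, 1.9, 0.7, -0.4]          # GRACE total water storage
vic_soil_moisture_anomaly = [-0.6, -0.3, 0.4, 1.2, 0.5, -0.1]  # model-simulated soil moisture
vic_snow_anomaly = [0.1, 0.0, 0.0, 0.0, 0.0, 0.0]              # model-simulated snow storage

# Groundwater storage anomaly = total storage minus the surface components
# simulated by the land-surface model.
gws_anomaly = [
    tws - sm - snow
    for tws, sm, snow in zip(grace_tws_anomaly, vic_soil_moisture_anomaly, vic_snow_anomaly)
]
print([round(x, 2) for x in gws_anomaly])
```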

  8. INTEGRATING CORPUS-BASED RESOURCES AND NATURAL LANGUAGE PROCESSING TOOLS INTO CALL

    Directory of Open Access Journals (Sweden)

    Pascual Cantos Gomez

    2002-06-01

    Full Text Available This paper aims at presenting a survey of computational linguistic tools presently available but whose potential has been neither fully considered nor exploited to the full in modern CALL. It starts with a discussion of the rationale of data-driven learning (DDL) for language learning, presenting typical DDL activities, DDL software and potential extensions of non-typical DDL software (electronic dictionaries and electronic dictionary facilities) to DDL. An extended section is devoted to describing NLP technology and how it can be integrated into CALL, within already existing software or as stand-alone resources. A range of NLP tools is presented (MT programs, taggers, lemmatizers, parsers and speech technologies), with special emphasis on tagged concordancing. The paper finishes with a number of reflections and ideas on how language technologies can be used efficiently within the language learning context and how extensive exploration and integration of these technologies might change and extend both modern CALL and the present language learning paradigm.

  9. AquaCrop-OS: A tool for resilient management of land and water resources in agriculture

    Science.gov (United States)

    Foster, Timothy; Brozovic, Nicholas; Butler, Adrian P.; Neale, Christopher M. U.; Raes, Dirk; Steduto, Pasquale; Fereres, Elias; Hsiao, Theodore C.

    2017-04-01

    Water managers, researchers, and other decision makers worldwide are faced with the challenge of increasing food production under population growth, drought, and rising water scarcity. Crop simulation models are valuable tools in this effort and, importantly, provide a means of rapidly quantifying crop yield response to water, climate, and field management practices. Here, we introduce a new open-source crop modelling tool called AquaCrop-OS (Foster et al., 2017), which extends the functionality of the globally used FAO AquaCrop model. Through case studies focused on groundwater-fed irrigation in the High Plains and Central Valley of California in the United States, we demonstrate how AquaCrop-OS can be used to understand the local biophysical, behavioural, and institutional drivers of water risks in agricultural production. Furthermore, we also illustrate how AquaCrop-OS can be combined effectively with hydrologic and economic models to support drought risk mitigation and decision-making around water resource management at a range of spatial and temporal scales, and highlight future plans for model development and training. T. Foster, et al. (2017) AquaCrop-OS: An open source version of FAO's crop water productivity model. Agricultural Water Management. 181: 18-22. http://dx.doi.org/10.1016/j.agwat.2016.11.015.

  10. Enterprise resource planning (ERP) implementation using the value engineering methodology and Six Sigma tools

    Science.gov (United States)

    Leu, Jun-Der; Lee, Larry Jung-Hsing

    2017-09-01

    Enterprise resource planning (ERP) is a software solution that integrates the operational processes of the business functions of an enterprise. However, implementing ERP systems is a complex process. In addition to the technical issues, companies must address problems associated with business process re-engineering, time and budget control, and organisational change. Numerous industrial studies have shown that the failure rate of ERP implementation is high, even for well-designed systems. Thus, ERP projects typically require a clear methodology to support the project execution and effectiveness. In this study, we propose a theoretical model for ERP implementation. The value engineering (VE) method forms the basis of the proposed framework, which integrates Six Sigma tools. The proposed framework encompasses five phases: knowledge generation, analysis, creation, development and execution. In the VE method, potential ERP problems related to software, hardware, consultation and organisation are analysed in a group-decision manner and in relation to value, and Six Sigma tools are applied to avoid any project defects. We validate the feasibility of the proposed model by applying it to an international manufacturing enterprise in Taiwan. The results show improvements in customer response time and operational efficiency in terms of work-in-process and turnover of materials. Based on the evidence from the case study, the theoretical framework is discussed together with the study's limitations and suggestions for future research.

  11. Weight Estimation Tool for Children Aged 6 to 59 Months in Limited-Resource Settings.

    Science.gov (United States)

    Ralston, Mark E; Myatt, Mark A

    2016-01-01

    A simple, reliable anthropometric tool for rapid estimation of weight in children would be useful in limited-resource settings where current weight estimation tools are not uniformly reliable, nearly all global under-five mortality occurs, severe acute malnutrition is a significant contributor in approximately one-third of under-five mortality, and a weight scale may not be immediately available in emergencies to first-response providers. To determine the accuracy and precision of mid-upper arm circumference (MUAC) and height as weight estimation tools in children under five years of age in low-to-middle income countries. This was a retrospective observational study. Data were collected in 560 nutritional surveys during 1992-2006 using a modified Expanded Program of Immunization two-stage cluster sample design. Locations with high prevalence of acute and chronic malnutrition. A total of 453,990 children met the inclusion criteria (age 6-59 months; weight ≤ 25 kg; MUAC 80-200 mm) and did not meet the exclusion criteria (bilateral pitting edema; biologically implausible weight-for-height z-score (WHZ), weight-for-age z-score (WAZ), and height-for-age z-score (HAZ) values). Weight was estimated using the Broselow Tape, the Hong Kong formula, and database-derived estimates based on MUAC alone, height alone, and height and MUAC combined. Mean percentage difference between true and estimated weight, proportion of estimates accurate to within ± 25% and ± 10% of true weight, weighted Kappa statistic, and Bland-Altman bias were reported as measures of tool accuracy. Standard deviation of the mean percentage difference and Bland-Altman 95% limits of agreement were reported as measures of tool precision. Database height was a more accurate and precise predictor of weight compared to Broselow Tape 2007 [B], Broselow Tape 2011 [A], and MUAC. Mean percentage difference between true and estimated weight was +0.49% (SD = 10.33%); proportion of estimates accurate to within ± 25% of true weight was 97.36% (95% CI 97.40%, 97.46%); and
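
    The accuracy and precision measures named in the record (mean percentage difference, its standard deviation, and the proportion of estimates within ±10% and ±25% of true weight) can be computed as in the sketch below; the weights shown are hypothetical.

```python
def weight_estimation_metrics(true_weights, estimated_weights):
    """Accuracy measures used to compare weight-estimation tools.

    Returns the mean percentage difference between estimated and true weight,
    its standard deviation (a precision measure), and the proportion of
    estimates falling within +/-10% and +/-25% of the true weight.
    """
    pct_diff = [
        100.0 * (est - true) / true
        for true, est in zip(true_weights, estimated_weights)
    ]
    n = len(pct_diff)
    mean_pd = sum(pct_diff) / n
    sd_pd = (sum((d - mean_pd) ** 2 for d in pct_diff) / (n - 1)) ** 0.5
    within = lambda tol: sum(abs(d) <= tol for d in pct_diff) / n
    return {"mean_pct_diff": mean_pd, "sd_pct_diff": sd_pd,
            "within_10pct": within(10), "within_25pct": within(25)}

# Hypothetical example: true weights (kg) and estimates from a height-based model
true_w = [9.8, 12.4, 15.0, 18.3, 21.7]
est_w = [10.1, 11.9, 15.8, 17.5, 22.9]
print(weight_estimation_metrics(true_w, est_w))
```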

  12. MLViS: A Web Tool for Machine Learning-Based Virtual Screening in Early-Phase of Drug Discovery and Development.

    Science.gov (United States)

    Korkmaz, Selcuk; Zararsiz, Gokmen; Goksuluk, Dincer

    2015-01-01

    Virtual screening is an important step in the early phase of the drug discovery process. Since there are thousands of compounds, this step should be both fast and effective in order to distinguish drug-like and nondrug-like molecules. Statistical machine learning methods are widely used in drug discovery studies for classification purposes. Here, we aim to develop a new tool, which can classify molecules as drug-like or nondrug-like based on various machine learning methods, including discriminant, tree-based, kernel-based, ensemble and other algorithms. To construct this tool, the performances of twenty-three different machine learning algorithms were first compared by ten different measures; then, the ten best-performing algorithms were selected based on principal component and hierarchical cluster analysis results. Besides classification, this application also has the ability to create heat maps and dendrograms for visual inspection of the molecules through hierarchical cluster analysis. Moreover, users can connect to the PubChem database to download molecular information and create two-dimensional structures of compounds. This application is freely available through www.biosoft.hacettepe.edu.tr/MLViS/.
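
    A minimal sketch of the kind of algorithm comparison underlying such a tool, assuming scikit-learn and synthetic molecular descriptors in place of real ones, is shown below; it is not MLViS's implementation.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC

# Hypothetical molecular descriptors (rows = compounds) and drug-like labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))       # e.g. MW, logP, H-bond donors/acceptors, ...
y = rng.integers(0, 2, size=200)    # 1 = drug-like, 0 = nondrug-like

candidates = {
    "logistic_regression": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "rbf_svm": make_pipeline(StandardScaler(), SVC(kernel="rbf")),
}

# Compare algorithms by cross-validated accuracy, as a benchmarking step would.
for name, model in candidates.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
    print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```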

  13. HANDS: a tool for genome-wide discovery of subgenome-specific base-identity in polyploids.

    KAUST Repository

    Mithani, Aziz

    2013-09-24

    The analysis of polyploid genomes is problematic because homeologous subgenome sequences are closely related. This relatedness makes it difficult to assign individual sequences to the specific subgenome from which they are derived, and hinders the development of polyploid whole genome assemblies. We here present a next-generation sequencing (NGS)-based approach for assignment of subgenome-specific base-identity at sites containing homeolog-specific polymorphisms (HSPs): 'HSP base Assignment using NGS data through Diploid Similarity' (HANDS). We show that HANDS correctly predicts subgenome-specific base-identity at >90% of assayed HSPs in the hexaploid bread wheat (Triticum aestivum) transcriptome, thus providing a substantial increase in accuracy versus previous methods for homeolog-specific base assignment. We conclude that HANDS enables rapid and accurate genome-wide discovery of homeolog-specific base-identity, a capability having multiple applications in polyploid genomics.

  14. HANDS: a tool for genome-wide discovery of subgenome-specific base-identity in polyploids.

    KAUST Repository

    Mithani, Aziz; Belfield, Eric J; Brown, Carly; Jiang, Caifu; Leach, Lindsey J; Harberd, Nicholas P

    2013-01-01

    The analysis of polyploid genomes is problematic because homeologous subgenome sequences are closely related. This relatedness makes it difficult to assign individual sequences to the specific subgenome from which they are derived, and hinders the development of polyploid whole genome assemblies. We here present a next-generation sequencing (NGS)-based approach for assignment of subgenome-specific base-identity at sites containing homeolog-specific polymorphisms (HSPs): 'HSP base Assignment using NGS data through Diploid Similarity' (HANDS). We show that HANDS correctly predicts subgenome-specific base-identity at >90% of assayed HSPs in the hexaploid bread wheat (Triticum aestivum) transcriptome, thus providing a substantial increase in accuracy versus previous methods for homeolog-specific base assignment. We conclude that HANDS enables rapid and accurate genome-wide discovery of homeolog-specific base-identity, a capability having multiple applications in polyploid genomics.
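
    As a simplified illustration of the diploid-similarity idea (not the published HANDS algorithm), the sketch below assigns the bases observed at one HSP site in a polyploid to subgenomes by matching them against the bases carried by hypothetical diploid relatives.

```python
def assign_hsp_bases(polyploid_bases, diploid_bases):
    """Assign each base observed at an HSP site to a subgenome.

    `polyploid_bases` are the distinct bases seen in the polyploid's reads at one
    site; `diploid_bases` maps subgenome name -> base carried by the corresponding
    diploid relative at the same position.  A base is assigned to a subgenome when
    exactly one diploid relative carries it; otherwise it stays unassigned.  A
    simplified illustration only.
    """
    assignments = {}
    for base in polyploid_bases:
        matches = [sub for sub, b in diploid_bases.items() if b == base]
        assignments[base] = matches[0] if len(matches) == 1 else None
    return assignments

# Hypothetical hexaploid wheat site: the A-genome relative differs from B and D
site_bases = ["C", "T"]
relatives = {"A_subgenome": "C", "B_subgenome": "T", "D_subgenome": "T"}
print(assign_hsp_bases(site_bases, relatives))
# {'C': 'A_subgenome', 'T': None}  -> T cannot be assigned uniquely at this site
```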

  15. Combinatorial Libraries As a Tool for the Discovery of Novel, Broad-Spectrum Antibacterial Agents Targeting the ESKAPE Pathogens.

    Science.gov (United States)

    Fleeman, Renee; LaVoi, Travis M; Santos, Radleigh G; Morales, Angela; Nefzi, Adel; Welmaker, Gregory S; Medina-Franco, José L; Giulianotti, Marc A; Houghten, Richard A; Shaw, Lindsey N

    2015-04-23

    Full Text Available Mixture-based synthetic combinatorial libraries offer a tremendous enhancement of the rate of drug discovery, allowing the activity of millions of compounds to be assessed through the testing of exponentially fewer samples. In this study, we used a scaffold-ranking library to screen 37 different libraries for antibacterial activity against the ESKAPE pathogens. Each library contained between 10000 and 750000 structural analogues, for a total of >6 million compounds. From this, we identified a bis-cyclic guanidine library that displayed strong antibacterial activity. A positional scanning library for these compounds was developed and used to identify the most effective functional groups at each variant position. Individual compounds were synthesized that were broadly active against all ESKAPE organisms at low concentrations, showed little propensity for the development of resistance, and displayed almost no toxicity when tested against human lung cells and erythrocytes. Using a murine model of peritonitis, we also demonstrate that these agents are highly efficacious in vivo.
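
    Deconvoluting a positional scanning library amounts to picking, for each variable position, the functional group whose fixed-position mixture is most active; the sketch below shows that selection step with hypothetical groups and activity values.

```python
def select_functional_groups(activity):
    """Pick the most active functional group at each variable position.

    `activity` maps (position, functional_group) -> measured activity of the
    positional-scanning mixture in which that position is fixed to that group
    (lower values = more potent, e.g. an IC50-like readout).  All names and
    numbers are hypothetical.
    """
    best = {}
    for (position, group), value in activity.items():
        if position not in best or value < best[position][1]:
            best[position] = (group, value)
    return {pos: group for pos, (group, _) in best.items()}

mixture_activity = {
    ("R1", "cyclohexyl"): 4.0, ("R1", "phenyl"): 12.0, ("R1", "methyl"): 30.0,
    ("R2", "butyl"): 25.0, ("R2", "naphthyl"): 6.5,
    ("R3", "benzyl"): 8.0, ("R3", "isopropyl"): 7.5,
}
print(select_functional_groups(mixture_activity))
# {'R1': 'cyclohexyl', 'R2': 'naphthyl', 'R3': 'isopropyl'} -> candidates for individual synthesis
```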

  16. 30 CFR 44.24 - Discovery.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Discovery. 44.24 Section 44.24 Mineral... Discovery. Parties shall be governed in their conduct of discovery by appropriate provisions of the Federal... discovery. Alternative periods of time for discovery may be prescribed by the presiding administrative law...

  17. Metabolomics as a Tool for Discovery of Biomarkers of Autism Spectrum Disorder in the Blood Plasma of Children

    Science.gov (United States)

    West, Paul R.; Amaral, David G.; Bais, Preeti; Smith, Alan M.; Egnash, Laura A.; Ross, Mark E.; Palmer, Jessica A.; Fontaine, Burr R.; Conard, Kevin R.; Corbett, Blythe A.; Cezar, Gabriela G.; Donley, Elizabeth L. R.; Burrier, Robert E.

    2014-01-01

    Background The diagnosis of autism spectrum disorder (ASD) at the earliest age possible is important for initiating optimally effective intervention. In the United States the average age of diagnosis is 4 years. Identifying metabolic biomarker signatures of ASD from blood samples offers an opportunity for the development of diagnostic tests for detection of ASD at an early age. Objectives To discover metabolic features present in plasma samples that can discriminate children with ASD from typically developing (TD) children. The ultimate goal is to identify and develop blood-based ASD biomarkers that can be validated in larger clinical trials and deployed to guide individualized therapy and treatment. Methods Blood plasma was obtained from children aged 4 to 6 years: 52 with ASD and 30 age-matched TD children. Samples were analyzed using 5 mass spectrometry-based methods designed to orthogonally measure a broad range of metabolites. Univariate, multivariate and machine learning methods were used to develop models to rank the importance of features that could distinguish ASD from TD. Results A set of 179 statistically significant features resulting from univariate analysis was used for multivariate modeling. Subsets of these features properly classified the ASD and TD samples in the 61-sample training set with average accuracies of 84% and 86%, and with a maximum accuracy of 81% in an independent 21-sample validation set. Conclusions This analysis of blood plasma metabolites resulted in the discovery of biomarkers that may be valuable in the diagnosis of young children with ASD. The results will form the basis for additional discovery and validation research for 1) determining biomarkers to develop diagnostic tests to detect ASD earlier and improve patient outcomes, 2) gaining new insight into the biochemical mechanisms of various subtypes of ASD, 3) identifying biomolecular targets for new modes of therapy, and 4) providing the basis for individualized treatment recommendations.
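
    A minimal sketch of the two-stage workflow described here (univariate screening of features followed by a multivariate classifier evaluated on a held-out set), using synthetic data and scipy/scikit-learn rather than the study's actual methods, is shown below.

```python
import numpy as np
from scipy.stats import mannwhitneyu
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical plasma metabolite intensities: rows = children, columns = metabolite features.
rng = np.random.default_rng(1)
X_train, y_train = rng.normal(size=(61, 300)), rng.integers(0, 2, size=61)   # ASD=1, TD=0
X_valid, y_valid = rng.normal(size=(21, 300)), rng.integers(0, 2, size=21)

# Step 1: univariate screen - keep features that differ between groups (here p < 0.05).
pvals = np.array([
    mannwhitneyu(X_train[y_train == 1, j], X_train[y_train == 0, j]).pvalue
    for j in range(X_train.shape[1])
])
selected = np.where(pvals < 0.05)[0]

# Step 2: multivariate model on the selected features, evaluated on held-out samples.
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X_train[:, selected], y_train)
print("selected features:", selected.size)
print("validation accuracy:", model.score(X_valid[:, selected], y_valid))
```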

  18. Metabolomics as a tool for discovery of biomarkers of autism spectrum disorder in the blood plasma of children.

    Directory of Open Access Journals (Sweden)

    Paul R West

    Full Text Available The diagnosis of autism spectrum disorder (ASD) at the earliest age possible is important for initiating optimally effective intervention. In the United States the average age of diagnosis is 4 years. Identifying metabolic biomarker signatures of ASD from blood samples offers an opportunity for the development of diagnostic tests for detection of ASD at an early age. To discover metabolic features present in plasma samples that can discriminate children with ASD from typically developing (TD) children. The ultimate goal is to identify and develop blood-based ASD biomarkers that can be validated in larger clinical trials and deployed to guide individualized therapy and treatment. Blood plasma was obtained from children aged 4 to 6 years: 52 with ASD and 30 age-matched TD children. Samples were analyzed using 5 mass spectrometry-based methods designed to orthogonally measure a broad range of metabolites. Univariate, multivariate and machine learning methods were used to develop models to rank the importance of features that could distinguish ASD from TD. A set of 179 statistically significant features resulting from univariate analysis was used for multivariate modeling. Subsets of these features properly classified the ASD and TD samples in the 61-sample training set with average accuracies of 84% and 86%, and with a maximum accuracy of 81% in an independent 21-sample validation set. This analysis of blood plasma metabolites resulted in the discovery of biomarkers that may be valuable in the diagnosis of young children with ASD. The results will form the basis for additional discovery and validation research for (1) determining biomarkers to develop diagnostic tests to detect ASD earlier and improve patient outcomes, (2) gaining new insight into the biochemical mechanisms of various subtypes of ASD, (3) identifying biomolecular targets for new modes of therapy, and (4) providing the basis for individualized treatment recommendations.

  19. DFAST and DAGA: web-based integrated genome annotation tools and resources.

    Science.gov (United States)

    Tanizawa, Yasuhiro; Fujisawa, Takatomo; Kaminuma, Eli; Nakamura, Yasukazu; Arita, Masanori

    2016-01-01

    Quality assurance and correct taxonomic affiliation of data submitted to public sequence databases have been an everlasting problem. The DDBJ Fast Annotation and Submission Tool (DFAST) is a newly developed genome annotation pipeline with quality and taxonomy assessment tools. To enable annotation of ready-to-submit quality, we also constructed curated reference protein databases tailored for lactic acid bacteria. DFAST was developed so that all the procedures required for DDBJ submission could be done seamlessly online. The online workspace would be especially useful for users not familiar with bioinformatics skills. In addition, we have developed a genome repository, DFAST Archive of Genome Annotation (DAGA), which currently includes 1,421 genomes covering 179 species and 18 subspecies of two genera, Lactobacillus and Pediococcus, obtained from both DDBJ/ENA/GenBank and the Sequence Read Archive (SRA). All the genomes deposited in DAGA were annotated consistently and assessed using DFAST. To assess the taxonomic position based on genomic sequence information, we used the average nucleotide identity (ANI), which showed high discriminative power to determine whether two given genomes belong to the same species. We corrected mislabeled or misidentified genomes in the public database and deposited the curated information in DAGA. The repository will improve the accessibility and reusability of genome resources for lactic acid bacteria. By exploiting the data deposited in DAGA, we found intraspecific subgroups in Lactobacillus gasseri and Lactobacillus jensenii, whose variation between subgroups is larger than the well-accepted ANI threshold of 95% to differentiate species. DFAST and DAGA are freely accessible at https://dfast.nig.ac.jp.
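
    As a simplified illustration of the ANI computation and the roughly 95% species boundary mentioned in the record, the sketch below averages the identities of filtered fragment alignments between two genomes; the hit values and cut-offs follow the commonly used recipe but are hypothetical here.

```python
def average_nucleotide_identity(hits, min_identity=30.0, min_coverage=0.7):
    """Compute ANI from pairwise fragment alignments (simplified).

    `hits` is a list of (percent_identity, alignment_coverage) tuples, one per
    genome fragment aligned against the other genome.  Only hits above the
    identity and coverage cut-offs contribute, and ANI is their mean identity.
    Values here are hypothetical.
    """
    kept = [ident for ident, cov in hits
            if ident >= min_identity and cov >= min_coverage]
    return sum(kept) / len(kept) if kept else 0.0

fragment_hits = [(98.7, 0.95), (97.9, 0.91), (99.2, 0.88), (45.0, 0.35), (96.8, 0.93)]
ani = average_nucleotide_identity(fragment_hits)
same_species = ani >= 95.0   # the widely used ~95% ANI species boundary cited in the record
print(round(ani, 2), same_species)
```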

  20. Innovations in resource efficiency and quality management as a tool for balanced development of flour mills

    Directory of Open Access Journals (Sweden)

    A. V. Bogomolov

    2017-01-01

    Full Text Available The article considers innovative tools used in flour mills and offers management solutions aimed at increasing the competitiveness of these enterprises. The relevance of the research topic stems from the need to expand the industry's market share through the production of domestic products based on effective quality management. Studies have shown that most mills of modern foreign flour producers are equipped to enrich the product they obtain. At present, enterprises of the Altai Territory, Kemerovo, Tomsk, Ryazan and Leningrad Regions, as well as Bashkiria and Tatarstan, carry out flour enrichment in Russia. Calculations confirm that flour fortification not only offers a number of advantages for human health but is also economically beneficial for industry enterprises. The introduction of concentrates into flour may lead to a slight increase in cost, but should not cause significant changes in the selling price of the final product. These technological changes will allow a flour mill to exploit new competitive opportunities. Improving product quality by restoring vitamin levels and increasing nutritional value will attract additional consumers. The author developed an economic-mathematical model for optimizing the structure of production and sales of products using modern components, based on the example of a flour-milling enterprise in the Belgorod region. The calculations made it possible to ensure the maximum economic effect while maintaining the existing system of distributing products through the established channels and keeping its value unchanged at the existing production volumes. It follows that innovations in resource efficiency and quality management systems are an effective tool for the balanced development of flour-milling enterprises.

  1. Assessing the role of learning devices and geovisualisation tools for collective action in natural resource management: Experiences from Vietnam.

    Science.gov (United States)

    Castella, Jean-Christophe

    2009-02-01

    In the northern Vietnam uplands, the successive policy reforms that accompanied agricultural decollectivisation triggered very rapid changes in land use in the 1990s. From a centralized system of natural resource management, a multitude of individual strategies emerged, which contributed to new production interactions among farming households, changes in landscape structures, and conflicting strategies among local stakeholders. Within this context of agrarian transition, learning devices can help local communities to collectively design their own course of action towards sustainable natural resource management. This paper presents a collaborative approach combining a number of participatory methods and geovisualisation tools (e.g., spatially explicit multi-agent models and role-playing games) with the shared goal of analysing and representing the interactions between: (i) decision-making processes by individual farmers based on the resource profiles of their farms; (ii) the institutions which regulate resource access and usage; and (iii) the biophysical and socioeconomic environment. This methodological pathway is illustrated by a case study in Bac Kan Province, where it successfully led to a communication platform on natural resource management. In a context of rapid socioeconomic changes, learning devices and geovisualisation tools helped embed the participatory approach within a process of community development. The combination of different tools, each with its own advantages and constraints, proved highly relevant for supporting collective natural resource management.

  2. New generation pharmacogenomic tools: a SNP linkage disequilibrium Map, validated SNP assay resource, and high-throughput instrumentation system for large-scale genetic studies.

    Science.gov (United States)

    De La Vega, Francisco M; Dailey, David; Ziegle, Janet; Williams, Julie; Madden, Dawn; Gilbert, Dennis A

    2002-06-01

    Since public and private efforts announced the first draft of the human genome last year, researchers have reported great numbers of single nucleotide polymorphisms (SNPs). We believe that the availability of well-mapped, quality SNP markers constitutes the gateway to a revolution in genetics and personalized medicine that will lead to better diagnosis and treatment of common complex disorders. A new generation of tools and public SNP resources for pharmacogenomic and genetic studies--specifically for candidate-gene, candidate-region, and whole-genome association studies--will form part of the new scientific landscape. This will only be possible through the greater accessibility of SNP resources and superior high-throughput instrumentation-assay systems that enable affordable, highly productive large-scale genetic studies. We are contributing to this effort by developing a high-quality linkage disequilibrium SNP marker map and an accompanying set of ready-to-use, validated SNP assays across every gene in the human genome. This effort incorporates both the public sequence and SNP data sources, and Celera Genomics' human genome assembly and enormous resource of physically mapped SNPs (approximately 4,000,000 unique records). This article discusses our approach and methodology for designing the map, choosing quality SNPs, designing and validating these assays, and obtaining the population frequency of the polymorphisms. We also discuss an advanced, high-performance SNP assay chemistry--a new generation of the TaqMan probe-based, 5' nuclease assay--and a high-throughput instrumentation-software system for large-scale genotyping. We provide the new SNP map and validation information, validated SNP assays and reagents, and instrumentation systems as a novel resource for genetic discoveries.
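
    A linkage disequilibrium map rests on pairwise LD statistics such as r²; the sketch below computes r² for two biallelic SNPs from phased haplotypes (hypothetical data), illustrating the quantity such a map summarizes rather than the authors' pipeline.

```python
def ld_r_squared(haplotypes):
    """Pairwise linkage disequilibrium (r^2) between two biallelic SNPs.

    `haplotypes` is a list of (allele_at_snp1, allele_at_snp2) pairs coded as 0/1,
    one per phased haplotype.  r^2 near 1 means one SNP tags the other, which is
    what an LD map exploits when choosing marker SNPs.
    """
    n = len(haplotypes)
    p1 = sum(a for a, _ in haplotypes) / n          # frequency of allele 1 at SNP 1
    p2 = sum(b for _, b in haplotypes) / n          # frequency of allele 1 at SNP 2
    p12 = sum(1 for a, b in haplotypes if a == 1 and b == 1) / n
    d = p12 - p1 * p2
    denom = p1 * (1 - p1) * p2 * (1 - p2)
    return d * d / denom if denom > 0 else 0.0

# Hypothetical phased haplotypes for two nearby SNPs
haps = [(1, 1), (1, 1), (0, 0), (0, 0), (1, 1), (0, 0), (1, 0), (0, 0)]
print(round(ld_r_squared(haps), 3))   # 0.6
```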

  3. A Critical Study of Effect of Web-Based Software Tools in Finding and Sharing Digital Resources--A Literature Review

    Science.gov (United States)

    Baig, Muntajeeb Ali

    2010-01-01

    The purpose of this paper is to review the effect of web-based software tools for finding and sharing digital resources. A positive correlation between learning and studying through online tools has been found in recent research. In the traditional classroom, searching for resources is limited to the library and sharing of resources is limited to the…

  4. Economic impacts of natural resources on a regional economy: the case of the pre-salt oil discoveries in Espirito Santo, Brazil

    Directory of Open Access Journals (Sweden)

    Eduardo Amaral Haddad

    2014-03-01

    Full Text Available The Brazilian government has recently confirmed the discovery of a huge oil and natural gas field in the pre-salt layer of the country’s southeastern coast. It has been said that the oil fields can boost Brazil’s oil production and turn the country into one of the largest oil producers in the world. The fields are spatially concentrated in the coastal areas of a few Brazilian states that may directly benefit from oil production. This paper uses an interregional computable general equilibrium model to assess the impacts of pre-salt on the economy of the State of Espírito Santo, a region already characterized by an economic base that is heavily reliant on natural resources. We focus our analysis on the structural economic impacts on the local economy.

  5. INIS-based Japanese literature materials of bibliographic tools for human resource development

    International Nuclear Information System (INIS)

    Kunii, Katsuhiko; Gonda, Mayuki; Ikeda, Kiyoshi; Nagaya, Shun; Itabashi, Keizo; Nakajima, Hidemitsu; Mineo, Yukinobu

    2011-01-01

    The Library of the Japan Atomic Energy Agency (JAEA) has developed two Japanese-language bibliographic tools based on the International Nuclear Information System (INIS) of the IAEA, which contains over 3.3 million records from 127 countries and 24 international organizations. These materials were compiled by matching Japanese terminology in the nuclear field with the corresponding English terminology and vice versa. One is a 'Transliterated Japanese journal title list' and the other is an 'INIS Thesaurus in Japanese'. The former serves as a reference that enables users to find articles in Japanese journals that better match their needs, while the latter serves as a dictionary that bridges the terminological gap in the nuclear field between over 30,000 English terms and their Japanese counterparts in a semantic manner. Applying these materials to the INIS full-text collection of over 280,000 technical reports, proceedings, etc., as an archive is helpful for enhancing human resource development. The authors describe the effectiveness of these INIS-based materials with bibliographic references on the Fukushima Daiichi NPS accident. (author)

  6. Tools and methods for integrated resource planning. Improving energy efficiency and protecting the environment

    International Nuclear Information System (INIS)

    Swisher, J.N.; Martino Jannuzzi, G. de; Redlinger, R.Y.

    1997-01-01

    This book resulted from our recognition of the need to have systematic teaching and training materials on energy efficiency, end-use analysis, demand-side management (DSM) and integrated resource planning (IRP). This book addresses energy efficiency programs and IRP, exploring their application in the electricity sector. We believe that these methods will provide powerful and practical tools for designing efficient and environmentally-sustainable energy supply and demand-side programs to minimize the economic, environmental and other social costs of electricity conversion and use. Moreover, the principles of IRP can be and already are being applied in other areas such as natural gas, water supply, and even transportation and health services. Public authorities can use IRP principles to design programs to encourage end-use efficiency and environmental protection through environmental charges and incentives, non-utility programs, and utility programs applied to the functions remaining in monopoly concessions such as the distribution wires. Competitive supply firms can use IRP principles to satisfy customer needs for efficiency and low prices, to comply with present and future environmental restrictions, and to optimize supply and demand-side investments and returns, particularly at the distribution level, where local-area IRP is now being actively practiced. Finally, in those countries where a strong planning function remains in place, IRP provides a way to integrate end-use efficiency and environmental protection into energy development. (EG) 181 refs

  7. Tools and methods for integrated resource planning. Improving energy efficiency and protecting the environment

    Energy Technology Data Exchange (ETDEWEB)

    Swisher, J N; Martino Jannuzzi, G de; Redlinger, R Y

    1997-11-01

    This book resulted from our recognition of the need to have systematic teaching and training materials on energy efficiency, end-use analysis, demand-side management (DSM) and integrated resource planning (IRP). This book addresses energy efficiency programs and IRP, exploring their application in the electricity sector. We believe that these methods will provide powerful and practical tools for designing efficient and environmentally-sustainable energy supply and demand-side programs to minimize the economic, environmental and other social costs of electricity conversion and use. Moreover, the principles of IRP can be and already are being applied in other areas such as natural gas, water supply, and even transportation and health services. Public authorities can use IRP principles to design programs to encourage end-use efficiency and environmental protection through environmental charges and incentives, non-utility programs, and utility programs applied to the functions remaining in monopoly concessions such as the distribution wires. Competitive supply firms can use IRP principles to satisfy customer needs for efficiency and low prices, to comply with present and future environmental restrictions, and to optimize supply and demand-side investments and returns, particularly at the distribution level, where local-area IRP is now being actively practiced. Finally, in those countries where a strong planning function remains in place, IRP provides a way to integrate end-use efficiency and environmental protection into energy development. (EG) 181 refs.

  8. An Information Literacy Course for Doctoral Students: Information Resources and Tools for Research

    Directory of Open Access Journals (Sweden)

    Ann-Louise Paasio

    2015-12-01

    Full Text Available The purpose of this paper is to showcase the information literacy course for doctoral students called Information Resources and Tools for Research. Turku University Library organises this course in collaboration with the University of Turku Graduate School. The course, which was started in 2012, has been organised four times so far, twice in English and twice in Finnish. The course offers training to doctoral candidates in all doctoral programmes across the seven disciplines present at the University of Turku. In our presentation we will describe the structure and contents of the course and share our experiences of the collaboration with the University of Turku Graduate School. In addition, we will describe how the information specialists of the Turku University Library have collaborated during the course. We will also discuss the challenges of the course. Based on the course feedback, it can be stated that, in general, participants have found this course very useful for their research at the University of Turku.

  9. A contig-based strategy for the genome-wide discovery of microRNAs without complete genome resources.

    Directory of Open Access Journals (Sweden)

    Jun-Zhi Wen

    Full Text Available MicroRNAs (miRNAs) are important regulators of many cellular processes and exist in a wide range of eukaryotes. High-throughput sequencing is a mainstream method of miRNA identification through which it is possible to obtain the complete small RNA profile of an organism. Currently, most approaches to miRNA identification rely on a reference genome for the prediction of hairpin structures. However, many species of economic and phylogenetic importance are non-model organisms without complete genome sequences, and this limits miRNA discovery. Here, to overcome this limitation, we have developed a contig-based miRNA identification strategy. We applied this method to a triploid species of edible banana (GCTCV-119, Musa spp. AAA group) and identified 180 pre-miRNAs and 314 mature miRNAs, which is three times more than were predicted by the available dataset-based methods (represented by EST+GSS). Based on the recently published miRNA data set of Musa acuminata, the recall rate and precision of our strategy are estimated to be 70.6% and 92.2%, respectively, significantly better than those of the EST+GSS-based strategy (10.2% and 50.0%, respectively). Our novel, efficient and cost-effective strategy facilitates the study of the functional and evolutionary role of miRNAs, as well as miRNA-based molecular breeding, in non-model species of economic or evolutionary interest.
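
    The recall and precision figures quoted above follow directly from comparing a predicted miRNA set with a published reference set, as in the sketch below (identifiers hypothetical).

```python
def recall_precision(predicted, reference):
    """Recall and precision of a predicted miRNA set against a reference set,
    the two measures used in the record to compare identification strategies."""
    predicted, reference = set(predicted), set(reference)
    true_positives = len(predicted & reference)
    recall = true_positives / len(reference) if reference else 0.0
    precision = true_positives / len(predicted) if predicted else 0.0
    return recall, precision

# Hypothetical identifiers of predicted vs. published Musa miRNAs
predicted = {"miR156", "miR159", "miR166", "miR2118", "novel_01"}
reference = {"miR156", "miR159", "miR166", "miR319", "miR396"}
print(recall_precision(predicted, reference))   # (0.6, 0.6)
```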

  10. Establishing the A. E. Watkins landrace cultivar collection as a resource for systematic gene discovery in bread wheat.

    Science.gov (United States)

    Wingen, Luzie U; Orford, Simon; Goram, Richard; Leverington-Waite, Michelle; Bilham, Lorelei; Patsiou, Theofania S; Ambrose, Mike; Dicks, Jo; Griffiths, Simon

    2014-08-01

    A high level of genetic diversity was found in the A. E. Watkins bread wheat landrace collection. Genotypic information was used to determine the population structure and to develop germplasm resources. In the 1930s, A. E. Watkins acquired landrace cultivars of bread wheat (Triticum aestivum L.) from official channels of the Board of Trade in London, many of which originated from local markets in 32 countries. The geographic distribution of the 826 landrace cultivars of the current collection, here called the Watkins collection, covers many Asian and European countries and some from Africa. The cultivars were genotyped with 41 microsatellite markers in order to investigate the genetic diversity and population structure of the collection. A high level of genetic diversity was found, higher than in a collection of modern European winter bread wheat varieties from 1945 to 2000. Furthermore, although weak, the population structure of the Watkins collection reveals nine ancestral geographical groupings. An exchange of genetic material between ancestral groups before commercial wheat-breeding started would be a possible explanation for this. The increased knowledge regarding the diversity of the Watkins collection was used to develop resources for wheat research and breeding, one of them a core set, which captures the majority of the genetic diversity detected. The understanding of genetic diversity and population structure together with the availability of breeding resources should help to accelerate the detection of new alleles in the Watkins collection.

  11. Open Educational Resources in Support of Science Learning: Tools for Inquiry and Observation

    Science.gov (United States)

    Scanlon, Eileen

    2012-01-01

    This article focuses on the potential of free tools, particularly inquiry tools for influencing participation in twenty-first-century learning in science, as well as influencing the development of communities around tools. Two examples are presented: one on the development of an open source tool for structured inquiry learning that can bridge the…

  12. "Time for Some Traffic Problems": Enhancing E-Discovery and Big Data Processing Tools with Linguistic Methods for Deception Detection

    Directory of Open Access Journals (Sweden)

    Erin Smith Crabb

    2014-09-01

    Full Text Available Linguistic deception theory provides methods to discover potentially deceptive texts to make them accessible to clerical review. This paper proposes the integration of these linguistic methods with traditional e-discovery techniques to identify deceptive texts within a given author’s larger body of written work, such as their sent email box. First, a set of linguistic features associated with deception are identified and a prototype classifier is constructed to analyze texts and describe the features’ distributions, while avoiding topic-specific features to improve recall of relevant documents. The tool is then applied to a portion of the Enron Email Dataset to illustrate how these strategies identify records, providing an example of its advantages and capability to stratify the large data set at hand.
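    The record does not specify the classifier. As a rough illustration of the general approach it describes, scoring topic-independent linguistic cues over each text so that high-scoring records can be prioritized for review, here is a sketch under those assumptions (the feature patterns and weights are invented):

```python
# Illustrative only: counts a few topic-independent cues often cited in
# deception research (self-references, negations, hedges) and scores each
# text with a simple weighted sum. The feature set and weights are
# assumptions, not those of the cited study.
import re

FEATURES = {
    "first_person": (r"\b(i|me|my|mine)\b", -1.0),   # deceptive texts: fewer self-references
    "negations":    (r"\b(no|not|never)\b", 0.5),
    "hedges":       (r"\b(maybe|perhaps|possibly)\b", 0.5),
}

def deception_score(text: str) -> float:
    words = max(len(text.split()), 1)
    score = 0.0
    for pattern, weight in FEATURES.values():
        rate = len(re.findall(pattern, text.lower())) / words
        score += weight * rate
    return score

emails = ["I never said that, maybe someone else did.", "My report is attached."]
for text in sorted(emails, key=deception_score, reverse=True):
    print(f"{deception_score(text):+.3f}  {text}")  # highest-scoring texts surface first
```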

  13. Remote sensing change detection tools for natural resource managers: Understanding concepts and tradeoffs in the design of landscape monitoring projects

    Science.gov (United States)

    Robert E. Kennedy; Philip A. Townsend; John E. Gross; Warren B. Cohen; Paul Bolstad; Wang Y. Q.; Phyllis Adams

    2009-01-01

    Remote sensing provides a broad view of landscapes and can be consistent through time, making it an important tool for monitoring and managing protected areas. An impediment to broader use of remote sensing science for monitoring has been the need for resource managers to understand the specialized capabilities of an ever-expanding array of image sources and analysis...

  14. Availability of EPA Tools and Resources to Increase Awareness of the Cardiovascular Health Effects of Air Pollution

    Science.gov (United States)

    On November 14, 2017 Dr. Wayne Cascio, Acting Director will present a webinar titled, “Availability of EPA Tools and Resources to Increase Awareness of the Cardiovascular Health Effects of Air Pollution” to HHS’ Million Hearts Federal Partner’s Monthly Cal...

  15. The Evolution of Discovery Systems in Academic Libraries: A Case Study at the University of Houston Libraries

    Science.gov (United States)

    Guajardo, Richard; Brett, Kelsey; Young, Frederick

    2017-01-01

    For the past several years academic libraries have been adopting discovery systems to provide a search experience that reflects user expectations and improves access to electronic resources. University of Houston Libraries has kept pace with this evolving trend by pursuing various discovery options; these include an open-source tool, a federated…

  16. miRDis: a Web tool for endogenous and exogenous microRNA discovery based on deep-sequencing data analysis.

    Science.gov (United States)

    Zhang, Hanyuan; Vieira Resende E Silva, Bruno; Cui, Juan

    2018-05-01

    Small RNA sequencing is the most widely used tool for microRNA (miRNA) discovery, and shows great potential for the efficient study of miRNA cross-species transport, i.e., by detecting the presence of exogenous miRNA sequences in the host species. Because of the increased appreciation of dietary miRNAs and their far-reaching implications in human health, research interest is currently growing with regard to exogenous miRNA bioavailability, mechanisms of cross-species transport and miRNA function in cellular biological processes. In this article, we present microRNA Discovery (miRDis), a new small RNA sequencing data analysis pipeline for both endogenous and exogenous miRNA detection. Specifically, we developed and deployed a Web service that supports the annotation and expression profiling of known host miRNAs and the detection of novel miRNAs, other noncoding RNAs, and exogenous miRNAs from dietary species. As a proof-of-concept, we analyzed a set of human plasma sequencing data from a milk-feeding study where 225 human miRNAs were detected in the plasma samples and 44 showed elevated expression after milk intake. By examining the bovine-specific sequences, the data indicate that three bovine miRNAs (bta-miR-378, -181* and -150) are present in human plasma, possibly because of dietary uptake. Further evaluation based on different sets of public data demonstrates that miRDis outperforms other state-of-the-art tools in both detection and quantification of miRNA from either animal or plant sources. The miRDis Web server is available at: http://sbbi.unl.edu/miRDis/index.php.
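    The exogenous-detection idea can be illustrated by assigning small-RNA reads to host or dietary mature miRNA sequences. The sketch below is only a schematic of that step; the sequences, names and exact-match rule are placeholders, and miRDis itself uses a full alignment and annotation pipeline:

```python
# Schematic of exogenous miRNA detection: assign each small-RNA read to a
# host (human) or dietary (bovine) mature miRNA by exact sequence match.
# Sequences and names are placeholders.
host_mirnas = {"UGAGGUAGUAGGUUGUAUAGUU": "hsa-let-7a"}
dietary_mirnas = {"ACUGGACUUGGAGUCAGAAGGC": "bta-miR-378"}

def classify_read(read: str) -> str:
    if read in host_mirnas:
        return f"endogenous:{host_mirnas[read]}"
    if read in dietary_mirnas:
        return f"exogenous:{dietary_mirnas[read]}"
    return "unannotated"

for read in ["ACUGGACUUGGAGUCAGAAGGC", "UGAGGUAGUAGGUUGUAUAGUU", "AAAAACCCCC"]:
    print(read, "->", classify_read(read))
```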

  17. The urban harvest approach as framework and planning tool for improved water and resource cycles

    NARCIS (Netherlands)

    Leusbrock, I.; Nanninga, T.A.; Lieberg, K.; Agudelo, C.; Keesman, K.J.; Zeeman, G.; Rijnaarts, H.

    2015-01-01

    Water and resource availability in sufficient quantity and quality for anthropogenic needs represents one of the main challenges in the coming decades. To prepare for upcoming challenges such as increased urbanization and climate change related consequences, innovative and improved resource

  18. The Maryland Coastal Plain Aquifer Information System: A GIS-based tool for assessing groundwater resources

    Science.gov (United States)

    Andreasen, David C.; Nardi, Mark R.; Staley, Andrew W.; Achmad, Grufron; Grace, John W.

    2016-01-01

    Groundwater is the source of drinking water for ∼1.4 million people in the Coastal Plain Province of Maryland (USA). In addition, groundwater is essential for commercial, industrial, and agricultural uses. Approximately 0.757 × 10⁹ L d⁻¹ (200 million gallons/d) were withdrawn in 2010. As a result of decades of withdrawals from the coastal plain confined aquifers, groundwater levels have declined by as much as 70 m (230 ft) from estimated prepumping levels. Other issues posing challenges to long-term groundwater sustainability include degraded water quality from both man-made and natural sources, reduced stream base flow, land subsidence, and changing recharge patterns (drought) caused by climate change. In Maryland, groundwater supply is managed primarily by the Maryland Department of the Environment, which seeks to balance reasonable use of the resource with long-term sustainability. The chief goal of groundwater management in Maryland is to ensure safe and adequate supplies for all current and future users through the implementation of appropriate usage, planning, and conservation policies. To assist in that effort, the geographic information system (GIS)–based Maryland Coastal Plain Aquifer Information System was developed as a tool to help water managers access and visualize groundwater data for use in the evaluation of groundwater allocation and use permits. The system, contained within an ESRI ArcMap desktop environment, includes both interpreted and basic data for 16 aquifers and 14 confining units. Data map layers include aquifer and confining unit layer surfaces, aquifer extents, borehole information, hydraulic properties, time-series groundwater-level data, well records, and geophysical and lithologic logs. The aquifer and confining unit layer surfaces were generated specifically for the GIS system. The system also contains select groundwater-quality data and map layers that quantify groundwater and surface-water withdrawals. The aquifer

  19. Material flow cost accounting as a tool for improved resource efficiency in the hotel sector: A case of emerging market

    Directory of Open Access Journals (Sweden)

    Celani John Nyide

    2016-12-01

    Full Text Available Material Flow Cost Accounting (MFCA) is one of the Environmental Management Accounting (EMA) tools developed to enable environmentally and economically efficient material usage and thus improve resource efficiency. However, the use of this tool to improve resource efficiency in the South African hotel sector remains unknown. An exploratory study, qualitative in nature, was conducted using a single case study with an embedded units approach. A Hotel Management Group that met the selection criteria formed part of this study. In-depth interviews were conducted with 10 participants and additional documents were analysed. The investigated hotels have developed technologies that provide an environmental account in both physical and monetary units, which constitutes the use of MFCA to improve resource efficiency. However, the study also established a number of factors that affect the implementation of MFCA by the hotel sector in the South African context.
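    The core move of MFCA, assigning input costs to material losses in proportion to physical flows so that waste appears in monetary as well as physical units, can be shown with a toy calculation following the general ISO 14051 logic (all figures are invented; the hotel study's own data are not reproduced here):

```python
# Toy MFCA allocation: split input costs between the "positive product"
# (e.g. meals served) and "material losses" (e.g. food waste) in proportion
# to physical quantities, following the general ISO 14051 logic.
# All numbers are invented for illustration.
input_mass_kg = 100.0          # food purchased
loss_mass_kg = 20.0            # food wasted
costs = {"material": 500.0, "energy": 80.0, "system": 120.0}  # currency units

loss_share = loss_mass_kg / input_mass_kg
material_loss_cost = sum(costs.values()) * loss_share
print(f"Cost attributed to material losses: {material_loss_cost:.2f}")  # 140.00
```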

  20. Resource Planning in Glaucoma: A Tool to Evaluate Glaucoma Service Capacity.

    Science.gov (United States)

    Batra, Ruchika; Sharma, Hannah E; Elaraoud, Ibrahim; Mohamed, Shabbir

    2017-12-28

    The National Patient Safety Agency (2009) publication advising timely follow-up of patients with established glaucoma followed several reported instances of visual loss due to postponed appointments and patients lost to follow-up. The Royal College of Ophthalmologists Quality Standards Development Group stated that all hospital appointments should occur within 15% of the intended follow-up period. The aims were to determine whether: (1) glaucoma follow-up appointments at a teaching hospital occur within the requested time; (2) appointments are requested at appropriate intervals based on the NICE guidelines; and (3) the capacity of the glaucoma service is adequate. A two-part audit was undertaken of 98 and 99 consecutive patients, respectively, attending specialist glaucoma clinics. In the first part, the reasons for delayed appointments were recorded. In the second part, the requested follow-up was compared with NICE guidelines where applicable. Based on the findings, changes were implemented and a re-audit of 100 patients was carried out. The initial audit found that although clinical decisions regarding follow-up intervals were 100% compliant with NICE guidelines where applicable, 24% of appointments were delayed beyond 15% of the requested period, due to administrative errors and inadequate capacity, leading to significant clinical deterioration in two patients. Following the introduction of an electronic appointment tracker and increased clinical capacity created by extra clinics and clinicians, the re-audit found a marked decrease in the percentage of appointments being delayed (9%). This audit is a useful tool to evaluate glaucoma service provision, assist in resource planning for the service and bring about change in a non-confrontational way. It can be widely applied and adapted for use in other medical specialities.
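    The 15% timeliness standard reduces to a simple comparison between the requested and actual follow-up intervals. A hedged sketch of that check (the function and argument names are assumptions, not the audit's actual tooling):

```python
from datetime import date

def is_delayed(requested_weeks: float, booked_on: date, seen_on: date) -> bool:
    """Flag an appointment as delayed if the actual interval exceeds the
    requested follow-up interval by more than 15% (the standard described
    above). Argument names are illustrative."""
    actual_weeks = (seen_on - booked_on).days / 7
    return actual_weeks > 1.15 * requested_weeks

# A 26-week (6-month) review actually seen after ~32 weeks breaches the standard.
print(is_delayed(26, date(2017, 1, 2), date(2017, 8, 14)))  # True
```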

  1. Global resource sharing

    CERN Document Server

    Frederiksen, Linda; Nance, Heidi

    2011-01-01

    Written from a global perspective, this book reviews sharing of library resources on a global scale. With expanded discovery tools and massive digitization projects, the rich and extensive holdings of the world's libraries are more visible now than at any time in the past. Advanced communication and transmission technologies, along with improved international standards, present a means for the sharing of library resources around the globe. Despite these significant improvements, a number of challenges remain. Global Resource Sharing provides librarians and library managers with a comprehensive

  2. Project Final Report: Ubiquitous Computing and Monitoring System (UCoMS) for Discovery and Management of Energy Resources

    Energy Technology Data Exchange (ETDEWEB)

    Tzeng, Nian-Feng; White, Christopher D.; Moreman, Douglas

    2012-07-14

    The UCoMS research cluster has spearheaded three research areas since August 2004, including wireless and sensor networks, Grid computing, and petroleum applications. The primary goals of UCoMS research are three-fold: (1) creating new knowledge to push forward the technology forefront of research on the computing and monitoring aspects of energy resource management, (2) developing and disseminating software codes and toolkits for the research community and the public, and (3) establishing system prototypes and testbeds for evaluating innovative techniques and methods. Substantial progress and diverse accomplishments have been made by research investigators, working cooperatively in their respective areas of expertise, on such topics as sensors and sensor networks, wireless communication and systems, and computational Grids, particularly as relevant to petroleum applications.

  3. Quality tools and resources to support organisational improvement integral to high-quality primary care: a systematic review of published and grey literature.

    Science.gov (United States)

    Janamian, Tina; Upham, Susan J; Crossland, Lisa; Jackson, Claire L

    2016-04-18

    To conduct a systematic review of the literature to identify existing online primary care quality improvement tools and resources to support organisational improvement related to the seven elements in the Primary Care Practice Improvement Tool (PC-PIT), with the identified tools and resources to progress to a Delphi study for further assessment of relevance and utility. Systematic review of the international published and grey literature. CINAHL, Embase and PubMed databases were searched in March 2014 for articles published between January 2004 and December 2013. GreyNet International and other relevant websites and repositories were also searched in March-April 2014 for documents dated between 1992 and 2012. All citations were imported into a bibliographic database. Published and unpublished tools and resources were included in the review if they were in English, related to primary care quality improvement and addressed any of the seven PC-PIT elements of a high-performing practice. Tools and resources that met the eligibility criteria were then evaluated for their accessibility, relevance, utility and comprehensiveness using a four-criteria appraisal framework. We used a data extraction template to systematically extract information from eligible tools and resources. A content analysis approach was used to explore the tools and resources and collate relevant information: name of the tool or resource, year and country of development, author, name of the organisation that provided access and its URL, accessibility information or problems, overview of each tool or resource and the quality improvement element(s) it addresses. If available, a copy of the tool or resource was downloaded into the bibliographic database, along with supporting evidence (published or unpublished) on its use in primary care. This systematic review identified 53 tools and resources that can potentially be provided as part of a suite of tools and resources to support primary care practices in

  4. The Second Victim Experience and Support Tool: Validation of an Organizational Resource for Assessing Second Victim Effects and the Quality of Support Resources.

    Science.gov (United States)

    Burlison, Jonathan D; Scott, Susan D; Browne, Emily K; Thompson, Sierra G; Hoffman, James M

    2017-06-01

    Medical errors and unanticipated negative patient outcomes can damage the well-being of health care providers. These affected individuals, referred to as "second victims," can experience various psychological and physical symptoms. Support resources provided by health care organizations to prevent and reduce second victim-related harm are often inadequate. In this study, we present the development and psychometric evaluation of the Second Victim Experience and Support Tool (SVEST), a survey instrument that can assist health care organizations to implement and track the performance of second victim support resources. The SVEST (29 items representing 7 dimensions and 2 outcome variables) was completed by 303 health care providers involved in direct patient care. The survey collected responses on second victim-related psychological and physical symptoms and the quality of support resources. Desirability of possible support resources was also measured. The SVEST was assessed for content validity, internal consistency, and construct validity with confirmatory factor analysis. Confirmatory factor analysis results suggested good model fit for the survey. Cronbach α reliability scores for the survey dimensions ranged from 0.61 to 0.89. The most desired second victim support option was "A respected peer to discuss the details of what happened." The SVEST can be used by health care organizations to evaluate second victim experiences of their staff and the quality of existing support resources. It can also provide health care organization leaders with information on second victim-related support resources most preferred by their staff. The SVEST can be administered before and after implementing new second victim resources to measure perceptions of effectiveness.
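    The reliability figures quoted above (Cronbach α between 0.61 and 0.89) can be reproduced for any survey dimension from its item-level responses using the standard formula. A minimal sketch with toy data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) matrix of Likert scores:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Toy data: 5 respondents answering a 4-item dimension on a 1-5 scale.
scores = np.array([[4, 4, 5, 4],
                   [2, 3, 2, 2],
                   [5, 5, 4, 5],
                   [3, 3, 3, 4],
                   [1, 2, 2, 1]])
print(round(cronbach_alpha(scores), 2))  # ~0.96 for this toy matrix
```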

  5. Metrics and tools for consistent cohort discovery and financial analyses post-transition to ICD-10-CM.

    Science.gov (United States)

    Boyd, Andrew D; Li, Jianrong John; Kenost, Colleen; Joese, Binoy; Yang, Young Min; Kalagidis, Olympia A; Zenku, Ilir; Saner, Donald; Bahroos, Neil; Lussier, Yves A

    2015-05-01

    In the United States, International Classification of Disease Clinical Modification (ICD-9-CM, the ninth revision) diagnosis codes are commonly used to identify patient cohorts and to conduct financial analyses related to disease. In October 2015, the healthcare system of the United States will transition to ICD-10-CM (the tenth revision) diagnosis codes. One challenge posed to clinical researchers and other analysts is conducting diagnosis-related queries across datasets containing both coding schemes. Further, healthcare administrators will manage growth, trends, and strategic planning with these dually-coded datasets. The majority of the ICD-9-CM to ICD-10-CM translations are complex and nonreciprocal, creating convoluted representations and meanings. Similarly, mapping back from ICD-10-CM to ICD-9-CM is equally complex, yet different from mapping forward, as relationships are likewise nonreciprocal. Indeed, 10 of the 21 top clinical categories are complex as 78% of their diagnosis codes are labeled as "convoluted" by our analyses. Analysis and research related to external causes of morbidity, injury, and poisoning will face the greatest challenges due to 41 745 (90%) convolutions and a decrease in the number of codes. We created a web portal tool and translation tables to list all ICD-9-CM diagnosis codes related to the specific input of ICD-10-CM diagnosis codes and their level of complexity: "identity" (reciprocal), "class-to-subclass," "subclass-to-class," "convoluted," or "no mapping." These tools provide guidance on ambiguous and complex translations to reveal where reports or analyses may be challenging to impossible. Web portal: http://www.lussierlab.org/transition-to-ICD9CM/ Tables annotated with levels of translation complexity: http://www.lussierlab.org/publications/ICD10to9. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.
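    As a rough illustration of the complexity labels described above, forward and backward mappings can be bucketed by checking whether they are reciprocal. The sketch below uses invented code pairs rather than the official General Equivalence Mappings and omits the subclass-to-class case for brevity:

```python
# Bucketing forward (ICD-9 -> ICD-10) mappings by whether the backward
# mapping is reciprocal. Code pairs are invented placeholders, not the
# official General Equivalence Mappings.
from collections import defaultdict

forward = {"250.00": {"E11.9"}, "410.9": {"I21.3", "I21.9"}}
backward = defaultdict(set, {"E11.9": {"250.00"},
                             "I21.3": {"410.9", "410.8"},
                             "I21.9": {"410.9"}})

def complexity(icd9: str) -> str:
    targets = forward.get(icd9, set())
    if not targets:
        return "no mapping"
    back = set().union(*(backward[t] for t in targets))
    if len(targets) == 1 and back == {icd9}:
        return "identity"           # reciprocal one-to-one translation
    if back == {icd9}:
        return "class-to-subclass"  # one ICD-9 code splits into several ICD-10 codes
    return "convoluted"             # backward mapping entangles other source codes

for code in forward:
    print(code, "->", complexity(code))
```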

  6. Genomic resources for gene discovery, functional genome annotation, and evolutionary studies of maize and its close relatives.

    Science.gov (United States)

    Wang, Chao; Shi, Xue; Liu, Lin; Li, Haiyan; Ammiraju, Jetty S S; Kudrna, David A; Xiong, Wentao; Wang, Hao; Dai, Zhaozhao; Zheng, Yonglian; Lai, Jinsheng; Jin, Weiwei; Messing, Joachim; Bennetzen, Jeffrey L; Wing, Rod A; Luo, Meizhong

    2013-11-01

    Maize is one of the most important food crops and a key model for genetics and developmental biology. A genetically anchored and high-quality draft genome sequence of maize inbred B73 has been obtained to serve as a reference sequence. To facilitate evolutionary studies in maize and its close relatives, much like the Oryza Map Alignment Project (OMAP) (www.OMAP.org) bacterial artificial chromosome (BAC) resource did for the rice community, we constructed BAC libraries for maize inbred lines Zheng58, Chang7-2, and Mo17 and maize wild relatives Zea mays ssp. parviglumis and Tripsacum dactyloides. Furthermore, to extend functional genomic studies to maize and sorghum, we also constructed binary BAC (BIBAC) libraries for the maize inbred B73 and the sorghum landrace Nengsi-1. The BAC/BIBAC vectors facilitate transfer of large intact DNA inserts from BAC clones to the BIBAC vector and functional complementation of large DNA fragments. These seven Zea Map Alignment Project (ZMAP) BAC/BIBAC libraries have average insert sizes ranging from 92 to 148 kb, organellar DNA from 0.17 to 2.3%, empty vector rates between 0.35 and 5.56%, and genome equivalents of 4.7- to 8.4-fold. The usefulness of the Parviglumis and Tripsacum BAC libraries was demonstrated by mapping clones to the reference genome. Novel genes and alleles present in these ZMAP libraries can now be used for functional complementation studies and positional or homology-based cloning of genes for translational genomics.

  7. Tools and data services registry

    DEFF Research Database (Denmark)

    Ison, Jon; Rapacki, Kristoffer; Ménager, Hervé

    2016-01-01

    Life sciences are yielding huge data sets that underpin scientific discoveries fundamental to improvement in human health, agriculture and the environment. In support of these discoveries, a plethora of databases and tools are deployed, in technically complex and diverse implementations, across a spectrum of scientific disciplines. The corpus of documentation of these resources is fragmented across the Web, with much redundancy, and has lacked a common standard of information. The outcome is that scientists must often struggle to find, understand, compare and use the best resources for the task...

  8. The EIPeptiDi tool: enhancing peptide discovery in ICAT-based LC MS/MS experiments

    Directory of Open Access Journals (Sweden)

    Tradigo Giuseppe

    2007-07-01

    Full Text Available Abstract Background: Isotope-coded affinity tags (ICAT) is a method for quantitative proteomics based on differential isotopic labeling, sample digestion and mass spectrometry (MS). The method allows the identification and relative quantification of proteins present in two samples and consists of the following phases. First, cysteine residues are labeled using either the ICAT Light or ICAT Heavy reagent (having identical chemical properties but different masses). Then, after whole-sample digestion, the labeled peptides are captured selectively using the biotin tag contained in both ICAT reagents. Finally, the simplified peptide mixture is analyzed by nanoscale liquid chromatography-tandem mass spectrometry (LC-MS/MS). Nevertheless, the ICAT LC-MS/MS method still suffers from insufficient sample-to-sample reproducibility in peptide identification. In particular, the number and the type of peptides identified in different experiments can vary considerably and, thus, the statistical (comparative) analysis of sample sets is very challenging. Low information overlap at the peptide and, consequently, at the protein level is very detrimental in situations where the number of samples to be analyzed is high. Results: We designed a method for improving the data processing and peptide identification in sample sets subjected to ICAT labeling and LC-MS/MS analysis, based on cross-validating MS/MS results. This method has been implemented in a tool, called EIPeptiDi, which boosts the ICAT data analysis software, improving peptide identification throughout the input data set. Heavy/Light (H/L) pairs quantified but not identified by the MS/MS routine are assigned to peptide sequences identified in other samples, using similarity criteria based on chromatographic retention time and Heavy/Light mass attributes. EIPeptiDi significantly improves the number of identified peptides per sample, proving that the proposed method has a considerable impact on the protein
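    The cross-sample assignment step described above, matching quantified but unidentified H/L pairs to peptides identified in other runs, rests on retention-time and mass windows. A hedged sketch of such a matching rule (tolerances, field names and peptide values are illustrative, not EIPeptiDi's actual parameters):

```python
# Matching an unidentified Heavy/Light pair to peptides identified in other
# ICAT runs, using mass and retention-time windows. All values are placeholders.
def match_pair(pair_mass, pair_rt, identified, mass_tol=0.05, rt_tol=1.0):
    """Return peptide sequences whose mass (Da) and retention time (min)
    fall within the given tolerances of the unidentified pair."""
    return [
        pep["sequence"]
        for pep in identified
        if abs(pep["mass"] - pair_mass) <= mass_tol
        and abs(pep["rt"] - pair_rt) <= rt_tol
    ]

identified_elsewhere = [
    {"sequence": "LVNEVTEFAK", "mass": 1148.61, "rt": 23.4},
    {"sequence": "AEFVEVTK",   "mass": 921.49,  "rt": 31.0},
]
print(match_pair(1148.60, 23.9, identified_elsewhere))  # ['LVNEVTEFAK']
```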

  9. California's Central Valley Groundwater Study: A Powerful New Tool to Assess Water Resources in California's Central Valley

    Science.gov (United States)

    Faunt, Claudia C.; Hanson, Randall T.; Belitz, Kenneth; Rogers, Laurel

    2009-01-01

    Competition for water resources is growing throughout California, particularly in the Central Valley. Since 1980, the Central Valley's population has nearly doubled to 3.8 million people. It is expected to increase to 6 million by 2020. Statewide population growth, anticipated reductions in Colorado River water deliveries, drought, and the ecological crisis in the Sacramento-San Joaquin Delta have created an intense demand for water. Tools and information can be used to help manage the Central Valley aquifer system, an important State and national resource.

  10. Forward-backward asymmetry as a discovery tool for Z′ bosons at the LHC

    Energy Technology Data Exchange (ETDEWEB)

    Accomando, Elena; Belyaev, Alexander; Fiaschi, Juri [School of Physics and Astronomy, University of Southampton, Highfield Campus, University Rd, Southampton, SO17 1BJ (United Kingdom); Mimasu, Ken [School of Physics and Astronomy, University of Sussex, Falmer, Brighton, BN1 9RH (United Kingdom); Moretti, Stefano [School of Physics and Astronomy, University of Southampton, Highfield Campus, University Rd, Southampton, SO17 1BJ (United Kingdom); Shepherd-Themistocleous, Claire [Particle Physics Department, STFC, Rutherford Appleton Laboratory, Harwell Science and Innovation Campus, Didcot, Oxfordshire, OX11 0QX (United Kingdom)

    2016-01-20

    The Forward-Backward Asymmetry (AFB) in Z′ physics is commonly only perceived as the observable which possibly allows one to interpret a Z′ signal appearing in the Drell-Yan channel by distinguishing different models of such (heavy) spin-1 bosons. In this paper, we revisit this issue, showing that the absence of any di-lepton rapidity cut, which is commonly used in the literature, can enhance the potential of the observable at the LHC. We moreover examine the ability of AFB in setting bounds on or even discovering a Z′ at the Large Hadron Collider (LHC), concluding that it may be a powerful tool for this purpose. We analyse two different scenarios: Z′-bosons with a narrow and wide width, respectively. We find that, in the first case, the significance of the AFB search can be comparable with that of the ‘bump’ search usually adopted by the experimental collaborations; however, in being a ratio of (differential) cross sections, the AFB has the advantage of reducing experimental systematics as well as theoretical errors due to PDF uncertainties. In the second case, the AFB search can outperform the bump search in terms of differential shape, meaning the AFB distribution may be better suited for new broad resonances than the event counting strategy usually adopted in such cases.
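    The record does not reproduce the definition, but the observable is conventionally a ratio of forward and backward cross sections, which is why PDF and experimental systematic uncertainties partially cancel. In standard notation, with θ* the lepton angle in the di-lepton rest frame:

```latex
A_{FB} = \frac{\sigma_F - \sigma_B}{\sigma_F + \sigma_B},
\qquad
\sigma_F = \int_{0}^{1} \frac{d\sigma}{d\cos\theta^{*}}\,d\cos\theta^{*},
\qquad
\sigma_B = \int_{-1}^{0} \frac{d\sigma}{d\cos\theta^{*}}\,d\cos\theta^{*}.
```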

  11. Forest Adaptation Resources: climate change tools and approaches for land managers, 2nd edition

    Science.gov (United States)

    Christopher W. Swanston; Maria K. Janowiak; Leslie A. Brandt; Patricia R. Butler; Stephen D. Handler; P. Danielle Shannon; Abigail Derby Lewis; Kimberly Hall; Robert T. Fahey; Lydia Scott; Angela Kerber; Jason W. Miesbauer; Lindsay Darling

    2016-01-01

    Forests across the United States are expected to undergo numerous changes in response to the changing climate. This second edition of the Forest Adaptation Resources provides a collection of resources designed to help forest managers incorporate climate change considerations into management and devise adaptation tactics. It was developed as part of the Climate Change...

  12. UserTesting.com: A Tool for Usability Testing of Online Resources

    Science.gov (United States)

    Koundinya, Vikram; Klink, Jenna; Widhalm, Melissa

    2017-01-01

    Extension educators are increasingly using online resources in their program design and delivery. Usability testing is essential for ensuring that these resources are relevant and useful to learners. On the basis of our experiences with iteratively developing products using a testing service called UserTesting, we promote the use of fee-based…

  13. A real-time Java tool chain for resource constrained platforms

    DEFF Research Database (Denmark)

    Korsholm, Stephan E.; Søndergaard, Hans; Ravn, Anders Peter

    2014-01-01

    The Java programming language was originally developed for embedded systems, but the resource requirements of previous and current Java implementations – especially memory consumption – tend to exclude them from being used on a significant class of resource constrained embedded platforms. This limitation is addressed by integrating the following: (1) a lean virtual machine without any external dependencies on POSIX-like libraries or other OS functionalities; (2) a hardware abstraction layer, implemented almost entirely in Java through the use of hardware objects, first level interrupt handlers, and native variables; and (3) … An evaluation of the presented solution shows that the miniCDj benchmark gets reduced to a size where it can run on resource constrained platforms.

  14. A real-time Java tool chain for resource constrained platforms

    DEFF Research Database (Denmark)

    Korsholm, Stephan Erbs; Søndergaard, Hans; Ravn, Anders P.

    2013-01-01

    The Java programming language was originally developed for embedded systems, but the resource requirements of previous and current Java implementations - especially memory consumption - tend to exclude them from being used on a significant class of resource constrained embedded platforms. This limitation is addressed by integrating: (1) a lean virtual machine (HVM) without any external dependencies on POSIX-like libraries or other OS functionalities, (2) a hardware abstraction layer, implemented almost entirely in Java through the use of hardware objects, first level interrupt handlers, and native variables, and (3) … An evaluation of the presented solution shows that the miniCDj benchmark gets reduced to a size where it can run on resource constrained platforms.

  15. The new Planetary Science Archive: A tool for exploration and discovery of scientific datasets from ESA's planetary missions

    Science.gov (United States)

    Heather, David

    2016-07-01

    Introduction: The Planetary Science Archive (PSA) is the European Space Agency's (ESA) repository of science data from all planetary science and exploration missions. The PSA provides access to scientific datasets through various interfaces (e.g. FTP browser, Map based, Advanced search, and Machine interface): http://archives.esac.esa.int/psa All datasets are scientifically peer-reviewed by independent scientists, and are compliant with the Planetary Data System (PDS) standards. Updating the PSA: The PSA is currently implementing a number of significant changes, both to its web-based interface to the scientific community, and to its database structure. The new PSA will be up-to-date with versions 3 and 4 of the PDS standards, as PDS4 will be used for ESA's upcoming ExoMars and BepiColombo missions. The newly designed PSA homepage will provide direct access to scientific datasets via a text search for targets or missions. This will significantly reduce the complexity for users to find their data and will promote one-click access to the datasets. Additionally, the homepage will provide direct access to advanced views and searches of the datasets. Users will have direct access to documentation, information and tools that are relevant to the scientific use of the dataset, including ancillary datasets, Software Interface Specification (SIS) documents, and any tools/help that the PSA team can provide. A login mechanism will provide additional functionalities to the users to aid / ease their searches (e.g. saving queries, managing default views). Queries to the PSA database will be possible either via the homepage (for simple searches of missions or targets), or through a filter menu for more tailored queries. The filter menu will offer multiple options to search for a particular dataset or product, and will manage queries for both in-situ and remote sensing instruments. Parameters such as start-time, phase angle, and heliocentric distance will be emphasized. A further

  16. American Recovery and Reinvestment Act-comparative effectiveness research infrastructure investments: emerging data resources, tools and publications.

    Science.gov (United States)

    Segal, Courtney; Holve, Erin

    2014-11-01

    The Recovery Act provided a substantial, one-time investment in data infrastructure for comparative effectiveness research (CER). A review of the publications, data, and tools developed as a result of this support has informed understanding of the level of effort undertaken by these projects. Structured search queries, as well as outreach efforts, were conducted to identify and review resources from American Recovery and Reinvestment Act of 2009 CER projects building electronic clinical data infrastructure. The findings from this study provide a spectrum of productivity across a range of topics and settings. A total of 451 manuscripts published in 192 journals, and 141 data resources and tools were identified and address gaps in evidence on priority populations, conditions, and the infrastructure needed to support CER.

  17. Computational methods in drug discovery

    OpenAIRE

    Sumudu P. Leelananda; Steffen Lindert

    2016-01-01

    The process for drug discovery and development is challenging, time consuming and expensive. Computer-aided drug discovery (CADD) tools can act as a virtual shortcut, assisting in the expedition of this long process and potentially reducing the cost of research and development. Today CADD has become an effective and indispensable tool in therapeutic development. The human genome project has made available a substantial amount of sequence data that can be used in various drug discovery project...

  18. Delivering Electronic Resources with Web OPACs and Other Web-based Tools: Needs of Reference Librarians.

    Science.gov (United States)

    Bordeianu, Sever; Carter, Christina E.; Dennis, Nancy K.

    2000-01-01

    Describes Web-based online public access catalogs (Web OPACs) and other Web-based tools as gateway methods for providing access to library collections. Addresses solutions for overcoming barriers to information, such as through the implementation of proxy servers and other authentication tools for remote users. (Contains 18 references.)…

  19. Summary of Training Workshop on the Use of NASA tools for Coastal Resource Management in the Gulf of Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Judd, Chaeli; Judd, Kathleen S.; Gulbransen, Thomas C.; Thom, Ronald M.

    2009-03-01

    A two-day training workshop was held in Xalapa, Mexico, from March 10-11, 2009, with the goal of training end users from the southern Gulf of Mexico states of Campeche and Veracruz in the use of tools to support coastal resource management decision-making. The workshop was held at the computer laboratory of the Instituto de Ecología, A.C. (INECOL). This report summarizes the results of that workshop and is a deliverable to our NASA client.

  20. Low cost, low tech SNP genotyping tools for resource-limited areas: Plague in Madagascar as a model.

    Science.gov (United States)

    Mitchell, Cedar L; Andrianaivoarimanana, Voahangy; Colman, Rebecca E; Busch, Joseph; Hornstra-O'Neill, Heidie; Keim, Paul S; Wagner, David M; Rajerison, Minoarisoa; Birdsell, Dawn N

    2017-12-01

    Genetic analysis of pathogenic organisms is a useful tool for linking human cases together and/or to potential environmental sources. The resulting data can also provide information on evolutionary patterns within a targeted species and phenotypic traits. However, the instruments often used to generate genotyping data, such as single nucleotide polymorphisms (SNPs), can be expensive and sometimes require advanced technologies to implement. This places many genotyping tools out of reach for laboratories that do not specialize in genetic studies and/or lack the requisite financial and technological resources. To address this issue, we developed a low cost and low tech genotyping system, termed agarose-MAMA, which combines traditional PCR and agarose gel electrophoresis to target phylogenetically informative SNPs. To demonstrate the utility of this approach for generating genotype data in a resource-constrained area (Madagascar), we designed an agarose-MAMA system targeting previously characterized SNPs within Yersinia pestis, the causative agent of plague. We then used this system to genetically type pathogenic strains of Y. pestis in a Malagasy laboratory not specialized in genetic studies, the Institut Pasteur de Madagascar (IPM). We conducted rigorous assay performance validations to assess potential variation introduced by differing research facilities, reagents, and personnel and found no difference in SNP genotyping results. These agarose-MAMA PCR assays are currently employed as an investigative tool at IPM, providing Malagasy researchers a means to improve the value of their plague epidemiological investigations by linking outbreaks to potential sources through genetic characterization of isolates and to improve understanding of disease ecology that may contribute to a long-term control effort. The success of our study demonstrates that the SNP-based genotyping capacity of laboratories in developing countries can be expanded with manageable financial cost for

  1. The Current Status of Germplum Database: a Tool for Characterization of Plum Genetic Resources in Romania

    Directory of Open Access Journals (Sweden)

    Monica Harta

    2016-11-01

    Full Text Available In Romania, Prunus genetic resources are kept in collections of varieties, populations and biotypes, mainly held by research and development institutes or fruit-growing stations and, in recent years, by some private enterprises. Creating the experimental model for the Germplum database based on phenotypic descriptors and SSR molecular marker analysis is an important and topical objective for the efficient characterization of genetic resources and also for establishing a public-private partnership for the effective management of plum germplasm resources in Romania. The technical development of the Germplum database has been completed, and data will be added continuously as each new accession is characterized.

  2. Integrating GRID tools to build a computing resource broker: activities of DataGrid WP1

    International Nuclear Information System (INIS)

    Anglano, C.; Barale, S.; Gaido, L.; Guarise, A.; Lusso, S.; Werbrouck, A.

    2001-01-01

    Resources on a computational Grid are geographically distributed, heterogeneous in nature, owned by different individuals or organizations with their own scheduling policies, and have different access cost models with dynamically varying loads and availability conditions. This makes traditional approaches to workload management, load balancing and scheduling inappropriate. The first work package (WP1) of the EU-funded DataGrid project is addressing the issue of optimizing the distribution of jobs onto Grid resources based on knowledge of the status and characteristics of these resources that is necessarily out-of-date (collected in a finite amount of time at a very loosely coupled site). The authors describe the DataGrid approach in integrating existing software components (from Condor, Globus, etc.) to build a Grid Resource Broker, and the early efforts to define a workable scheduling strategy.
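    As a toy illustration of the matchmaking problem described above, ranking heterogeneous resources using status information that is necessarily stale, the sketch below scores candidate sites by load plus an age penalty on their last report. It is not the WP1 broker, which builds on components such as Condor and Globus; the host names and penalty factor are invented:

```python
# Toy broker: pick compute resources that satisfy a job's requirements and
# rank them by estimated load plus a penalty for stale status reports,
# reflecting that Grid information is necessarily out of date.
import time

def rank_resources(job, resources, now=None, staleness_penalty=0.01):
    now = now or time.time()
    candidates = []
    for r in resources:
        if r["free_cpus"] >= job["cpus"] and r["os"] == job["os"]:
            age = now - r["reported_at"]              # seconds since last status report
            candidates.append((r["load"] + staleness_penalty * age, r["name"]))
    return [name for _, name in sorted(candidates)]

job = {"cpus": 4, "os": "linux"}
resources = [
    {"name": "ce01.example.org", "os": "linux", "free_cpus": 8, "load": 0.4,
     "reported_at": time.time() - 30},
    {"name": "ce02.example.org", "os": "linux", "free_cpus": 4, "load": 0.2,
     "reported_at": time.time() - 600},
]
print(rank_resources(job, resources))  # the site with the fresher report ranks first here
```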

  3. The Registry of Knowledge Translation Methods and Tools: a resource to support evidence-informed public health.

    Science.gov (United States)

    Peirson, Leslea; Catallo, Cristina; Chera, Sunita

    2013-08-01

    This paper examines the development of a globally accessible online Registry of Knowledge Translation Methods and Tools to support evidence-informed public health. A search strategy, screening and data extraction tools, and writing template were developed to find, assess, and summarize relevant methods and tools. An interactive website and searchable database were designed to house the registry. Formative evaluation was undertaken to inform refinements. Over 43,000 citations were screened; almost 700 were full-text reviewed, 140 of which were included. By November 2012, 133 summaries were available. Between January 1 and November 30, 2012 over 32,945 visitors from more than 190 countries accessed the registry. Results from 286 surveys and 19 interviews indicated the registry is valued and useful, but would benefit from a more intuitive indexing system and refinements to the summaries. User stories and promotional activities help expand the reach and uptake of knowledge translation methods and tools in public health contexts. The National Collaborating Centre for Methods and Tools' Registry of Methods and Tools is a unique and practical resource for public health decision makers worldwide.

  4. A vulnerability tool for adapting water and aquatic resources to climate change and extremes on the Shoshone National Forest, Wyoming

    Science.gov (United States)

    Rice, J.; Joyce, L. A.; Armel, B.; Bevenger, G.; Zubic, R.

    2011-12-01

    Climate change introduces a significant challenge for land managers and decision makers managing the natural resources that provide many benefits from forests. These benefits include water for urban and agricultural uses, wildlife habitat, erosion and climate control, aquifer recharge, stream flow regulation, water temperature regulation, and cultural services such as outdoor recreation and aesthetic enjoyment. The Forest Service has responded to this challenge by developing a national strategy for responding to climate change (the National Roadmap for Responding to Climate Change, July 2010). In concert with this national strategy, the Forest Service's Westwide Climate Initiative has conducted 4 case studies on individual Forests in the western U.S. to develop climate adaptation tools. Western National Forests are particularly vulnerable to climate change as they have high-mountain topography, diversity in climate and vegetation, large areas of water-limited ecosystems, and increasing urbanization. Information about the vulnerability and capacity of resources to adapt to climate change and extremes is lacking. There is an urgent need to provide customized tools and synthesized local-scale information about the impacts to resources from future climate change and extremes, as well as to develop science-based adaptation options and strategies in National Forest management and planning. The case study on the Shoshone National Forest has aligned its objectives with management needs by developing a climate extreme vulnerability tool that guides the development of adaptation options. The vulnerability tool determines the likely degree to which native Yellowstone cutthroat trout and water availability are susceptible to, or unable to cope with, adverse effects of climate change extremes. We spatially categorize vulnerability for water and native trout resources using exposure, sensitivity, and adaptive capacity indicators that use minimum and maximum climate and GIS data. Results

  5. Sustaining an Online, Shared Community Resource for Models, Robust Open source Software Tools and Data for Volcanology - the Vhub Experience

    Science.gov (United States)

    Patra, A. K.; Valentine, G. A.; Bursik, M. I.; Connor, C.; Connor, L.; Jones, M.; Simakov, N.; Aghakhani, H.; Jones-Ivey, R.; Kosar, T.; Zhang, B.

    2015-12-01

    Over the last 5 years we have created a community collaboratory, Vhub.org [Palma et al., J. App. Volc. 3:2, doi:10.1186/2191-5040-3-2], as a place to find volcanology-related resources, a venue for users to disseminate tools, teaching resources and data, and an online platform to support collaborative efforts. As the community (current active users > 6000, from an estimated community of comparable size) embeds the collaboratory's tools into educational and research workflows, it has become imperative to: a) redesign tools into robust, open source, reusable software for online and offline usage and enhancement; b) share large datasets with remote collaborators and other users seamlessly and securely; c) support complex workflows for uncertainty analysis, validation and verification, and data assimilation with large data. The focus on tool development and redevelopment has been twofold: firstly, to use best practices in software engineering and new hardware such as multi-core and graphics processing units; secondly, to enhance capabilities to support inverse modeling, uncertainty quantification using large ensembles and design of experiments, calibration, and validation. Among the software engineering practices we follow are open source development (which facilitates community contributions), modularity and reusability. Our initial targets are four popular tools on Vhub - TITAN2D, TEPHRA2, PUFF and LAVA. Use of tools like these requires many observation-driven data sets, e.g. digital elevation models of topography, satellite imagery, field observations on deposits, etc. These data are often maintained in private repositories and shared by "sneaker-net". As a partial solution to this we tested mechanisms using iRODS software for online sharing of private data with public metadata and access limits. Finally, we adapted the use of workflow engines (e.g. Pegasus) to support the complex data and computing workflows needed for usage such as uncertainty quantification for hazard analysis using physical

  6. Application of Fluorescence Two-Dimensional Difference In-Gel Electrophoresis as a Proteomic Biomarker Discovery Tool in Muscular Dystrophy Research

    Science.gov (United States)

    Carberry, Steven; Zweyer, Margit; Swandulla, Dieter; Ohlendieck, Kay

    2013-01-01

    In this article, we illustrate the application of difference in-gel electrophoresis for the proteomic analysis of dystrophic skeletal muscle. The mdx diaphragm was used as a tissue model of dystrophinopathy. Two-dimensional gel electrophoresis is a widely employed protein separation method in proteomic investigations. Although two-dimensional gels usually underestimate the cellular presence of very high molecular mass proteins, integral membrane proteins and low copy number proteins, this method is extremely powerful in the comprehensive analysis of contractile proteins, metabolic enzymes, structural proteins and molecular chaperones. This gives rise to two-dimensional gel electrophoretic separation as the method of choice for studying contractile tissues in health and disease. For comparative studies, fluorescence difference in-gel electrophoresis has been shown to provide an excellent biomarker discovery tool. Since aged diaphragm fibres from the mdx mouse model of Duchenne muscular dystrophy closely resemble the human pathology, we have carried out a mass spectrometry-based comparison of the naturally aged diaphragm versus the senescent dystrophic diaphragm. The proteomic comparison of wild type versus mdx diaphragm resulted in the identification of 84 altered protein species. Novel molecular insights into dystrophic changes suggest increased cellular stress, impaired calcium buffering, cytostructural alterations and disturbances of mitochondrial metabolism in dystrophin-deficient muscle tissue. PMID:24833232

  7. Brown-Like Adipocyte Progenitors Derived from Human iPS Cells: A New Tool for Anti-obesity Drug Discovery and Cell-Based Therapy?

    Science.gov (United States)

    Yao, Xi; Salingova, Barbara; Dani, Christian

    2018-04-10

    Alternative strategies are urgently required to fight obesity and associated metabolic disorders including diabetes and cardiovascular diseases. Brown and brown-like adipocytes (BAs) store fat, but in contrast to white adipocytes, activated BAs are equipped to dissipate energy stored. Therefore, BAs represent promising cell targets to counteract obesity. However, the scarcity of BAs in adults is a major limitation for a BA-based therapy of obesity, and the notion to increase the BA mass by transplanting BA progenitors (BAPs) in obese patients recently emerged. The next challenge is to identify an abundant and reliable source of BAPs. In this chapter, we describe the capacity of human-induced pluripotent stem cells (hiPSCs) to generate BAPs able to differentiate at a high efficiency with no gene transfer. This cell model represents an unlimited source of human BAPs that in a near future may be a suitable tool for both therapeutic transplantation and for the discovery of novel efficient and safe anti-obesity drugs. The generation of a relevant cell model, such as hiPSC-BAs in 3D adipospheres enriched with macrophages and endothelial cells to better mimic the microenvironment within the adipose tissue, will be the next critical step.

  8. The arctic water resource vulnerability index: An integrated assessment tool for community resilience and vulnerability with respect to freshwater

    Science.gov (United States)

    Alessa, L.; Kliskey, A.; Lammers, R.; Arp, C.; White, D.; Hinzman, L.; Busey, R.

    2008-01-01

    People in the Arctic face uncertainty in their daily lives as they contend with environmental changes at a range of scales from local to global. Freshwater is a critical resource to people, and although water resource indicators have been developed that operate from regional to global scales and for midlatitude to equatorial environments, no appropriate index exists for assessing the vulnerability of Arctic communities to changing water resources at the local scale. The Arctic Water Resource Vulnerability Index (AWRVI) is proposed as a tool that Arctic communities can use to assess their relative vulnerability-resilience to changes in their water resources from a variety of biophysical and socioeconomic processes. The AWRVI is based on a social-ecological systems perspective that includes physical and social indicators of change and is demonstrated in three case study communities/watersheds in Alaska. These results highlight the value of communities engaging in the process of using the AWRVI and the diagnostic capability of examining the suite of constituent physical and social scores rather than the total AWRVI score alone. © 2008 Springer Science+Business Media, LLC.
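    The record does not give the AWRVI formula. As an illustration of combining normalized physical and social indicators into a single vulnerability-resilience score, a hedged sketch (the indicator names, weights and simple averaging are assumptions):

```python
# Illustrative only: the actual AWRVI weighting scheme is not given in this
# record. Each indicator is assumed to be normalized to [0, 1], with higher
# values meaning greater resilience; the composite is a simple mean of the
# physical and social sub-scores.
def vulnerability_resilience_score(physical: dict, social: dict) -> float:
    mean = lambda indicators: sum(indicators.values()) / len(indicators)
    return 0.5 * mean(physical) + 0.5 * mean(social)

physical = {"permafrost_stability": 0.4, "surface_water_trend": 0.6, "precipitation": 0.7}
social = {"water_infrastructure": 0.8, "local_knowledge": 0.9, "governance": 0.5}
print(round(vulnerability_resilience_score(physical, social), 2))  # 0.65
```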

  9. Volatility Discovery

    DEFF Research Database (Denmark)

    Dias, Gustavo Fruet; Scherrer, Cristina; Papailias, Fotis

    The price discovery literature investigates how homogeneous securities traded on different markets incorporate information into prices. We take this literature one step further and investigate how these markets contribute to stochastic volatility (volatility discovery). We formally show that the realized measures from homogeneous securities share a fractional stochastic trend, which is a combination of the price and volatility discovery measures. Furthermore, we show that volatility discovery is associated with the way that market participants process information arrival (market sensitivity). Finally, we compute volatility discovery for 30 actively traded stocks in the U.S. and report that NYSE and Arca dominate Nasdaq.

  10. Life cycle assessment as development and decision support tool for wastewater resource recovery technology

    DEFF Research Database (Denmark)

    Fang, Linda L.; Valverde Perez, Borja; Damgaard, Anders

    2016-01-01

    …resource recovery. The freshwater and nutrient content of wastewater are recognized as potential valuable resources that can be recovered for beneficial reuse. Both recovery and reuse are intended to address existing environmental concerns, for example, water scarcity and use of non-renewable phosphorus… and water recovery system in its potential operating environment, we assess the potential environmental impacts of such a system using the EASETECH model. In the simulation, recovered water and nutrients are used in scenarios of agricultural irrigation-fertilization and aquifer recharge. In these scenarios, TRENS reduces global warming up to 15% and marine eutrophication impacts up to 9% compared to conventional treatment. This is due to the recovery and reuse of nutrient resources, primarily nitrogen. The key environmental concerns obtained through the LCA are linked to increased human toxicity impacts…

  11. Wheat Rust Information Resources - Integrated tools and data for improved decision making

    DEFF Research Database (Denmark)

    Hodson, David; Hansen, Jens Grønbech; Lassen, Poul

    …giving access to an unprecedented set of data for rust surveys, alternate hosts (barberry), rust pathotypes, trap nurseries and resistant cultivars. Standardized protocols for data collection have permitted the development of a comprehensive data management system, named the Wheat Rust Toolbox. … Integration of the CIMMYT Wheat Atlas and the Genetic Resources Information System (GRIS) databases provides a rich resource on wheat cultivars and their resistance to important rust races. Data access is facilitated via dedicated web portals such as Rust Tracker (www.rusttracker.org) and the Global Rust…

  12. Large scale mapping of groundwater resources using a highly integrated set of tools

    DEFF Research Database (Denmark)

    Søndergaard, Verner; Auken, Esben; Christiansen, Anders Vest

    …large areas with information from an optimum number of new investigation boreholes, existing boreholes, logs and water samples to get an integrated and detailed description of the groundwater resources and their vulnerability. Development of more time-efficient and airborne geophysical data acquisition platforms (e.g. SkyTEM) have made large-scale mapping attractive and affordable in the planning and administration of groundwater resources. The handling and optimized use of huge amounts of geophysical data covering large areas has also required a comprehensive database, where data can easily be stored…

  13. Disk Rock Cutting Tool for the Implementation of Resource-Saving Technologies of Mining of Solid Minerals

    Science.gov (United States)

    Manietyev, Leonid; Khoreshok, Aleksey; Tsekhin, Alexander; Borisov, Andrey

    2017-11-01

    Directions for resource and energy saving in the design of boom-type effectors for selective-action roadheaders, fitted with disc rock-cutting tools mounted on multi-faceted prisms for breaking mineral seams and rock, are presented. Reversing modes of the cutting crowns and booms are justified as a way to improve the efficiency of mining operations. Parameters are determined for the destruction of coal and rock faces by biconical disc tools with unified fastening assemblies attached to many-sided prisms on the effectors of extraction mining machines. Stress parameters are established for the mating elements of the disc-tool fastening assemblies under static interaction with the rock face being destroyed. Technical solutions are proposed whose structural and kinematic linkages realize counter-rotating and reversing modes for two radial crowns carrying disc tools on trihedral prisms, and for boom housings carrying disc tools on tetrahedral prisms in the internal space between two axial crowns with cutters. Reserves for extending the loading front beyond the feeder table of the selective-action roadheader are identified, including side zones in which loading corridors are created by the blades of the trihedral prisms in the internal space between the two radial crowns.

  14. NetMap: a new tool in support of watershed science and resource management.

    Science.gov (United States)

    L. Benda; D. Miller; K. Andras; P. Bigelow; G. Reeves; D. Michael

    2007-01-01

    In this paper, we show how application of principles of river ecology can guide use of a comprehensive terrain database within geographic information system (GIS) to facilitate watershed analysis relevant to natural resource management. We present a unique arrangement of a terrain database, GIS, and principles of riverine ecology for the purpose of advancing watershed...

  15. Forest adaptation resources: Climate change tools and approaches for land managers

    Science.gov (United States)

    Chris Swanston; Maria Janowiak, eds.

    2012-01-01

    The forests of northern Wisconsin, a defining feature of the region's landscape, are expected to undergo numerous changes in response to the changing climate. This document provides a collection of resources designed to help forest managers incorporate climate change considerations into management and devise adaptation tactics. It was developed in northern...

  16. Supporting product development with a practical tool for applying the strategy of resource circulation

    NARCIS (Netherlands)

    Toxopeus, Marten E.; van den Hout, N.B.; van Diepen, B.G.D.; Owsianiak, Mikolaj; Bey, Niki; Ryberg, Morten; Hauschild, Michael Z.; Laurent, Alexis; Leclerc, Alexandra; Niero, Monia; Dong, Yan; Olsen, Stig I.

    In their pursuit of a transition towards a circular economy, manufacturers are challenged to implement abstract principles of circularity in product development. Therefore, this paper presents strategies to enhance resource effectiveness by design. The specific industry case of Remeha, a manufacturer

  17. Pharmacological screening technologies for venom peptide discovery.

    Science.gov (United States)

    Prashanth, Jutty Rajan; Hasaballah, Nojod; Vetter, Irina

    2017-12-01

    Venomous animals occupy one of the most successful evolutionary niches and occur on nearly every continent. They deliver venoms via biting and stinging apparatuses with the aim to rapidly incapacitate prey and deter predators. This has led to the evolution of venom components that act at a number of biological targets - including ion channels, G-protein coupled receptors, transporters and enzymes - with exquisite selectivity and potency, making venom-derived components attractive pharmacological tool compounds and drug leads. In recent years, plate-based pharmacological screening approaches have been introduced to accelerate venom-derived drug discovery. A range of assays are amenable to this purpose, including high-throughput electrophysiology, fluorescence-based functional and binding assays. However, despite these technological advances, the traditional activity-guided fractionation approach is time-consuming and resource-intensive. The combination of screening techniques suitable for miniaturization with sequence-based discovery approaches - supported by advanced proteomics, mass spectrometry, chromatography as well as synthesis and expression techniques - promises to further improve venom peptide discovery. Here, we discuss practical aspects of establishing a pipeline for venom peptide drug discovery with a particular emphasis on pharmacology and pharmacological screening approaches. This article is part of the Special Issue entitled 'Venom-derived Peptides as Pharmacological Tools.' Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Tools for Scientist Engagement in E/PO: NASA SMD Community Workspace and Online Resources

    Science.gov (United States)

    Dalton, H.; Shipp, S. S.; Grier, J.; Gross, N. A.; Buxner, S.; Bartolone, L.; Peticolas, L. M.; Woroner, M.; Schwerin, T. G.

    2014-12-01

    The Science Mission Directorate (SMD) Science Education and Public Outreach (E/PO) Forums are here to help you get involved in E/PO! The Forums have been developing several online resources to support scientists who are - or who are interested in becoming - involved in E/PO. These include NASA Wavelength, EarthSpace, and the SMD E/PO online community workspace. NASA Wavelength is the one-stop shop of all peer-reviewed NASA education resources to find materials you - or your audiences - can use. Browse by audience (pre-K through 12, higher education, and informal education) or topic, or choose to search for something specific by keyword and audience. http://nasawavelength.org. EarthSpace, an online clearinghouse of Earth and space materials for use in the higher education classroom, is driven by a powerful search engine that allows you to browse the collection of resources by science topic, audience, type of material or key terms. All materials are peer-reviewed before posting, and because all submissions receive a digital object identifier (doi), submitted materials can be listed as publications. http://www.lpi.usra.edu/earthspace. The SMD E/PO online community workspace contains many resources for scientists. These include one-page guides on how to get involved, tips on how to make the most of your time spent on E/PO, and sample activities, as well as news on funding, policy, and what's happening in the E/PO community. The workspace also provides scientists and the public pathways to find opportunities for participation in E/PO, to learn about SMD E/PO projects and their impacts, to connect with SMD E/PO practitioners, and to explore resources to improve professional E/PO practice, including literature reviews, information about the Next Generation Science Standards, and best practices in evaluation and engaging diverse audiences. http://smdepo.org.

  19. Tools and Techniques for Evaluating the Effects of Maintenance Resource Management (MRM) in Air Safety

    Science.gov (United States)

    Taylor, James C.

    2002-01-01

    This research project was designed as part of a larger effort to help Human Factors (HF) implementers, and others in the aviation maintenance community, understand, evaluate, and validate the impact of Maintenance Resource Management (MRM) training programs, and other MRM interventions, on participant attitudes, opinions, behaviors, and ultimately on enhanced safety performance. It includes research and development of evaluation methodology as well as examination of psychological constructs and correlates of maintainer performance. In particular, during 2001, three issues were addressed. First, a prototype process for measuring performance was developed and used. Second, an automated calculator was developed to aid the HF implementer user in analyzing and evaluating local survey data. These results are automatically compared with the experience from all MRM programs studied since 1991. Third, the core survey (the Maintenance Resource Management Technical Operations Questionnaire, or 'MRM/TOQ') was further developed and tested to include topics of added relevance to the industry.

  20. The urban harvest approach as framework and planning tool for improved water and resource cycles.

    Science.gov (United States)

    Leusbrock, I; Nanninga, T A; Lieberg, K; Agudelo-Vera, C M; Keesman, K J; Zeeman, G; Rijnaarts, H H M

    2015-01-01

    Water and resource availability in sufficient quantity and quality for anthropogenic needs represents one of the main challenges in the coming decades. To prepare for upcoming challenges such as increased urbanization and climate change related consequences, innovative and improved resource management concepts are indispensable. In recent years we have developed and applied the urban harvest approach (UHA). The UHA aims to model and quantify the urban water cycle on different temporal and spatial scales. This approach allowed us to quantify the impact of the implementation of water saving measures and new water treatment concepts in cities. In this paper we will introduce the UHA and its application for urban water cycles. Furthermore, we will show first results for an extension to energy cycles and highlight future research items (e.g. nutrients, water-energy-nexus).
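
    As a rough illustration of the kind of accounting the UHA performs, the sketch below computes a highly simplified urban water balance in which demand minimisation, rainwater harvesting and reuse reduce the freshwater a neighbourhood must import; all figures, parameter names and the formula itself are illustrative assumptions, not the published UHA model.

        # Illustrative sketch (not the published UHA model): a minimal urban water
        # balance showing how demand minimisation and local harvesting reduce the
        # freshwater a neighbourhood must import. All figures are invented examples.

        def urban_water_balance(demand_m3, saving_fraction, rainfall_mm, roof_area_m2,
                                runoff_coeff=0.8, reuse_fraction=0.3):
            """Return (net import, harvested volume, reused volume) in m3/year."""
            reduced_demand = demand_m3 * (1.0 - saving_fraction)            # demand minimisation
            harvested = rainfall_mm / 1000.0 * roof_area_m2 * runoff_coeff  # rainwater harvest
            reused = reduced_demand * reuse_fraction                        # greywater cascading
            net_import = max(reduced_demand - harvested - reused, 0.0)
            return net_import, harvested, reused

        if __name__ == "__main__":
            baseline = urban_water_balance(120_000, 0.0, 800, 40_000, reuse_fraction=0.0)
            improved = urban_water_balance(120_000, 0.2, 800, 40_000, reuse_fraction=0.3)
            print(f"baseline import: {baseline[0]:,.0f} m3/yr")
            print(f"with savings/harvest/reuse: {improved[0]:,.0f} m3/yr")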

  1. GIS as a tool in participatory natural resource management: Examples from the Peruvian Andes

    OpenAIRE

    Bussink, C.

    2003-01-01

    Metadata only record. Geographic Information Systems (GIS) are often seen as incompatible with participatory processes. However, since the late 1990s, attempts have been made in numerous projects around the world to define 'best practices' for improved natural resource management projects that integrate participation and accurate spatial information, using GIS (for example, see www.iapad.org/participatory_gis.htm). This article describes a project in the Peruvian Andes where spatial informa...

  2. FREEWAT: FREE and open source software tools for WATer resource management

    OpenAIRE

    Rossetto, Rudy; Borsi, Iacopo; Foglia, Laura

    2015-01-01

    FREEWAT is a HORIZON 2020 project financed by the EU Commission under the call WATER INNOVATION: BOOSTING ITS VALUE FOR EUROPE. FREEWAT's main result will be an open-source and public-domain GIS-integrated modelling environment for the simulation of water quantity and quality in surface water and groundwater, with an integrated water management and planning module. FREEWAT aims at promoting water resource management by simplifying the application of the Water Framework Directive and other EU wa...

  3. Computational modeling as a tool for water resources management: an alternative approach to problems of multiple uses

    Directory of Open Access Journals (Sweden)

    Haydda Manolla Chaves da Hora

    2012-04-01

    Today in Brazil there are many cases of incompatibility between water use and water availability. Due to the increase in the required variety and volume of uses, the concept of multiple uses was created, as stated by Pinheiro et al. (2007). The use of the same resource to satisfy different needs under several restrictions (qualitative and quantitative) creates conflicts. Aiming to minimize these conflicts, this work was applied to the particular cases of Hydrographic Regions VI and VIII of Rio de Janeiro State, using computational modeling techniques (based on the MOHID software – Water Modeling System) as a tool for water resources management.

  4. Toxins and drug discovery.

    Science.gov (United States)

    Harvey, Alan L

    2014-12-15

    Components from venoms have stimulated many drug discovery projects, with some notable successes. These are briefly reviewed, from captopril to ziconotide. However, there have been many more disappointments on the road from toxin discovery to approval of a new medicine. Drug discovery and development is an inherently risky business, and the main causes of failure during development programmes are outlined in order to highlight steps that might be taken to increase the chances of success with toxin-based drug discovery. These include having a clear focus on unmet therapeutic needs, concentrating on targets that are well-validated in terms of their relevance to the disease in question, making use of phenotypic screening rather than molecular-based assays, and working with development partners with the resources required for the long and expensive development process. Copyright © 2014 The Author. Published by Elsevier Ltd. All rights reserved.

  5. DoD Resource Augmentation for Civilian Consequence Management (DRACCM) Tool

    Science.gov (United States)

    2015-07-01

    was to develop a tool that is capable of assessing the effect of CBRN exposures on civilian populations and medical infrastructure in order to... allows planners to import exposures, calculate time-dependent casualties, and to assess the beneficial effect of medical countermeasures including the... option 2. The ability to import exposure data from HPAC and JEM 3. Health effects models that underwent Independent Verification & Validation (IV&V) by

  6. Low cost, low tech SNP genotyping tools for resource-limited areas: Plague in Madagascar as a model.

    Directory of Open Access Journals (Sweden)

    Cedar L Mitchell

    2017-12-01

    Genetic analysis of pathogenic organisms is a useful tool for linking human cases together and/or to potential environmental sources. The resulting data can also provide information on evolutionary patterns within a targeted species and phenotypic traits. However, the instruments often used to generate genotyping data, such as single nucleotide polymorphisms (SNPs), can be expensive and sometimes require advanced technologies to implement. This places many genotyping tools out of reach for laboratories that do not specialize in genetic studies and/or lack the requisite financial and technological resources. To address this issue, we developed a low cost and low tech genotyping system, termed agarose-MAMA, which combines traditional PCR and agarose gel electrophoresis to target phylogenetically informative SNPs. To demonstrate the utility of this approach for generating genotype data in a resource-constrained area (Madagascar), we designed an agarose-MAMA system targeting previously characterized SNPs within Yersinia pestis, the causative agent of plague. We then used this system to genetically type pathogenic strains of Y. pestis in a Malagasy laboratory not specialized in genetic studies, the Institut Pasteur de Madagascar (IPM). We conducted rigorous assay performance validations to assess potential variation introduced by differing research facilities, reagents, and personnel and found no difference in SNP genotyping results. These agarose-MAMA PCR assays are currently employed as an investigative tool at IPM, providing Malagasy researchers a means to improve the value of their plague epidemiological investigations by linking outbreaks to potential sources through genetic characterization of isolates and to improve understanding of disease ecology that may contribute to a long-term control effort. The success of our study demonstrates that the SNP-based genotyping capacity of laboratories in developing countries can be expanded with manageable

  7. Web-based discovery, access and analysis tools for the provision of different data sources like remote sensing products and climate data

    Science.gov (United States)

    Eberle, J.; Hese, S.; Schmullius, C.

    2012-12-01

    To provide different Earth Observation products for the area of Siberia, the Siberian Earth System Science Cluster (SIB-ESS-C) was established as a spatial data infrastructure at the University of Jena (Germany), Department for Earth Observation. The infrastructure implements standards published by the Open Geospatial Consortium (OGC) and the International Organization for Standardization (ISO) for data discovery, data access and data analysis. The objective of SIB-ESS-C is to facilitate environmental research and Earth system science in Siberia. Several products from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor were integrated by serving ISO-compliant metadata and providing an OGC-compliant Web Map Service for data visualization and Web Coverage / Web Feature Services for data access. Furthermore, climate data from the World Meteorological Organization were downloaded, converted, and provided as an OGC Sensor Observation Service. Each climate station is described with ISO-compliant metadata. All these datasets from multiple sources are provided within the SIB-ESS-C infrastructure (Figure 1). Furthermore, an automatic workflow integrates daily updates of these datasets. The brokering approach within the SIB-ESS-C system is to collect data from different sources, convert the data into common data formats where necessary, and provide them through standardized Web services. Additional tools are made available within the SIB-ESS-C Geoportal for easy access to download and analysis functions (Figure 2). The data can be visualized, accessed and analysed with this Geoportal. Because the services are OGC-compliant, the data can also be accessed with other OGC-compliant clients. Figure 1: Technical concept of SIB-ESS-C providing different data sources. Figure 2: Screenshot of the web-based SIB-ESS-C system.
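
    Because SIB-ESS-C exposes standard OGC interfaces, its datasets can in principle be reached from any OGC-compliant client. The sketch below uses the OWSLib Python library against a hypothetical WMS endpoint; the URL, layer name and bounding box are placeholders, not the actual SIB-ESS-C service addresses.

        # Hedged sketch: discovering and retrieving data from an OGC-compliant WMS
        # with OWSLib. The endpoint URL and layer name are placeholders, not the
        # actual SIB-ESS-C services.
        from owslib.wms import WebMapService

        wms = WebMapService("https://example.org/geoserver/wms", version="1.3.0")

        # Discovery: list the layers advertised in the GetCapabilities document.
        for name, layer in wms.contents.items():
            print(name, "-", layer.title)

        # Access: request a rendered map for one layer over a bounding box.
        img = wms.getmap(layers=["modis_lst_example"],
                         srs="EPSG:4326",
                         bbox=(60.0, 50.0, 120.0, 75.0),   # illustrative lon/lat box over Siberia
                         size=(800, 400),
                         format="image/png")
        with open("lst_preview.png", "wb") as fh:
            fh.write(img.read())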

  8. Analysis of Multiple Genomic Sequence Alignments: A Web Resource, Online Tools, and Lessons Learned From Analysis of Mammalian SCL Loci

    Science.gov (United States)

    Chapman, Michael A.; Donaldson, Ian J.; Gilbert, James; Grafham, Darren; Rogers, Jane; Green, Anthony R.; Göttgens, Berthold

    2004-01-01

    Comparative analysis of genomic sequences is becoming a standard technique for studying gene regulation. However, only a limited number of tools are currently available for the analysis of multiple genomic sequences. An extensive data set for the testing and training of such tools is provided by the SCL gene locus. Here we have expanded the data set to eight vertebrate species by sequencing the dog SCL locus and by annotating the dog and rat SCL loci. To provide a resource for the bioinformatics community, all SCL sequences and functional annotations, comprising a collation of the extensive experimental evidence pertaining to SCL regulation, have been made available via a Web server. A Web interface to new tools specifically designed for the display and analysis of multiple sequence alignments was also implemented. The unique SCL data set and new sequence comparison tools allowed us to perform a rigorous examination of the true benefits of multiple sequence comparisons. We demonstrate that multiple sequence alignments are, overall, superior to pairwise alignments for identification of mammalian regulatory regions. In the search for individual transcription factor binding sites, multiple alignments markedly increase the signal-to-noise ratio compared to pairwise alignments. PMID:14718377
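
    To illustrate why multiple alignments improve the signal-to-noise ratio, the sketch below scores per-column conservation in a multiple sequence alignment with Biopython; the input file name, window size and threshold are hypothetical, and this is not the authors' analysis pipeline.

        # Illustrative sketch (not the authors' pipeline): score per-column conservation
        # in a multiple sequence alignment with Biopython. "scl_alignment.fasta" is a
        # hypothetical aligned FASTA file of the SCL locus from several species.
        from collections import Counter
        from Bio import AlignIO

        alignment = AlignIO.read("scl_alignment.fasta", "fasta")
        n_seqs = len(alignment)

        def column_conservation(col):
            """Fraction of sequences sharing the most common non-gap residue."""
            residues = [c for c in col if c != "-"]
            if not residues:
                return 0.0
            _, count = Counter(residues).most_common(1)[0]
            return count / n_seqs

        scores = [column_conservation(alignment[:, i])
                  for i in range(alignment.get_alignment_length())]

        # Flag windows whose mean conservation exceeds a chosen threshold. With more
        # species aligned, chance conservation of this kind becomes rarer, which is
        # the signal-to-noise gain over pairwise comparisons.
        window, threshold = 50, 0.8
        for start in range(0, len(scores) - window, window):
            mean = sum(scores[start:start + window]) / window
            if mean >= threshold:
                print(f"candidate conserved block at columns {start}-{start + window}")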

  9. Bioinformatics for cancer immunotherapy target discovery

    DEFF Research Database (Denmark)

    Olsen, Lars Rønn; Campos, Benito; Barnkob, Mike Stein

    2014-01-01

    therapy target discovery in a bioinformatics analysis pipeline. We describe specialized bioinformatics tools and databases for three main bottlenecks in immunotherapy target discovery: the cataloging of potentially antigenic proteins, the identification of potential HLA binders, and the selection of epitopes...

  10. Generating relevant climate adaptation science tools in concert with local natural resource agencies

    Science.gov (United States)

    Micheli, L.; Flint, L. E.; Veloz, S.; Heller, N. E.

    2015-12-01

    To create a framework for adapting to climate change, decision makers operating at the urban-wildland interface need to define climate vulnerabilities in the context of site-specific opportunities and constraints relative to water supply, land use suitability, wildfire risks, ecosystem services and quality of life. Pepperwood's TBC3.org is crafting customized climate vulnerability assessments with selected water and natural resource agencies of California's Sonoma, Marin, Napa and Mendocino counties under the auspices of Climate Ready North Bay, a public-private partnership funded by the California Coastal Conservancy. Working directly with managers from the very start of the process to define resource-specific information needs, we are developing high-resolution, spatially-explicit data products to help local governments and agency staff implement informed and effective climate adaptation strategies. Key preliminary findings for the region using the USGS' Basin Characterization Model (at a 270 m spatial resolution) include a unidirectional trend, independent of greater or lesser precipitation, towards increasing climatic water deficits across model scenarios. Therefore a key message is that managers will be facing an increasingly arid environment. Companion models translate the impacts of shifting climate and hydrology on vegetation composition and fire risks. The combination of drought stress on water supplies and native vegetation with an approximate doubling of fire risks may demand new approaches to watershed planning. Working with agencies we are exploring how to build capacity for protection and enhancement of key watershed functions with a focus on groundwater recharge, facilitating greater drought tolerance in forest and rangeland systems, and considering more aggressive approaches to management of fuel loads. Lessons learned about effective engagement include the need for extended in-depth dialog, translation of key climate adaptation questions into

  11. Utilization and perceived problems of online medical resources and search tools among different groups of European physicians.

    Science.gov (United States)

    Kritz, Marlene; Gschwandtner, Manfred; Stefanov, Veronika; Hanbury, Allan; Samwald, Matthias

    2013-06-26

    There is a large body of research suggesting that medical professionals have unmet information needs during their daily routines. To investigate which online resources and tools different groups of European physicians use to gather medical information and to identify barriers that prevent the successful retrieval of medical information from the Internet. A detailed Web-based questionnaire was sent out to approximately 15,000 physicians across Europe and disseminated through partner websites. 500 European physicians of different levels of academic qualification and medical specialization were included in the analysis. Self-reported frequency of use of different types of online resources, perceived importance of search tools, and perceived search barriers were measured. Comparisons were made across different levels of qualification (qualified physicians vs physicians in training, medical specialists without professorships vs medical professors) and specialization (general practitioners vs specialists). Most participants were Internet-savvy, came from Austria (43%, 190/440) and Switzerland (31%, 137/440), were above 50 years old (56%, 239/430), stated high levels of medical work experience, had regular patient contact and were employed in nonacademic health care settings (41%, 177/432). All groups reported frequent use of general search engines and cited "restricted accessibility to good quality information" as a dominant barrier to finding medical information on the Internet. Physicians in training reported the most frequent use of Wikipedia (56%, 31/55). Specialists were more likely than general practitioners to use medical research databases (68%, 185/274 vs 27%, 24/88; χ²₂=44.905, Presources on the Internet and frequent reliance on general search engines and social media among physicians require further attention. Possible solutions may be increased governmental support for the development and popularization of user-tailored medical search tools and open

  12. Engineering a mobile health tool for resource-poor settings to assess and manage cardiovascular disease risk: SMARThealth study.

    Science.gov (United States)

    Raghu, Arvind; Praveen, Devarsetty; Peiris, David; Tarassenko, Lionel; Clifford, Gari

    2015-04-29

    The incidence of chronic diseases in low- and middle-income countries is rapidly increasing both in urban and rural regions. A major challenge for health systems globally is to develop innovative solutions for the prevention and control of these diseases. This paper discusses the development and pilot testing of SMARTHealth, a mobile-based, point-of-care Clinical Decision Support (CDS) tool to assess and manage cardiovascular disease (CVD) risk in resource-constrained settings. Through pilot testing, the preliminary acceptability, utility, and efficiency of the CDS tool was obtained. The CDS tool was part of an mHealth system comprising a mobile application that consisted of an evidence-based risk prediction and management algorithm, and a server-side electronic medical record system. Through an agile development process and user-centred design approach, key features of the mobile application that fitted the requirements of the end users and environment were obtained. A comprehensive analytics framework facilitated a data-driven approach to investigate four areas, namely, system efficiency, end-user variability, manual data entry errors, and usefulness of point-of-care management recommendations to the healthcare worker. A four-point Likert scale was used at the end of every risk assessment to gauge ease-of-use of the system. The system was field-tested with eleven village healthcare workers and three Primary Health Centre doctors, who screened a total of 292 adults aged 40 years and above. 34% of participants screened by health workers were identified by the CDS tool to be high CVD risk and referred to a doctor. In-depth analysis of user interactions found the CDS tool feasible for use and easily integrable into the workflow of healthcare workers. Following completion of the pilot, further technical enhancements were implemented to improve uptake of the mHealth platform. It will then be evaluated for effectiveness and cost-effectiveness in a cluster randomized
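
    The sketch below is only a schematic of how a point-of-care decision-support rule can map a handful of screening inputs to a risk band and a referral flag; the thresholds and scoring are invented for illustration and are not the evidence-based algorithm used in SMARThealth.

        # Schematic illustration only -- NOT the SMARThealth algorithm. Shows the shape
        # of a point-of-care decision-support rule: a few measured inputs are mapped to
        # a risk band and a referral recommendation. All thresholds are invented.
        from dataclasses import dataclass

        @dataclass
        class Screening:
            age: int
            systolic_bp: int       # mmHg
            smoker: bool
            diabetic: bool
            prior_cvd: bool

        def assess(s):
            """Return (risk_band, refer_to_doctor) for one screened adult."""
            if s.prior_cvd or s.systolic_bp >= 180:
                return "very high", True
            score = 0
            score += 2 if s.age >= 60 else (1 if s.age >= 45 else 0)
            score += 2 if s.systolic_bp >= 160 else (1 if s.systolic_bp >= 140 else 0)
            score += 1 if s.smoker else 0
            score += 1 if s.diabetic else 0
            band = "high" if score >= 4 else ("moderate" if score >= 2 else "low")
            return band, band == "high"

        print(assess(Screening(age=62, systolic_bp=165, smoker=True, diabetic=False,
                               prior_cvd=False)))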

  13. Privacy and User Experience in 21st Century Library Discovery

    Directory of Open Access Journals (Sweden)

    Shayna Pekala

    2017-06-01

    Over the last decade, libraries have taken advantage of emerging technologies to provide new discovery tools to help users find information and resources more efficiently. In the wake of this technological shift in discovery, privacy has become an increasingly prominent and complex issue for libraries. The nature of the web, over which users interact with discovery tools, has substantially diminished the library’s ability to control patron privacy. The emergence of a data economy has led to a new wave of online tracking and surveillance, in which multiple third parties collect and share user data during the discovery process, making it much more difficult, if not impossible, for libraries to protect patron privacy. In addition, users are increasingly starting their searches with web search engines, diminishing the library’s control over privacy even further. While libraries have a legal and ethical responsibility to protect patron privacy, they are simultaneously challenged to meet evolving user needs for discovery. In a world where “search” is synonymous with Google, users increasingly expect their library discovery experience to mimic their experience using web search engines. However, web search engines rely on a drastically different set of privacy standards, as they strive to create tailored, personalized search results based on user data. Libraries are seemingly forced to make a choice between delivering the discovery experience users expect and protecting user privacy. This paper explores the competing interests of privacy and user experience, and proposes possible strategies to address them in the future design of library discovery tools.

  14. Patient Safety Walkaround: a communication tool for the reallocation of health service resources

    Science.gov (United States)

    Ferorelli, Davide; Zotti, Fiorenza; Tafuri, Silvio; Pezzolla, Angela; Dell’Erba, Alessandro

    2016-01-01

    The study aims to evaluate the use of the Patient Safety Walkaround (SWR) execution model in an Italian hospital, through the adoption of parametric indices, survey tools, and process indicators. In the first meeting, an interview was conducted to verify the knowledge of concepts of clinical risk management (process indicators). One month later, the questions provided by Frankel (survey tool) were administered. In each subsequent month, an SWR was carried out to assist the healthcare professionals and to collect suggestions and solutions. Results were classified according to the Vincent model and analyzed to define an action plan. The amount of risk was quantified by the risk priority index (RPI). An organizational deficit concerns the management of the operating theatre. A state of intolerance was noticed among patients queuing for outpatient visits. The lack of scheduling of the operating rooms is often the cause of sudden displacements, and a consequence is conflict between patients and caregivers. Other causes of increased waiting times are the presence in the ward of a single trolley for medications and of a single room for admission and preadmission of patients. Patients who suffered allergic reactions attributed them to the presence of other patients in the process of acceptance and collection of medical history. All health professionals reported the problem of a high number of relatives of the patients in the wards. Our study indicated the consistency of the SWR as an instrument to improve the quality of care. PMID:27741109
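
    The abstract quantifies risk with a risk priority index (RPI) but does not give its formula; the sketch below assumes the common FMEA-style definition (severity x occurrence x detectability, each rated 1-10) purely to illustrate how walkaround findings could be ranked before reallocating resources.

        # Assumption: the paper does not define its RPI, so this sketch uses the common
        # FMEA-style risk priority number (severity x occurrence x detectability),
        # each rated 1-10, purely as an illustration of how walkaround findings could
        # be ranked to prioritise the reallocation of resources.
        findings = [
            # (description,                              severity, occurrence, detectability)
            ("no operating-theatre scheduling",                7,        8,          4),
            ("single medication trolley on the ward",          5,        9,          3),
            ("shared admission/pre-admission room",            6,        6,          5),
        ]

        ranked = sorted(
            ((desc, sev * occ * det) for desc, sev, occ, det in findings),
            key=lambda item: item[1],
            reverse=True,
        )
        for desc, rpi in ranked:
            print(f"RPI {rpi:3d}  {desc}")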

  15. IsoNose - Isotopic Tools as Novel Sensors of Earth Surfaces Resources - A new Marie Curie Initial Training Network

    Science.gov (United States)

    von Blanckenburg, Friedhelm; Bouchez, Julien; Bouman, Claudia; Kamber, Balz; Gaillardet, Jérôme; Gorbushina, Anna; James, Rachael; Oelkers, Eric; Tesmer, Maja; Ashton, John

    2015-04-01

    The Marie Curie Initial Training Network »Isotopic Tools as Novel Sensors of Earth Surfaces Resources - IsoNose« is an alliance of eight international partners and five associated partners from science and industry. The project is coordinated at the Helmholtz Centre Potsdam GFZ German Research Centre for Geosciences and will run until February 2018. In the last 15 years, advances in novel mass-spectrometric methods have opened opportunities to identify "isotopic fingerprints" of virtually all metals and to make use of the complete information contained in these fingerprints. The understanding developed with these new tools will ultimately guide the exploitation of Earth surface environments. However, progress in bringing these methods to end-users depends on a multi-way transfer of knowledge between (1) isotope geochemistry and microbiology/environmental sciences, (2) economic geology, and (3) instrument developers and users, covering the development of user-friendly new mass spectrometric methods. IsoNose will focus on three major Earth surface resources: soil, water and metals. These resources are currently being exploited to an unprecedented extent and their efficient management is essential for future sustainable development. Novel stable isotope techniques will disclose the processes generating (e.g. weathering, mineral ore formation) and destroying (e.g. erosion, pollution) these resources. Within this field the following questions will be addressed and answered:
    - How do novel stable isotope signatures characterize weathering processes?
    - How do novel stable isotope signatures trace water transport?
    - How to use novel stable isotopes as environmental tracers?
    - How to use novel stable isotopes for detecting and exploring metal ores?
    - How to improve analytical capabilities and develop robust routine applications for novel stable isotopes?
    Starting from the central questions mentioned above, the IsoNose activities are organized in five scientific work packages: 1

  16. mizuRoute version 1: A river network routing tool for a continental domain water resources applications

    Science.gov (United States)

    Mizukami, Naoki; Clark, Martyn P.; Sampson, Kevin; Nijssen, Bart; Mao, Yixin; McMillan, Hilary; Viger, Roland; Markstrom, Steven; Hay, Lauren E.; Woods, Ross; Arnold, Jeffrey R.; Brekke, Levi D.

    2016-01-01

    This paper describes the first version of a stand-alone runoff routing tool, mizuRoute. The mizuRoute tool post-processes runoff outputs from any distributed hydrologic model or land surface model to produce spatially distributed streamflow at various spatial scales from headwater basins to continental-wide river systems. The tool can utilize both traditional grid-based river network and vector-based river network data. Both types of river network include river segment lines and the associated drainage basin polygons, but the vector-based river network can represent finer-scale river lines than the grid-based network. Streamflow estimates at any desired location in the river network can be easily extracted from the output of mizuRoute. The routing process is simulated as two separate steps. First, hillslope routing is performed with a gamma-distribution-based unit-hydrograph to transport runoff from a hillslope to a catchment outlet. The second step is river channel routing, which is performed with one of two routing scheme options: (1) a kinematic wave tracking (KWT) routing procedure; and (2) an impulse response function – unit-hydrograph (IRF-UH) routing procedure. The mizuRoute tool also includes scripts (python, NetCDF operators) to pre-process spatial river network data. This paper demonstrates mizuRoute's capabilities to produce spatially distributed streamflow simulations based on river networks from the United States Geological Survey (USGS) Geospatial Fabric (GF) data set in which over 54 000 river segments and their contributing areas are mapped across the contiguous United States (CONUS). A brief analysis of model parameter sensitivity is also provided. The mizuRoute tool can assist model-based water resources assessments including studies of the impacts of climate change on streamflow.
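
    The hillslope-routing step described above convolves model runoff with a gamma-distribution unit hydrograph. The fragment below is a minimal NumPy/SciPy sketch of that idea; the shape and scale parameters and the runoff series are placeholders, and this is not the mizuRoute source code.

        # Minimal sketch of gamma-distribution unit-hydrograph routing, in the spirit of
        # mizuRoute's hillslope step (this is NOT the mizuRoute source; parameters and
        # the runoff series are placeholders).
        import numpy as np
        from scipy.stats import gamma

        dt_hours = 1.0
        shape, scale = 2.5, 3.0            # placeholder gamma parameters [-, hours]

        # Discretise the unit hydrograph over a finite support and normalise it so the
        # routed volume equals the input volume.
        t = np.arange(0.0, 72.0, dt_hours)
        uh = gamma.pdf(t, a=shape, scale=scale)
        uh /= uh.sum()

        # Placeholder hillslope runoff series [mm per time step].
        runoff = np.zeros(240)
        runoff[10:16] = [2.0, 5.0, 8.0, 6.0, 3.0, 1.0]

        # Convolution delays and attenuates the runoff pulse on its way to the outlet.
        outlet_flow = np.convolve(runoff, uh)[: runoff.size]
        print("peak runoff %.2f -> peak routed flow %.2f" % (runoff.max(), outlet_flow.max()))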

  17. Beyond Discovery

    DEFF Research Database (Denmark)

    Korsgaard, Steffen; Sassmannshausen, Sean Patrick

    2017-01-01

    In this chapter we explore four alternatives to the dominant discovery view of entrepreneurship: the development view, the construction view, the evolutionary view, and the Neo-Austrian view. We outline the main critique points of the discovery view presented in these four alternatives, as well...

  18. Chemical Discovery

    Science.gov (United States)

    Brown, Herbert C.

    1974-01-01

    The role of discovery in the advance of the science of chemistry and the factors that are currently operating to handicap that function are considered. Examples are drawn from the author's work with boranes. The thesis that exploratory research and discovery should be encouraged is stressed. (DT)

  19. Health worker motivation in Africa: the role of non-financial incentives and human resource management tools

    Directory of Open Access Journals (Sweden)

    Imhoff Ingo

    2006-08-01

    Background: There is a serious human resource crisis in the health sector in developing countries, particularly in Africa. One of the challenges is the low motivation of health workers. Experience and the evidence suggest that any comprehensive strategy to maximize health worker motivation in a developing country context has to involve a mix of financial and non-financial incentives. This study assesses the role of non-financial incentives for motivation in two cases, in Benin and Kenya. Methods: The study design entailed semi-structured qualitative interviews with doctors and nurses from public, private and NGO facilities in rural areas. The selection of health professionals was the result of a layered sampling process. In Benin 62 interviews with health professionals were carried out; in Kenya 37 were obtained. Results from individual interviews were backed up with information from focus group discussions. For further contextual information, interviews with civil servants in the Ministry of Health and at the district level were carried out. The interview material was coded and quantitative data was analysed with SPSS software. Results and discussion: The study shows that health workers overall are strongly guided by their professional conscience and similar aspects related to professional ethos. In fact, many health workers are demotivated and frustrated precisely because they are unable to satisfy their professional conscience and impeded in pursuing their vocation due to lack of means and supplies and due to inadequate or inappropriately applied human resources management (HRM) tools. The paper also indicates that even some HRM tools that are applied may adversely affect the motivation of health workers. Conclusion: The findings confirm the starting hypothesis that non-financial incentives and HRM tools play an important role with respect to increasing motivation of health professionals. Adequate HRM tools can uphold and strengthen the

  20. Health worker motivation in Africa: the role of non-financial incentives and human resource management tools.

    Science.gov (United States)

    Mathauer, Inke; Imhoff, Ingo

    2006-08-29

    There is a serious human resource crisis in the health sector in developing countries, particularly in Africa. One of the challenges is the low motivation of health workers. Experience and the evidence suggest that any comprehensive strategy to maximize health worker motivation in a developing country context has to involve a mix of financial and non-financial incentives. This study assesses the role of non-financial incentives for motivation in two cases, in Benin and Kenya. The study design entailed semi-structured qualitative interviews with doctors and nurses from public, private and NGO facilities in rural areas. The selection of health professionals was the result of a layered sampling process. In Benin 62 interviews with health professionals were carried out; in Kenya 37 were obtained. Results from individual interviews were backed up with information from focus group discussions. For further contextual information, interviews with civil servants in the Ministry of Health and at the district level were carried out. The interview material was coded and quantitative data was analysed with SPSS software. The study shows that health workers overall are strongly guided by their professional conscience and similar aspects related to professional ethos. In fact, many health workers are demotivated and frustrated precisely because they are unable to satisfy their professional conscience and impeded in pursuing their vocation due to lack of means and supplies and due to inadequate or inappropriately applied human resources management (HRM) tools. The paper also indicates that even some HRM tools that are applied may adversely affect the motivation of health workers. The findings confirm the starting hypothesis that non-financial incentives and HRM tools play an important role with respect to increasing motivation of health professionals. Adequate HRM tools can uphold and strengthen the professional ethos of doctors and nurses. This entails acknowledging their

  1. Cash budgeting: an underutilized resource management tool in not-for-profit health care entities.

    Science.gov (United States)

    Hauser, R C; Edwards, D E; Edwards, J T

    1991-01-01

    Cash budgeting is generally considered to be an important part of resource management in all businesses. However, respondents to a survey of not-for-profit health care entities revealed that some 40 percent of the participants do not currently prepare cash budgets. Where budgeting occurred, the cash forecasts covered various time frames, and distribution of the document was inconsistent. Most budgets presented cash receipts and disbursements according to operating, investing, and financing activities--a format consistent with the year-end cash flow statement. By routinely preparing monthly cash budgets, the not-for-profit health care entity can project cash inflow/outflow or position with anticipated cash insufficiencies and surpluses. The budget should be compared each month to actual results to evaluate performance. The magnitude and timing of cash flows is much too critical to be left to chance.
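
    A minimal sketch of the monthly cash budget the authors recommend is shown below: projected receipts and disbursements grouped by operating, investing and financing activities, with a month-end comparison against actual results. All figures and category names are invented.

        # Illustrative monthly cash budget for a not-for-profit clinic (all figures are
        # invented). Receipts/disbursements are grouped by operating, investing and
        # financing activities, mirroring the year-end cash flow statement, and the
        # projection is compared with actual results for variance review.
        budget = {
            "operating": {"patient receipts": 420_000, "payroll": -310_000, "supplies": -55_000},
            "investing": {"equipment purchase": -40_000},
            "financing": {"loan repayment": -15_000},
        }
        actual = {
            "operating": {"patient receipts": 395_000, "payroll": -312_000, "supplies": -61_000},
            "investing": {"equipment purchase": -40_000},
            "financing": {"loan repayment": -15_000},
        }

        def net(cashflows):
            return sum(sum(items.values()) for items in cashflows.values())

        opening_cash = 25_000
        projected_close = opening_cash + net(budget)
        actual_close = opening_cash + net(actual)

        print(f"projected month-end cash: {projected_close:>10,.0f}")
        print(f"actual month-end cash:    {actual_close:>10,.0f}")
        print(f"variance:                 {actual_close - projected_close:>10,.0f}")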

  2. saltPAD: A New Analytical Tool for Monitoring Salt Iodization in Low Resource Settings

    Directory of Open Access Journals (Sweden)

    Nicholas M. Myers

    2016-03-01

    We created a paper test card that measures a common iodizing agent, iodate, in salt. To test the analytical metrics, usability, and robustness of the paper test card when it is used in low resource settings, the South African Medical Research Council and GroundWork performed independent validation studies of the device. The accuracy and precision metrics from both studies were comparable. In the SAMRC study, more than 90% of the test results (n=1704) were correctly classified as corresponding to adequately or inadequately iodized salt. The cards are suitable for market and household surveys to determine whether salt is adequately iodized. Further development of the cards will improve their utility for monitoring salt iodization during production.

  3. The Concept Maps as a Didactic Resource Tool of Meaningful Learning in Astronomy Themes

    Science.gov (United States)

    Silveira, Felipa Pacífico Ribeiro de Assis; Mendonça, Conceição Aparecida Soares

    2015-07-01

    This article presents the results of an investigation that sought to understand the performance of the concept map (MC) as a teaching resource that facilitates meaningful learning of scientific concepts on astronomical themes, developed with elementary school students. The methodology employed to obtain and process the data was based on a combined quantitative and qualitative approach. On the quantitative level, we designed a quasi-experimental study with a control group that did not use the MC and an experimental group that used the MC, both evaluated at the beginning and end of the process; the performance of both groups is presented in a descriptive and analytical study. In the qualitative approach, the MCs were interpreted through their structure and the meanings assigned and shared by the student during his/her presentation. The improvement in grades demonstrated that the MC made a difference in conceptual learning and in certain skills revealed by learning indicators.

  4. G-REALM: A lake/reservoir monitoring tool for drought monitoring and water resources management.

    Science.gov (United States)

    Birkett, C. M.; Ricko, M.; Beckley, B. D.; Yang, X.; Tetrault, R. L.

    2017-12-01

    G-REALM is a NASA/USDA funded operational program offering water-level products for lakes and reservoirs and these are currently derived from the NASA/CNES Topex/Jason series of satellite radar altimeters. The main stakeholder is the USDA/Foreign Agricultural Service (FAS) though many other end-users utilize the products for a variety of interdisciplinary science and operational programs. The FAS utilize the products within their CropExplorer Decision Support System (DSS) to help assess irrigation potential, and to monitor both short-term (agricultural) and longer-term (hydrological) drought conditions. There is increasing demand for a more global monitoring service that in particular, captures the variations in the smallest (1 to 100km2) reservoirs and water holdings in arid and semi-arid regions. Here, water resources are critical to both agriculture and regional security. A recent G-REALM 10-day resolution product upgrade and expansion has allowed for more accurate lake level products to be released and for a greater number of water bodies to be monitored. The next program phase focuses on the exploration of the enhanced radar altimeter data sets from the Cryosat-2 and Sentinel-3 missions with their improved spatial resolution, and the expansion of the system to the monitoring of 1,000 water bodies across the globe. In addition, a new element, the monitoring of surface water levels in wetland zones, is also being introduced. This aims to satisfy research and stakeholder requirements with respect to programs examining the links between inland fisheries catch potential and declining water levels, and to those monitoring the delicate balance between water resources, agriculture, and fisheries management in arid basins.

  5. Open source GIS based tools to improve hydrochemical water resources management in EU H2020 FREEWAT platform

    Science.gov (United States)

    Criollo, Rotman; Velasco, Violeta; Vázquez-Suñé, Enric; Nardi, Albert; Marazuela, Miguel A.; Rossetto, Rudy; Borsi, Iacopo; Foglia, Laura; Cannata, Massimiliano; De Filippis, Giovanna

    2017-04-01

    Due to the general increase in water scarcity (Steduto et al., 2012), water quantity and quality must be well characterized to ensure proper access to water resources in compliance with local and regional directives. This can be supported by tools that facilitate data management and analysis. Such analyses have to provide researchers, professionals, policy makers and users with the ability to improve the management of water resources against standard regulatory guidelines. Compliance with these guidelines (with a special focus on requirements deriving from the Groundwater Directive, GWD) requires effective monitoring, evaluation, and interpretation of a large number of physical and chemical parameters. These datasets have to be assessed and interpreted by: (i) integrating data from different sources, gathered with different data access techniques and formats; (ii) managing data with varying temporal and spatial extent; and (iii) integrating groundwater quality information with other relevant information, such as further hydrogeological data (Velasco et al., 2014), and pre-processing these data, generally for the realization of groundwater models. In this context, the Hydrochemical Analysis Tools (akvaGIS Tools) have been implemented within the H2020 FREEWAT project, which aims to support water resource management by modelling water resources in an open source GIS platform (QGIS desktop). The main goal of akvaGIS Tools is to improve water quality analysis through capabilities that improve the case-study conceptual model, managing all related data in its geospatial database (implemented in SpatiaLite), together with a set of tools for improving the harmonization, integration, standardization, visualization and interpretation of hydrochemical data. To achieve this, different commands cover a wide range of methodologies for querying, interpreting, and comparing groundwater quality data and facilitate the pre-processing analysis for
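
    As an illustration of the kind of screening the akvaGIS tools automate on their SpatiaLite database, the sketch below flags groundwater samples that exceed a regulatory threshold using plain SQLite; the table layout and the nitrate limit are assumptions, not the FREEWAT/akvaGIS schema.

        # Hedged sketch: screening groundwater quality samples against a regulatory
        # threshold with plain SQLite. The table layout and the 50 mg/L nitrate limit
        # are assumptions for illustration; they are not the FREEWAT/akvaGIS schema.
        import sqlite3

        con = sqlite3.connect(":memory:")
        con.execute("""CREATE TABLE samples (
                         well_id TEXT, sample_date TEXT, parameter TEXT, value REAL)""")
        con.executemany(
            "INSERT INTO samples VALUES (?, ?, ?, ?)",
            [("W-01", "2016-05-10", "nitrate", 62.0),
             ("W-01", "2016-11-02", "nitrate", 48.5),
             ("W-02", "2016-05-11", "nitrate", 23.1)],
        )

        NITRATE_LIMIT = 50.0  # mg/L, illustrative drinking-water style threshold
        rows = con.execute(
            """SELECT well_id, sample_date, value
                 FROM samples
                WHERE parameter = 'nitrate' AND value > ?
                ORDER BY value DESC""",
            (NITRATE_LIMIT,),
        ).fetchall()

        for well, date, value in rows:
            print(f"{well} exceeded the nitrate limit on {date}: {value} mg/L")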

  6. Activities, Animations, and Online Tools to Enable Undergraduate Student Learning of Geohazards, Climate Change, and Water Resources

    Science.gov (United States)

    Pratt-Sitaula, B. A.; Walker, B.; Douglas, B. J.; Cronin, V. S.; Funning, G.; Stearns, L. A.; Charlevoix, D.; Miller, M. M.

    2017-12-01

    The NSF-funded GEodesy Tools for Societal Issues (GETSI) project is developing teaching resources for use in introductory and majors-level courses, emphasizing a broad range of geodetic methods and data applied to societally important issues. The modules include a variety of hands-on activities, demonstrations, animations, and interactive online tools in order to facilitate student learning and engagement. A selection of these activities will be showcased at the AGU session. These activities and data analysis exercises are embedded in 4-6 units per module. Modules can take 2-3 weeks of course time total or individual units and activities can be selected and used over just 1-2 class periods. Existing modules are available online via serc.carleton.edu/getsi/ and include "Ice mass and sea level changes", "Imaging active tectonics with LiDAR and InSAR", "Measuring water resources with GPS, gravity, and traditional methods", "Surface process hazards", and "GPS, strain, and earthquakes". Modules, and their activities and demonstrations were designed by teams of faculty and content experts and underwent rigorous classroom testing and review using the process developed by the Science Education Resource Center's InTeGrate Project (serc.carleton.edu/integrate). All modules are aligned to Earth Science and Climate literacy principles. GETSI collaborating institutions are UNAVCO (which runs NSF's Geodetic Facility), Indiana University, and Mt San Antonio College. Initial funding came from NSF's TUES (Transforming Undergraduate Education in STEM). A second phase of funding from NSF IUSE (Improving Undergraduate STEM Education) is just starting and will fund another six modules (including their demonstrations, activities, and hands-on activities) as well as considerably more instructor professional development to facilitate implementation and use.

  7. 3&4D Geomodeling Applied to Mineral Resources Exploration - A New Tool for Targeting Deposits.

    Science.gov (United States)

    Royer, Jean-Jacques; Mejia, Pablo; Caumon, Guillaume; Collon-Drouaillet, Pauline

    2013-04-01

    3&4D geomodeling, a computer method for reconstructing the past deformation history of geological formations, has been used in oil and gas exploration for more than a decade to reconstruct fluid migration. It is now beginning to be applied to re-examine old, mature mining fields and new prospects. We briefly describe the basic notions, concepts and methodology of 3&4D geomodeling applied to mineral resource assessment and ore deposit modelling, pointing out the advantages, recommendations and limitations, together with the new challenges they raise. Several 3D geomodels of mining exploration sites selected across Europe, built during the EU FP7 ProMine research project, are presented as illustrative case studies. They include: (i) the Cu-Au porphyry deposits in the Hellenic Belt (Greece); (ii) the VMS in the Iberian Pyrite Belt, including the Neves Corvo deposit (Portugal); and (iii) the sediment-hosted polymetallic Cu-Ag (Au, PGE) Kupferschiefer ore deposit in the Foresudetic Belt (Poland). In each case, full 3D models using surfaces and regular grids (SGrid) were built from all datasets available from exploration and exploitation, including primary geological maps, 2D seismic cross-sections and boreholes. The level of knowledge differs from one site to another; however, the resulting 3D models were used to guide additional field and exploration work. In the case of the Kupferschiefer, a sequential restoration-decompaction (4D geomodeling) from the Upper Permian to the Cenozoic was conducted in the Lubin-Sieroszowice district of Poland. The results help in better understanding the various superimposed mineralization events which occurred through time in this copper deposit. A hydro-fracturing index was then calculated from the estimated overpressures during Late Cretaceous-Early Paleocene uplift, and seems to correlate with the copper content distribution in the ore series. These results are in agreement with an Early Paleocene

  8. Use of weighted characteristic analysis as a tool in resource assessment

    International Nuclear Information System (INIS)

    Sinding-Larsen, R.

    1979-01-01

    The paper describes a method which permits the integration of information from multiple domains in the assessment of resources. Maps displaying different attributes from multiple domains (e.g. geology, geophysics, geochemistry, remote sensing) are evaluated individually and partitioned in terms of the objective into favourable regions (aggregate of areal units) which are coded '1'; the remaining regions of uncertain significance are coded '0'. For some variables (e.g. geochemical and remote sensing) the spatial concept of 'anomalous' region, defined as any area 'higher' than its surrounding areas, can be used to delineate favourable regions. Binary surfaces generated for each of the geochemical, geological and geophysical variables available within a province of interest are evaluated within a model area and relative weights are calculated for each variable. The computed weights assign levels of importance to the component variables or characteristics of the model. Binary maps for the different domains are then combined into one map which expresses the degree of association between a given areal unit and the model, taking into account the relative weights or importance of the respective variables which characterize the model. An example of the use of weighted characteristic analysis with geological, geophysical and remote sensing variables showed the ability of the method to delineate areas with known mineralization when it was tested in a well-explored area in Norway. (author)
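
    The combination step described above can be sketched in a few lines of NumPy: binary favourability layers from several domains are multiplied by their weights and summed into a single favourability score per areal unit. The layers and weights below are invented examples, not data from the Norwegian test area.

        # Sketch of the combination step of weighted characteristic analysis: binary
        # favourability layers (1 = favourable, 0 = uncertain) from several domains are
        # weighted and summed into one favourability score per areal unit. The layers
        # and weights below are invented examples, not data from the Norwegian test area.
        import numpy as np

        # 4 x 5 grid of areal units, one binary layer per information domain.
        geology      = np.array([[1, 1, 0, 0, 0],
                                 [1, 1, 1, 0, 0],
                                 [0, 1, 1, 1, 0],
                                 [0, 0, 1, 1, 1]])
        geophysics   = np.array([[0, 1, 1, 0, 0],
                                 [1, 1, 1, 1, 0],
                                 [0, 1, 1, 0, 0],
                                 [0, 0, 0, 1, 0]])
        geochemistry = np.array([[0, 0, 1, 0, 0],
                                 [0, 1, 1, 1, 0],
                                 [0, 1, 1, 1, 0],
                                 [0, 0, 1, 0, 0]])

        layers  = np.stack([geology, geophysics, geochemistry])
        weights = np.array([0.5, 0.3, 0.2])          # relative importance per domain

        favourability = np.tensordot(weights, layers, axes=1)   # weighted sum per cell
        print(favourability)
        print("most favourable cell:", np.unravel_index(favourability.argmax(),
                                                        favourability.shape))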

  9. Education and Training Networks as a Tool for Nuclear Security Human Resource Development and Capacity Building

    International Nuclear Information System (INIS)

    Nikonov, D.

    2014-01-01

    Human Resource Development for Capacity Building for Nuclear Security:
    • Comprehensive Training Programme. Objective: to raise awareness, to fill gaps between the actual performance of personnel and the required competencies and skills, and to build up qualified instructors/trainers.
    • Promoting Nuclear Security Education. Objective: to support the development of teaching material, faculty expertise and preparedness, and the promotion of nuclear security education in collaboration with the academic and scientific community.
    Ultimate goal: to develop capabilities for supporting sustainable implementation of the international legal instruments and IAEA guidelines for nuclear security worldwide, and to foster nuclear security culture.
    Education priorities for the future:
    • Incorporate feedback from the first pilot programme into future academic activities in nuclear security;
    • Based on feedback from the pilot programme: revise the NSS12 guidance document; update educational materials and textbooks;
    • Support INSEN members which consider launching MSc programmes at their institutions;
    • Continue promoting nuclear security education as part of existing degree programmes (through certificate or concentration options);
    • Support the use of new forms of teaching and learning in nuclear security education: online e-learning degree programmes and modules; learning by experience; problem-oriented learning tailored to nuclear security functions.

  10. The Swedish MS registry – clinical support tool and scientific resource.

    Science.gov (United States)

    Hillert, J; Stawiarz, L

    2015-01-01

    The Swedish MS registry (SMSreg) is designed to assure quality health care for patients with multiple sclerosis (MS). It has been active since 2001 and web-based since 2004. It runs on government funding only and is used in all Swedish neurology departments. The SMSreg currently includes data on 14,500 of Sweden's estimated 17,500 prevalent patients with MS. One important function of SMSreg, to which participation is voluntary, is to serve as a tool for decision support and to provide an easy overview of the patient information needed at clinical visits. This is its core feature and explains why the majority of Swedish MS specialists contribute data. Another success factor for SMSreg is that entered data can be readily accessed, either through a query function into Excel format or through a set of predesigned tables and diagrams in which parameters can be selected. Recent development includes a portal allowing patients to view a summary of their registered data and to report a set of patient-reported outcomes. SMSreg data have been used in close to 100 published scientific reports. Current projects include an incidence cohort (EIMS), post-marketing cohorts of patients on novel disease-modifying drugs (IMSE), and a prevalence cohort (GEMS). As these studies combine physical sampling and questionnaire data with clinical documentation and possible linkage to other public registries, together they provide an excellent platform for integrated MS research. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  11. Environmental education and digital resources as tools to raise awareness about radioactivity

    Energy Technology Data Exchange (ETDEWEB)

    Araujo, Liderlanio de Almeida; Leite, Lucia Fernanda C. da Costa, E-mail: liderlanioalmeida@gmail.com, E-mail: lfernanda@unicap.br, E-mail: helena@unicap.br [Universidade Catolica de Pernambuco (UNICAP), Recife, PE (Brazil). Centro de Ciencias e Tecnologia. Curso de Licenciatura em Quimica; Aquino, Katia Aparecida da Silva, E-mail: aquino@ufpe.br [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Departamento de Energia Nuclear; Gazineu, Maria Helena Paranhos, E-mail: helenaparanhos@recife.ifpe.edu.br [Instituto Federal de Pernambuco (IFPE), Recife, PE (Brazil)

    2013-07-01

    The objective of this study was to investigate the knowledge and awareness of students from the Liceu de Artes e Oficios, located in Recife, Pernambuco, about radioactivity and the environment from an interdisciplinary perspective. Thirty first-year high school students participated in this activity. Initially, a questionnaire was applied to assess the students' prior knowledge of the topic. Following this stage, a lecture about radioactivity and its multiple uses for the benefit of society was presented to the students, together with a video about the history of radioactivity. A guided visit to the Museum of Radioactivity at the Universidade Federal de Pernambuco - UFPE, Brazil, was also promoted. After these activities the questionnaire was reapplied to evaluate the development of the students' knowledge. Research in textbooks and on the Internet was also carried out to evaluate the teaching resources used worldwide to study the nuclear issue. From this information a booklet indicating the benefits of radioactivity was prepared by the students and later distributed to the community. After the activities there was a clear evolution of knowledge on the subject of radioactivity. Sixty-seven percent of the students were able to perform the half-life calculations, and 81% correctly explained the definitions of α and β particles and γ radiation. Finally, 93% discussed the contributions of Pierre and Marie Curie and of Becquerel, as well as their importance in the history of radioactivity. Dynamic activities should be encouraged so that students can learn to build knowledge with autonomy and, in turn, influence the construction of a new society. (author)

  12. Environmental education and digital resources as tools to raise awareness about radioactivity

    International Nuclear Information System (INIS)

    Araujo, Liderlanio de Almeida; Leite, Lucia Fernanda C. da Costa

    2013-01-01

    The objective of this study was to investigate the knowledge and awareness of students from the Liceu de Artes e Oficios, located in Recife, Pernambuco, about radioactivity and the environment from an interdisciplinary perspective. Thirty first-year high school students participated in this activity. Initially, a questionnaire was applied to assess the students' prior knowledge of the topic. Following this stage, a lecture about radioactivity and its multiple uses for the benefit of society was presented to the students, together with a video about the history of radioactivity. A guided visit to the Museum of Radioactivity at the Universidade Federal de Pernambuco - UFPE, Brazil, was also promoted. After these activities the questionnaire was reapplied to evaluate the development of the students' knowledge. Research in textbooks and on the Internet was also carried out to evaluate the teaching resources used worldwide to study the nuclear issue. From this information a booklet indicating the benefits of radioactivity was prepared by the students and later distributed to the community. After the activities there was a clear evolution of knowledge on the subject of radioactivity. Sixty-seven percent of the students were able to perform the half-life calculations, and 81% correctly explained the definitions of α and β particles and γ radiation. Finally, 93% discussed the contributions of Pierre and Marie Curie and of Becquerel, as well as their importance in the history of radioactivity. Dynamic activities should be encouraged so that students can learn to build knowledge with autonomy and, in turn, influence the construction of a new society. (author)

  13. Lung ultrasound as a diagnostic tool for radiographically-confirmed pneumonia in low resource settings.

    Science.gov (United States)

    Ellington, Laura E; Gilman, Robert H; Chavez, Miguel A; Pervaiz, Farhan; Marin-Concha, Julio; Compen-Chang, Patricia; Riedel, Stefan; Rodriguez, Shalim J; Gaydos, Charlotte; Hardick, Justin; Tielsch, James M; Steinhoff, Mark; Benson, Jane; May, Evelyn A; Figueroa-Quintanilla, Dante; Checkley, William

    2017-07-01

    Pneumonia is a leading cause of morbidity and mortality in children worldwide; however, its diagnosis can be challenging, especially in settings where skilled clinicians or standard imaging are unavailable. We sought to determine the diagnostic accuracy of lung ultrasound when compared to radiographically-confirmed clinical pediatric pneumonia. Between January 2012 and September 2013, we consecutively enrolled children aged 2-59 months with primary respiratory complaints at the outpatient clinics, emergency department, and inpatient wards of the Instituto Nacional de Salud del Niño in Lima, Peru. All participants underwent clinical evaluation by a pediatrician and lung ultrasonography by one of three general practitioners. We also consecutively enrolled children without respiratory symptoms. Children with respiratory symptoms had a chest radiograph. We obtained ancillary laboratory testing in a subset. Final clinical diagnoses included 453 children with pneumonia, 133 with asthma, 103 with bronchiolitis, and 143 with upper respiratory infections. In total, CXR confirmed the diagnosis in 191 (42%) of 453 children with clinical pneumonia. A consolidation on lung ultrasound, which was our primary endpoint for pneumonia, had a sensitivity of 88.5%, specificity of 100%, and an area under the curve of 0.94 (95% CI 0.92-0.97) when compared to radiographically-confirmed clinical pneumonia. When any abnormality on lung ultrasound was compared to radiographically-confirmed clinical pneumonia the sensitivity increased to 92.2% and the specificity decreased to 95.2%, with an area under the curve of 0.94 (95% CI 0.91-0.96). Lung ultrasound had high diagnostic accuracy for the diagnosis of radiographically-confirmed pneumonia. Added benefits of lung ultrasound include rapid testing and high inter-rater agreement. Lung ultrasound may serve as an alternative tool for the diagnosis of pediatric pneumonia. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.

  14. Higgs Discovery

    DEFF Research Database (Denmark)

    Sannino, Francesco

    2013-01-01

    I discuss the impact of the discovery of a Higgs-like state on composite dynamics starting by critically examining the reasons in favour of either an elementary or composite nature of this state. Accepting the standard model interpretation I re-address the standard model vacuum stability within...... has been challenged by the discovery of a not-so-heavy Higgs-like state. I will therefore review the recent discovery \cite{Foadi:2012bb} that the standard model top-induced radiative corrections naturally reduce the intrinsic non-perturbative mass of the composite Higgs state towards the desired...... via first principle lattice simulations with encouraging results. The new findings show that the recent naive claims made about new strong dynamics at the electroweak scale being disfavoured by the discovery of a not-so-heavy composite Higgs are unwarranted. I will then introduce the more speculative......

  15. Evaluating the Auto-MODS Assay, a Novel Tool for Tuberculosis Diagnosis for Use in Resource-Limited Settings

    Science.gov (United States)

    Wang, Linwei; Mohammad, Sohaib H.; Li, Qiaozhi; Rienthong, Somsak; Rienthong, Dhanida; Nedsuwan, Supalert; Mahasirimongkol, Surakameth; Yasui, Yutaka

    2014-01-01

    There is an urgent need for simple, rapid, and affordable diagnostic tests for tuberculosis (TB) to combat the great burden of the disease in developing countries. The microscopic observation drug susceptibility assay (MODS) is a promising tool to fill this need, but it is not widely used due to concerns regarding its biosafety and efficiency. This study evaluated the automated MODS (Auto-MODS), which operates on principles similar to those of MODS but with several key modifications, making it an appealing alternative to MODS in resource-limited settings. In the operational setting of Chiang Rai, Thailand, we compared the performance of Auto-MODS with the gold standard liquid culture method in Thailand, mycobacterial growth indicator tube (MGIT) 960 plus the SD Bioline TB Ag MPT64 test, in terms of accuracy and efficiency in differentiating TB and non-TB samples as well as distinguishing TB and multidrug-resistant (MDR) TB samples. Sputum samples from clinically diagnosed TB and non-TB subjects across 17 hospitals in Chiang Rai were consecutively collected from May 2011 to September 2012. A total of 360 samples were available for evaluation, of which 221 (61.4%) were positive and 139 (38.6%) were negative for mycobacterial cultures according to MGIT 960. Of the 221 true-positive samples, Auto-MODS identified 212 as positive and 9 as negative (sensitivity, 95.9%; 95% confidence interval [CI], 92.4% to 98.1%). Of the 139 true-negative samples, Auto-MODS identified 135 as negative and 4 as positive (specificity, 97.1%; 95% CI, 92.8% to 99.2%). The median time to culture positivity was 10 days, with an interquartile range of 8 to 13 days for Auto-MODS. Auto-MODS is an effective and cost-sensitive alternative diagnostic tool for TB diagnosis in resource-limited settings. PMID:25378569
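
    The accuracy figures quoted above can be reproduced from the reported counts. The short sketch below computes sensitivity, specificity and exact (Clopper-Pearson) binomial confidence intervals; the choice of the exact interval method is an assumption, since the abstract does not state which method the authors used.

    ```python
    from scipy.stats import beta

    def clopper_pearson(k, n, alpha=0.05):
        """Exact two-sided binomial confidence interval for k successes out of n."""
        lo = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
        hi = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
        return lo, hi

    # Counts reported for Auto-MODS versus MGIT 960 (the reference standard).
    tp, fn = 212, 9     # of 221 culture-positive samples
    tn, fp = 135, 4     # of 139 culture-negative samples

    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    print(f"sensitivity {sens:.1%}, 95% CI {clopper_pearson(tp, tp + fn)}")
    print(f"specificity {spec:.1%}, 95% CI {clopper_pearson(tn, tn + fp)}")
    ```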

  16. A new generation of tools for search, recovery and quality evaluation of World Wide Web medical resources.

    Science.gov (United States)

    Aguillo, I

    2000-01-01

    Although the Internet is already a valuable information resource in medicine, there are important challenges to be faced before physicians and general users will have extensive access to this information. As a result of a research effort to compile a health-related Internet directory, new tools and strategies have been developed to solve key problems derived from the explosive growth of medical information on the Net and the great concern over the quality of such critical information. The current Internet search engines lack some important capabilities. We suggest using second generation tools (client-side based) able to deal with large quantities of data and to increase the usability of the records recovered. We tested the capabilities of these programs to solve health-related information problems, recognising six groups according to the kind of topics addressed: Z39.50 clients, downloaders, multisearchers, tracing agents, indexers and mappers. The evaluation of the quality of health information available on the Internet could require a large amount of human effort. A possible solution may be to use quantitative indicators based on the hypertext visibility of the Web sites. The cybermetric measures are valid for quality evaluation if they are derived from indirect peer review by experts with Web pages citing the site. The hypertext links acting as citations need to be extracted from a controlled sample of quality super-sites.

  17. Development of the GREEN (Garden Resources, Education, and Environment Nexus) Tool: An Evidence-Based Model for School Garden Integration.

    Science.gov (United States)

    Burt, Kate Gardner; Koch, Pamela; Contento, Isobel

    2017-10-01

    Researchers have established the benefits of school gardens on students' academic achievement, dietary outcomes, physical activity, and psychosocial skills, yet limited research has been conducted about how school gardens become institutionalized and sustained. Our aim was to develop a tool that captures how gardens are effectively established, integrated, and sustained in schools. We conducted a sequential, exploratory, mixed-methods study. Participants were identified with the help of Grow To Learn, the organization coordinating the New York City school garden initiative, and recruited via e-mail. A stratified, purposeful sample of 21 New York City elementary and middle schools participated in this study throughout the 2013/2014 school year. The sample was stratified by garden budget and purposeful in that each school's garden was determined to be well integrated and sustained. The processes and strategies used by school gardeners to establish well-integrated school gardens were assessed via data collected from surveys, interviews, observations, and concept mapping. Descriptive statistics as well as multidimensional scaling and hierarchical cluster analysis were used to examine the survey and concept mapping data. Qualitative data analysis consisted of thematic coding, pattern matching, explanation building and cross-case synthesis. Nineteen components within four domains of school garden integration were found through the mixed-methods concept mapping analysis. When the analyses of other data were combined, relationships between domains and components emerged. These data resulted in the development of the GREEN (Garden Resources, Education, and Environment Nexus) Tool. When schools with integrated and sustained gardens were studied, patterns emerged about how gardeners achieve institutionalization through different combinations of critical components. These patterns are best described by the GREEN Tool, the first framework to identify how to
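
    For readers unfamiliar with the analysis pipeline named above, the following sketch shows one conventional way to run multidimensional scaling followed by hierarchical clustering on concept-mapping similarity data; the random similarity matrix, the Ward linkage and the four-cluster cut are placeholders, not the study's actual data or settings.

    ```python
    import numpy as np
    from sklearn.manifold import MDS
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(0)

    # Placeholder: a symmetric similarity matrix for 19 sorted statements,
    # as produced when participants group concept-mapping items.
    n = 19
    sim = rng.random((n, n))
    sim = (sim + sim.T) / 2
    np.fill_diagonal(sim, 1.0)
    dist = 1.0 - sim                      # convert similarity to dissimilarity

    # Multidimensional scaling into two dimensions.
    coords = MDS(n_components=2, dissimilarity="precomputed",
                 random_state=0).fit_transform(dist)

    # Hierarchical (Ward) clustering of the MDS coordinates into four clusters,
    # mirroring the four domains reported above.
    clusters = fcluster(linkage(coords, method="ward"), t=4, criterion="maxclust")
    print(clusters)
    ```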

  18. 41. DISCOVERY, SEARCH, AND COMMUNICATION OF TEXTUAL KNOWLEDGE RESOURCES IN DISTRIBUTED SYSTEMS a. Discovering and Utilizing Knowledge Sources for Metasearch Knowledge Systems

    Energy Technology Data Exchange (ETDEWEB)

    Zamora, Antonio

    2008-03-18

    Advanced Natural Language Processing Tools for Web Information Retrieval, Content Analysis, and Synthesis. The goal of this SBIR was to implement and evaluate several advanced Natural Language Processing (NLP) tools and techniques to enhance the precision and relevance of search results by analyzing and augmenting search queries and by helping to organize the search output obtained from heterogeneous databases and web pages containing textual information of interest to DOE and the scientific-technical user communities in general. The SBIR investigated 1) the incorporation of spelling checkers in search applications, 2) identification of significant phrases and concepts using a combination of linguistic and statistical techniques, and 3) enhancement of the query interface and search retrieval results through the use of semantic resources, such as thesauri. A search program with a flexible query interface was developed to search reference databases with the objective of enhancing search results from web queries or queries of specialized search systems such as DOE's Information Bridge. The DOE ETDE/INIS Joint Thesaurus was processed to create a searchable database. Term frequencies and term co-occurrences were used to enhance the web information retrieval by providing algorithmically-derived objective criteria to organize relevant documents into clusters containing significant terms. A thesaurus provides an authoritative overview and classification of a field of knowledge. By organizing the results of a search using the thesaurus terminology, the output is more meaningful than when the results are just organized based on the terms that co-occur in the retrieved documents, some of which may not be significant. An attempt was made to take advantage of the hierarchy provided by broader and narrower terms, as well as other field-specific information in the thesauri. The search program uses linguistic morphological routines to find relevant entries regardless of
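
    The clustering idea described above, grouping retrieved documents by significant thesaurus terms and counting term co-occurrences, can be sketched in a few lines; the toy term list and "documents" below are stand-ins for DOE thesaurus entries and real search output, not part of the reported system.

    ```python
    from collections import Counter
    from itertools import combinations

    # Placeholder thesaurus terms and retrieved "documents".
    thesaurus = {"neutron flux", "reactor", "isotope", "spectroscopy"}
    documents = [
        "reactor neutron flux measurements",
        "isotope production in a research reactor",
        "gamma spectroscopy of isotope samples",
    ]

    # Index each document under the thesaurus terms it contains,
    # and count term co-occurrences within documents.
    clusters = {term: [] for term in thesaurus}
    cooccurrence = Counter()
    for i, doc in enumerate(documents):
        found = sorted(t for t in thesaurus if t in doc)
        for term in found:
            clusters[term].append(i)
        cooccurrence.update(combinations(found, 2))

    print(clusters)        # documents grouped by significant (thesaurus) terms
    print(cooccurrence)    # co-occurring term pairs, usable for ranking or cluster merging
    ```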

  19. THE FLAG: A Web Resource of Innovative Assessment Tools for Faculty in College Science, Mathematics, Engineering, and Technology

    Science.gov (United States)

    Zeilik, M.; Mathieu, R. D.; National InstituteScience Education; College Level-One Team

    2000-12-01

    Even the most dedicated college faculty often discover that their students fail to learn what was taught in their courses and that much of what students do learn is quickly forgotten after the final exam. To help college faculty improve student learning in college Science, Mathematics, Engineering and Technology (SMET), the College Level-One Team of the National Institute for Science Education has created the "FLAG," a Field-tested Learning Assessment Guide for SMET faculty. Developed with funding from the National Science Foundation, the FLAG presents in guidebook format a diverse and robust collection of field-tested classroom assessment techniques (CATs), with supporting information on how to apply them in the classroom. Faculty can download the tools and techniques from the website, which also provides a goals clarifier, an assessment primer, a searchable database, and links to additional resources. The CATs and tools have been reviewed by an expert editorial board and the NISE team. These assessment strategies can help faculty improve the learning environments in their SMET courses, especially the crucial introductory courses that most strongly shape students' college learning experiences. In addition, the FLAG includes the web-based Student Assessment of Learning Gains (SALG). The SALG offers a convenient way to evaluate the impact of your courses on students. It is based on findings that students' estimates of what they gained are more reliable and informative than their observations of what they liked about the course or teacher. It offers accurate feedback on how well the different aspects of teaching helped the students to learn. Students complete the SALG online after a generic template has been modified to fit the learning objectives and activities of your course. The results are presented to the teacher as summary statistics automatically. The FLAG can be found at the NISE "Innovations in SMET Education" website at www.wcer.wisc.edu/nise/cl1

  20. The Human Resources for Health Effort Index: a tool to assess and inform Strategic Health Workforce Investments.

    Science.gov (United States)

    Fort, Alfredo L; Deussom, Rachel; Burlew, Randi; Gilroy, Kate; Nelson, David

    2017-07-19

    Despite its importance, the field of human resources for health (HRH) has lagged in developing methods to measure its status and progress in low- and middle-income countries suffering a workforce crisis. Measures of professional health worker densities and distribution are purely numerical, unreliable, and do not represent the full spectrum of workers providing health services. To provide more information on the multi-dimensional characteristics of human resources for health, in 2013-2014, the global USAID-funded CapacityPlus project, led by IntraHealth International, developed and tested a 79-item HRH Effort Index modeled after the widely used Family Planning Effort Index. The index includes seven recognized HRH dimensions: Leadership and Advocacy; Policy and Governance; Finance; Education and Training; Recruitment, Distribution, and Retention; Human Resources Management; and Monitoring, Evaluation, and Information Systems. Each item is scored from 1 to 10 and scores are averaged with equal weights for each dimension and overall. The questionnaire is applied to knowledgeable informants from public, nongovernmental organization, and private sectors in each country. A pilot test among 49 respondents in Kenya and Nigeria provided useful information to improve, combine, and streamline questions. CapacityPlus applied the revised 50-item questionnaire in 2015 in Burkina Faso, Dominican Republic, Ghana, and Mali, among 92 respondents. Additionally, the index was applied subnationally in the Dominican Republic (16 respondents) and in a consensus-building meeting in Mali (43 respondents) after the national application. The results revealed a range of scores between 3.7 and 6.2 across dimensions, for overall scores between 4.8 and 5.5. Dimensions with lower scores included Recruitment, Distribution, and Retention, while Leadership and Advocacy had higher scores. The tool proved to be well understood and provided key qualitative information on the health workforce to assist
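
    A compact sketch of the scoring arithmetic described above (items scored 1 to 10, averaged with equal weight within each dimension and then across dimensions) follows; the dimension names are taken from the abstract, while the item scores are invented.

    ```python
    # Hypothetical item scores (1-10) from one respondent, keyed by HRH dimension.
    responses = {
        "Leadership and Advocacy": [7, 6, 8],
        "Policy and Governance": [5, 6],
        "Finance": [4, 5, 4],
        "Education and Training": [6, 7],
        "Recruitment, Distribution, and Retention": [3, 4, 4],
        "Human Resources Management": [5, 5, 6],
        "Monitoring, Evaluation, and Information Systems": [6, 5],
    }

    # Equal-weight average within each dimension, then across dimensions.
    dimension_scores = {d: sum(items) / len(items) for d, items in responses.items()}
    overall = sum(dimension_scores.values()) / len(dimension_scores)
    print(dimension_scores)
    print(f"overall effort score: {overall:.1f}")
    ```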

  1. Lessons learned developing a diagnostic tool for HIV-associated dementia feasible to implement in resource-limited settings: pilot testing in Kenya.

    Directory of Open Access Journals (Sweden)

    Judith Kwasa

    Full Text Available To conduct a preliminary evaluation of the utility and reliability of a diagnostic tool for HIV-associated dementia (HAD) for use by primary health care workers (HCW) which would be feasible to implement in resource-limited settings. In resource-limited settings, HAD is an indication for anti-retroviral therapy regardless of CD4 T-cell count. Anti-retroviral therapy, the treatment for HAD, is now increasingly available in resource-limited settings. Nonetheless, HAD remains under-diagnosed, likely because of limited clinical expertise and availability of diagnostic tests. Thus, a simple diagnostic tool which is practical to implement in resource-limited settings is an urgent need. A convenience sample of 30 HIV-infected outpatients was enrolled in Western Kenya. We assessed the sensitivity and specificity of a diagnostic tool for HAD as administered by a primary HCW. This was compared to an expert clinical assessment which included examination by a physician, neuropsychological testing, and in selected cases, brain imaging. Agreement between HCW and an expert examiner on certain tool components was measured using the Kappa statistic. The sample was 57% male, mean age was 38.6 years, mean CD4 T-cell count was 323 cells/µL, and 54% had less than a secondary school education. Six (20%) of the subjects were diagnosed with HAD by expert clinical assessment. The diagnostic tool was 63% sensitive and 67% specific for HAD. Agreement between HCW and expert examiners was poor for many individual items of the diagnostic tool (K = 0.03-0.65). This diagnostic tool had moderate sensitivity and specificity for HAD. However, reliability was poor, suggesting that substantial training and formal evaluations of training adequacy will be critical to enable HCW to reliably administer a brief diagnostic tool for HAD.
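
    The inter-rater agreement statistic mentioned above (kappa) is straightforward to compute; the sketch below uses scikit-learn on made-up ratings for a single tool item, purely to illustrate the calculation and not to reproduce the study data.

    ```python
    from sklearn.metrics import cohen_kappa_score

    # Hypothetical binary ratings of one diagnostic-tool item for 10 patients,
    # scored independently by a health care worker and an expert examiner.
    hcw_ratings    = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
    expert_ratings = [1, 0, 0, 1, 0, 1, 1, 0, 1, 0]

    kappa = cohen_kappa_score(hcw_ratings, expert_ratings)
    print(f"Cohen's kappa: {kappa:.2f}")  # 1 = perfect agreement, 0 = chance level
    ```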

  2. Antibody informatics for drug discovery

    DEFF Research Database (Denmark)

    Shirai, Hiroki; Prades, Catherine; Vita, Randi

    2014-01-01

    ...to the antibody science in every project in antibody drug discovery. Recent experimental technologies allow for the rapid generation of large-scale data on antibody sequences, affinity, potency, structures, and biological functions; this should accelerate drug discovery research. Therefore, a robust bioinformatic infrastructure for these large data sets has become necessary. In this article, we first identify and discuss the typical obstacles faced during the antibody drug discovery process. We then summarize the current status of three sub-fields of antibody informatics as follows: (i) recent progress in technologies for antibody rational design using computational approaches to affinity and stability improvement, as well as ab-initio and homology-based antibody modeling; (ii) resources for antibody sequences, structures, and immune epitopes and open drug discovery resources for development of antibody drugs; and (iii...

  3. Get Involved in Planetary Discoveries through New Worlds, New Discoveries

    Science.gov (United States)

    Shupla, Christine; Shipp, S. S.; Halligan, E.; Dalton, H.; Boonstra, D.; Buxner, S.; SMD Planetary Forum, NASA

    2013-01-01

    "New Worlds, New Discoveries" is a synthesis of NASA’s 50-year exploration history which provides an integrated picture of our new understanding of our solar system. As NASA spacecraft head to and arrive at key locations in our solar system, "New Worlds, New Discoveries" provides an integrated picture of our new understanding of the solar system to educators and the general public! The site combines the amazing discoveries of past NASA planetary missions with the most recent findings of ongoing missions, and connects them to the related planetary science topics. "New Worlds, New Discoveries," which includes the "Year of the Solar System" and the ongoing celebration of the "50 Years of Exploration," includes 20 topics that share thematic solar system educational resources and activities, tied to the national science standards. This online site and ongoing event offers numerous opportunities for the science community - including researchers and education and public outreach professionals - to raise awareness, build excitement, and make connections with educators, students, and the public about planetary science. Visitors to the site will find valuable hands-on science activities, resources and educational materials, as well as the latest news, to engage audiences in planetary science topics and their related mission discoveries. The topics are tied to the big questions of planetary science: how did the Sun’s family of planets and bodies originate and how have they evolved? How did life begin and evolve on Earth, and has it evolved elsewhere in our solar system? Scientists and educators are encouraged to get involved either directly or by sharing "New Worlds, New Discoveries" and its resources with educators, by conducting presentations and events, sharing their resources and events to add to the site, and adding their own public events to the site’s event calendar! Visit to find quality resources and ideas. Connect with educators, students and the public to

  4. Talking to children about their HIV status: a review of available resources, tools, and models for improving and promoting pediatric disclosure.

    Science.gov (United States)

    Wright, S; Amzel, A; Ikoro, N; Srivastava, M; Leclerc-Madlala, S; Bowsky, S; Miller, H; Phelps, B R

    2017-08-01

    As children living with HIV (CLHIV) grow into adolescence and adulthood, caregivers and healthcare providers are faced with the sensitive challenge of when to disclose to a CLHIV his or her HIV status. Despite WHO recommendations for CLHIV to know their status, in countries most affected by HIV, effective resources are often limited, and national guidance on disclosure is often lacking. To address the need for effective resources, gray and scientific literature was searched to identify existing tools and resources that can aid in the disclosure process. From peer-reviewed literature, seven disclosure models from six different countries were identified. From the gray literature, 23 resources were identified including children's books (15), job aides to assist healthcare providers (5), and videos (3). While these existing resources can be tailored to reflect local norms and used to aid in the disclosure process, careful consideration must be taken in order to avoid damaging disclosure practices.

  5. Cyclic Investigation of Geophysical Studies in the Exploration and Discovery of Natural Resources in Our Country; Uelkemizdeki Dogal Kaynaklarin Aranmasi ve Bulunmasinda Jeofizik Calismalarin Doenemsel Incelenmesi

    Energy Technology Data Exchange (ETDEWEB)

    Gonulalan, A U [TPAO, Research Department, Ankara (Turkey)

    2007-07-01

    Although the methods of exploration geophysics were first put to systematic use after the discovery of an oil field in 1921, they had also been applied in earlier centuries. Roughly half of the total production in the United States of America comes from new oil fields discovered with geophysical methods. Industry's growing energy needs continue to increase interest in oil. The investments made in geophysics by companies that earn large amounts of money from discovering new oil fields, the widespread use of computers, developments in space technology and the world-wide nuclear competition (despite its great danger for human beings) have all contributed greatly to the development of geophysics. Our country has 18 different types of mines with a potential of more than 10 billion $. Geophysical engineers have contributed great knowledge and labor to the discovery of 1,795 trillion worth of wealth, from borax to building stone, and of 60 billion $ of oil and gas. On the other hand, as the 1.5 billion invested in geophysics is only 0.08% of total investments, an increase in this investment would make a still greater contribution.

  6. Computational methods in drug discovery

    Directory of Open Access Journals (Sweden)

    Sumudu P. Leelananda

    2016-12-01

    Full Text Available The process for drug discovery and development is challenging, time consuming and expensive. Computer-aided drug discovery (CADD tools can act as a virtual shortcut, assisting in the expedition of this long process and potentially reducing the cost of research and development. Today CADD has become an effective and indispensable tool in therapeutic development. The human genome project has made available a substantial amount of sequence data that can be used in various drug discovery projects. Additionally, increasing knowledge of biological structures, as well as increasing computer power have made it possible to use computational methods effectively in various phases of the drug discovery and development pipeline. The importance of in silico tools is greater than ever before and has advanced pharmaceutical research. Here we present an overview of computational methods used in different facets of drug discovery and highlight some of the recent successes. In this review, both structure-based and ligand-based drug discovery methods are discussed. Advances in virtual high-throughput screening, protein structure prediction methods, protein–ligand docking, pharmacophore modeling and QSAR techniques are reviewed.
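
    As one concrete example of the ligand-based methods surveyed here, the sketch below ranks a few library molecules by Tanimoto similarity of Morgan fingerprints to a query ligand using RDKit; the SMILES strings, the fingerprint settings and any similarity cutoff are arbitrary illustrations rather than recommendations from the review.

    ```python
    from rdkit import Chem, DataStructs
    from rdkit.Chem import AllChem

    # Query ligand and a tiny "library" (arbitrary SMILES, for illustration only).
    query = Chem.MolFromSmiles("CC(=O)Oc1ccccc1C(=O)O")          # aspirin
    library = {
        "salicylic acid": "O=C(O)c1ccccc1O",
        "caffeine": "Cn1cnc2c1c(=O)n(C)c(=O)n2C",
        "ibuprofen": "CC(C)Cc1ccc(cc1)C(C)C(=O)O",
    }

    def fingerprint(mol):
        # 2048-bit Morgan (ECFP4-like) fingerprint.
        return AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)

    query_fp = fingerprint(query)
    scores = {
        name: DataStructs.TanimotoSimilarity(query_fp, fingerprint(Chem.MolFromSmiles(smi)))
        for name, smi in library.items()
    }

    # Rank the library by similarity; a cutoff (e.g. 0.3) would define the "hits".
    for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
        print(f"{name}: {score:.2f}")
    ```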

  7. Construction of estimated flow- and load-duration curves for Kentucky using the Water Availability Tool for Environmental Resources (WATER)

    Science.gov (United States)

    Unthank, Michael D.; Newson, Jeremy K.; Williamson, Tanja N.; Nelson, Hugh L.

    2012-01-01

    Flow- and load-duration curves were constructed from the model outputs of the U.S. Geological Survey's Water Availability Tool for Environmental Resources (WATER) application for streams in Kentucky. The WATER application was designed to access multiple geospatial datasets to generate more than 60 years of statistically based streamflow data for Kentucky. The WATER application enables a user to graphically select a site on a stream and generate an estimated hydrograph and flow-duration curve for the watershed upstream of that point. The flow-duration curves are constructed by calculating the exceedance probability of the modeled daily streamflows. User-defined water-quality criteria and (or) sampling results can be loaded into the WATER application to construct load-duration curves that are based on the modeled streamflow results. Estimates of flow and streamflow statistics were derived from TOPographically Based Hydrological MODEL (TOPMODEL) simulations in the WATER application. A modified TOPMODEL code, SDP-TOPMODEL (Sinkhole Drainage Process-TOPMODEL) was used to simulate daily mean discharges over the period of record for 5 karst and 5 non-karst watersheds in Kentucky in order to verify the calibrated model. A statistical evaluation of the model's verification simulations shows that calibration criteria, established by previous WATER application reports, were met, thus ensuring the model's ability to provide acceptably accurate estimates of discharge at gaged and ungaged sites throughout Kentucky. Flow-duration curves are constructed in the WATER application by calculating the exceedance probability of the modeled daily flow values. The flow-duration intervals are expressed as a percentage, with zero corresponding to the highest stream discharge in the streamflow record. Load-duration curves are constructed by applying the loading equation (Load = Flow*Water-quality criterion) at each flow interval.
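
    The two constructions described in this record, exceedance probabilities for the flow-duration curve and Load = Flow × criterion for the load-duration curve, can be sketched directly in numpy; the synthetic daily flows and the water-quality criterion value are assumptions for illustration and ignore unit-conversion factors.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic record of daily mean streamflow (cubic feet per second).
    daily_flow = rng.lognormal(mean=3.0, sigma=1.0, size=365 * 10)

    # Flow-duration curve: sort flows in descending order and compute the
    # exceedance probability of each value (0% = highest discharge on record).
    flows_sorted = np.sort(daily_flow)[::-1]
    n = flows_sorted.size
    exceedance_pct = 100.0 * np.arange(1, n + 1) / (n + 1)   # Weibull plotting position

    # Load-duration curve: apply Load = Flow * water-quality criterion
    # at each flow interval (the criterion value here is a placeholder).
    criterion = 0.5                                           # e.g. mg/L
    allowable_load = flows_sorted * criterion                 # shares the duration axis

    print(exceedance_pct[:5], flows_sorted[:5], allowable_load[:5])
    ```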

  8. The Budding Yeast “Saccharomyces cerevisiae” as a Drug Discovery Tool to Identify Plant-Derived Natural Products with Anti-Proliferative Properties

    Directory of Open Access Journals (Sweden)

    Bouchra Qaddouri

    2011-01-01

    Full Text Available The budding yeast Saccharomyces cerevisiae is a valuable system to study cell-cycle regulation, which is defective in cancer cells. Due to the highly conserved nature of the cell-cycle machinery between yeast and humans, yeast studies are directly relevant to anticancer-drug discovery. The budding yeast is also an excellent model system for identifying and studying antifungal compounds because of the functional conservation of fungal genes. Moreover, yeast studies have also contributed greatly to our understanding of the biological targets and modes of action of bioactive compounds. Understanding the mechanism of action of clinically relevant compounds is essential for the design of improved second-generation molecules. Here we describe our methodology for screening a library of plant-derived natural products in yeast in order to identify and characterize new compounds with anti-proliferative properties.

  9. Methods and tools to simulate the effect of economic instruments in complex water resources systems. Application to the Jucar river basin.

    Science.gov (United States)

    Lopez-Nicolas, Antonio; Pulido-Velazquez, Manuel

    2014-05-01

    The main challenge of the BLUEPRINT to safeguard Europe's water resources (EC, 2012) is to guarantee that enough good quality water is available for people's needs, the economy and the environment. In this sense, economic policy instruments such as water pricing policies and water markets can be applied to enhance efficient use of water. This paper presents a method based on hydro-economic tools to assess the effect of economic instruments on water resource systems. Hydro-economic models allow integrated analysis of water supply, demand and infrastructure operation at the river basin scale, by simultaneously combining engineering, hydrologic and economic aspects of water resources management. The method makes use of the hydro-economic simulation and optimization tools SIMGAMS and OPTIGAMS. The simulation tool SIMGAMS allocates water resources among the users according to priorities and operating rules, and evaluates the economic scarcity costs of the system by using economic demand functions. The model's objective function is designed so that the system aims to meet the operational targets (ranked according to priorities) each month while following the system operating rules. The optimization tool OPTIGAMS allocates water resources based on an economic efficiency criterion: maximizing net benefits or, alternatively, minimizing the total water scarcity and operating cost of water use. SIMGAMS allows the simulation of incentive water pricing policies based on marginal resource opportunity costs (MROC; Pulido-Velazquez et al., 2013). Storage-dependent step pricing functions are derived from the time series of MROC values at a certain reservoir in the system. These water pricing policies are defined based on water availability in the system (scarcity pricing), so that when water storage is high, the MROC is low, while low storage (drought periods) is associated with high MROC and, therefore, high prices. We also illustrate the use of OPTIGAMS to simulate the effect of ideal water
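
    The following is not the actual SIMGAMS/OPTIGAMS formulation, but a minimal linear-programming sketch of the economic-efficiency criterion described above (maximize net benefits, or equivalently minimize scarcity cost, subject to water availability), using scipy with invented demands, benefits and availability.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Two demand sites with target demands (hm3) and constant marginal benefits
    # of delivered water (EUR/hm3); all numbers are invented for illustration.
    demand = np.array([80.0, 50.0])
    benefit = np.array([300.0, 180.0])
    available = 100.0                      # water available this month (hm3)

    # Decision variables: delivery to each site. Maximizing benefit (equivalently,
    # minimizing scarcity cost) means minimizing the negative benefit.
    c = -benefit
    A_ub = [np.ones_like(demand)]          # total deliveries <= availability
    b_ub = [available]
    bounds = [(0.0, d) for d in demand]    # cannot deliver more than the demand

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    delivery = res.x
    scarcity_cost = float(benefit @ (demand - delivery))
    print("deliveries:", delivery)         # the higher-value demand is served first
    print("scarcity cost:", scarcity_cost)
    ```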

  10. Discovery Mondays

    CERN Multimedia

    2003-01-01

    Many people don't realise quite how much is going on at CERN. Would you like to gain first-hand knowledge of CERN's scientific and technological activities and their many applications? Try out some experiments for yourself, or pick the brains of the people in charge? If so, then the «Lundis Découverte» or Discovery Mondays, will be right up your street. Starting on May 5th, on every first Monday of the month you will be introduced to a different facet of the Laboratory. CERN staff, non-scientists, and members of the general public, everyone is welcome. So tell your friends and neighbours and make sure you don't miss this opportunity to satisfy your curiosity and enjoy yourself at the same time. You won't have to listen to a lecture, as the idea is to have open exchange with the expert in question and for each subject to be illustrated with experiments and demonstrations. There's no need to book, as Microcosm, CERN's interactive museum, will be open non-stop from 7.30 p.m. to 9 p.m. On the first Discovery M...

  11. The management of scarce water resources using GNSS, InSAR and in-situ micro gravity measurements as monitoring tools

    CSIR Research Space (South Africa)

    Wonnacott, R

    2015-08-01

    Full Text Available ...shown to provide a useful tool for the measurement and monitoring of ground subsidence resulting from numerous natural and anthropogenic causes, including the abstraction of groundwater and gas. Zerbini et al (2007) processed and combined data from a...

  12. The ChIP-Seq tools and web server: a resource for analyzing ChIP-seq and other types of genomic data.

    Science.gov (United States)

    Ambrosini, Giovanna; Dreos, René; Kumar, Sunil; Bucher, Philipp

    2016-11-18

    ChIP-seq and related high-throughput chromatin profiling assays generate ever increasing volumes of highly valuable biological data. To make sense out of it, biologists need versatile, efficient and user-friendly tools for access, visualization and integrative analysis of such data. Here we present the ChIP-Seq command line tools and web server, implementing basic algorithms for ChIP-seq data analysis starting with a read alignment file. The tools are optimized for memory-efficiency and speed thus allowing for processing of large data volumes on inexpensive hardware. The web interface provides access to a large database of public data. The ChIP-Seq tools have a modular and interoperable design in that the output from one application can serve as input to another one. Complex and innovative tasks can thus be achieved by running several tools in a cascade. The various ChIP-Seq command line tools and web services either complement or compare favorably to related bioinformatics resources in terms of computational efficiency, ease of access to public data and interoperability with other web-based tools. The ChIP-Seq server is accessible at http://ccg.vital-it.ch/chipseq/.

  13. Partial filling affinity capillary electrophoresis as a useful tool for fragment-based drug discovery: A proof of concept on thrombin.

    Science.gov (United States)

    Farcaş, E; Bouckaert, C; Servais, A-C; Hanson, J; Pochet, L; Fillet, M

    2017-09-01

    With the emergence of more challenging targets, a relatively new approach, fragment-based drug discovery (FBDD), proved its efficacy and gained increasing importance in the pharmaceutical industry. FBDD identifies low molecular-weight (MW) ligands (fragments) that bind to biologically important macromolecules, then a structure-guided fragment growing or merging approach is performed, contributing to the quality of the lead. However, to select the appropriate fragment to be evolved, sensitive analytical screening methods must be used to measure the affinity in the μM or even mM range. In this particular context, we developed a robust and selective partial filling affinity CE (ACE) method for the direct binding screening of a small fragment library in order to identify new thrombin inhibitors. To demonstrate the accuracy of our assay, the complex dissociation constants of three known thrombin inhibitors, namely benzamidine, p-aminobenzamidine and nafamostat were determined and found to be in good concordance with the previously reported values. Finally, the screening of a small library was performed and demonstrated the high discriminatory power of our method towards weak binders compared to classical spectrophotometric activity assay, proving the interest of our method in the context of FBDD. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Using the open Web as an information resource and scholarly Web search engines as retrieval tools for academic and research purposes

    OpenAIRE

    Filistea Naude; Chris Rensleigh; Adeline S.A. du Toit

    2010-01-01

    This study provided insight into the significance of the open Web as an information resource and Web search engines as research tools amongst academics. The academic staff establishment of the University of South Africa (Unisa) was invited to participate in a questionnaire survey and included 1188 staff members from five colleges. This study culminated in a PhD dissertation in 2008. One hundred and eighty seven respondents participated in the survey which gave a response rate of 15.7%. The re...

  15. Identifying Key Features, Cutting Edge Cloud Resources, and Artificial Intelligence Tools to Achieve User-Friendly Water Science in the Cloud

    Science.gov (United States)

    Pierce, S. A.

    2017-12-01

    Decision making for groundwater systems is becoming increasingly important, as shifting water demands increasingly impact aquifers. As buffer systems, aquifers provide room for resilient responses and augment the actual timeframe for hydrological response. Yet the pace of impacts, climate shifts, and degradation of water resources is accelerating. To meet these new drivers, groundwater science is transitioning toward the emerging field of Integrated Water Resources Management, or IWRM. IWRM incorporates a broad array of dimensions, methods, and tools to address problems that tend to be complex. Computational tools and accessible cyberinfrastructure (CI) are needed to cross the chasm between science and society. Fortunately, cloud computing environments, such as the new Jetstream system, are evolving rapidly. While still targeting scientific user groups, systems such as Jetstream offer configurable cyberinfrastructure to enable interactive computing and data analysis resources on demand. The web-based interfaces allow researchers to rapidly customize virtual machines, modify computing architecture, and increase the usability of and access to advanced compute environments for broader audiences. The result enables dexterous configurations and opens up opportunities for IWRM modelers to expand the reach of analyses, the number of case studies, and the quality of engagement with stakeholders and decision makers. The acute need to identify improved IWRM solutions, paired with advanced computational resources, refocuses the attention of IWRM researchers on applications, workflows, and intelligent systems that are capable of accelerating progress. IWRM must address key drivers of community concern, implement transdisciplinary methodologies, and adapt and apply decision support tools in order to effectively support decisions about groundwater resource management. This presentation will provide an overview of advanced computing services in the cloud using integrated groundwater management case

  16. Cogena, a novel tool for co-expressed gene-set enrichment analysis, applied to drug repositioning and drug mode of action discovery.

    Science.gov (United States)

    Jia, Zhilong; Liu, Ying; Guan, Naiyang; Bo, Xiaochen; Luo, Zhigang; Barnes, Michael R

    2016-05-27

    Drug repositioning, finding new indications for existing drugs, has gained much recent attention as a potentially efficient and economical strategy for accelerating new therapies into the clinic. Although improvement in the sensitivity of computational drug repositioning methods has identified numerous credible repositioning opportunities, few have been progressed. Arguably the "black box" nature of drug action in a new indication is one of the main blocks to progression, highlighting the need for methods that inform on the broader target mechanism in the disease context. We demonstrate that the analysis of co-expressed genes may be a critical first step towards illumination of both disease pathology and mode of drug action. We achieve this using a novel framework, co-expressed gene-set enrichment analysis (cogena), for co-expression analysis of gene expression signatures and gene set enrichment analysis of co-expressed genes. The cogena framework enables simultaneous, pathway driven, disease and drug repositioning analysis. Cogena can be used to illuminate coordinated changes within disease transcriptomes and identify drugs acting mechanistically within this framework. We illustrate this using a psoriatic skin transcriptome, as an exemplar, and recover two widely used Psoriasis drugs (Methotrexate and Ciclosporin) with distinct modes of action. Cogena outperforms the Connectivity Map and NFFinder web servers in similar disease transcriptome analyses. Furthermore, we investigated the literature support for the other top-ranked compounds to treat psoriasis and showed how the outputs of cogena analysis can contribute new insight to support the progression of drugs into the clinic. We have made cogena freely available within Bioconductor or at https://github.com/zhilongjia/cogena. In conclusion, by targeting co-expressed genes within disease transcriptomes, cogena offers novel biological insight, which can be effectively harnessed for drug discovery and
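
    The core statistical step behind this kind of co-expressed gene-set enrichment analysis can be illustrated with a hypergeometric test; the gene counts below are invented, and scipy's hypergeom stands in for the cogena R/Bioconductor package itself.

    ```python
    from scipy.stats import hypergeom

    # Invented numbers: among N measured genes, K belong to a pathway gene set;
    # a co-expression cluster contains n genes, k of which are in the gene set.
    N, K, n, k = 10000, 200, 150, 12

    # Enrichment p-value: probability of observing k or more gene-set members
    # in a cluster of size n drawn from N genes.
    p_value = hypergeom.sf(k - 1, N, K, n)
    print(f"enrichment p-value: {p_value:.3g}")
    ```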

  17. MSeqDR mvTool: A mitochondrial DNA Web and API resource for comprehensive variant annotation, universal nomenclature collation, and reference genome conversion.

    Science.gov (United States)

    Shen, Lishuang; Attimonelli, Marcella; Bai, Renkui; Lott, Marie T; Wallace, Douglas C; Falk, Marni J; Gai, Xiaowu

    2018-06-01

    Accurate mitochondrial DNA (mtDNA) variant annotation is essential for the clinical diagnosis of diverse human diseases. Substantial challenges to this process include the inconsistency in mtDNA nomenclatures, the existence of multiple reference genomes, and a lack of reference population frequency data. Clinicians need a simple bioinformatics tool that is user-friendly, and bioinformaticians need a powerful informatics resource for programmatic usage. Here, we report the development and functionality of the MSeqDR mtDNA Variant Tool set (mvTool), a one-stop mtDNA variant annotation and analysis Web service. mvTool is built upon the MSeqDR infrastructure (https://mseqdr.org), with contributions of expert curated data from MITOMAP (https://www.mitomap.org) and HmtDB (https://www.hmtdb.uniba.it/hmdb). mvTool supports all mtDNA nomenclatures, converts variants to standard rCRS- and HGVS-based nomenclatures, and annotates novel mtDNA variants. Besides generic annotations from dbNSFP and Variant Effect Predictor (VEP), mvTool provides allele frequencies in more than 47,000 germline mitogenomes, and disease and pathogenicity classifications from MSeqDR, Mitomap, HmtDB and ClinVar (Landrum et al., 2013). mvTool also provides annotations for mtDNA somatic variants. "mvTool API" is implemented for programmatic access using inputs in VCF, HGVS, or classical mtDNA variant nomenclatures. The results are reported as hyperlinked html tables, JSON, Excel, and VCF formats. MSeqDR mvTool is freely accessible at https://mseqdr.org/mvtool.php. © 2018 Wiley Periodicals, Inc.

  18. Databases and web tools for cancer genomics study.

    Science.gov (United States)

    Yang, Yadong; Dong, Xunong; Xie, Bingbing; Ding, Nan; Chen, Juan; Li, Yongjun; Zhang, Qian; Qu, Hongzhu; Fang, Xiangdong

    2015-02-01

    Publicly accessible resources have promoted the advance of scientific discovery. The era of genomics and big data has brought the need for collaboration and data sharing in order to make effective use of this new knowledge. Here, we describe the web resources for cancer genomics research and rate them on the basis of the diversity of cancer types, sample size, omics data comprehensiveness, and user experience. The resources reviewed include data repository and analysis tools; and we hope such introduction will promote the awareness and facilitate the usage of these resources in the cancer research community. Copyright © 2015 The Authors. Production and hosting by Elsevier Ltd. All rights reserved.

  19. LACO-Wiki: A land cover validation tool and a new, innovative teaching resource for remote sensing and the geosciences

    Science.gov (United States)

    See, Linda; Perger, Christoph; Dresel, Christopher; Hofer, Martin; Weichselbaum, Juergen; Mondel, Thomas; Steffen, Fritz

    2016-04-01

    The validation of land cover products is an important step in the workflow of generating a land cover map from remotely-sensed imagery. Many students of remote sensing will be given exercises on classifying a land cover map followed by the validation process. Many algorithms exist for classification, embedded within proprietary image processing software or increasingly as open source tools. However, there is little standardization for land cover validation, nor a set of open tools available for implementing this process. The LACO-Wiki tool was developed as a way of filling this gap, bringing together standardized land cover validation methods and workflows into a single portal. This includes the storage and management of land cover maps and validation data; step-by-step instructions to guide users through the validation process; sound sampling designs; an easy-to-use environment for validation sample interpretation; and the generation of accuracy reports based on the validation process. The tool was developed for a range of users including producers of land cover maps, researchers, teachers and students. The use of such a tool could be embedded within the curriculum of remote sensing courses at a university level but is simple enough for use by students aged 13-18. A beta version of the tool is available for testing at: http://www.laco-wiki.net.
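
    The accuracy report at the end of a validation workflow like the one described above typically reduces to a confusion matrix and a few summary statistics; the sketch below computes overall, producer's and user's accuracy from invented reference and map labels, and is not part of the LACO-Wiki code base.

    ```python
    import numpy as np

    classes = ["forest", "cropland", "urban"]

    # Invented validation samples: reference (interpreted) class vs. mapped class.
    reference = np.array([0, 0, 1, 1, 1, 2, 0, 1, 2, 2, 0, 1])
    mapped    = np.array([0, 0, 1, 2, 1, 2, 0, 1, 2, 1, 0, 1])

    # Confusion matrix: rows = reference class, columns = mapped class.
    k = len(classes)
    cm = np.zeros((k, k), dtype=int)
    np.add.at(cm, (reference, mapped), 1)

    overall = np.trace(cm) / cm.sum()
    producers = np.diag(cm) / cm.sum(axis=1)   # omission-error view, per reference class
    users = np.diag(cm) / cm.sum(axis=0)       # commission-error view, per mapped class

    print(cm)
    print(f"overall accuracy: {overall:.2f}")
    print(dict(zip(classes, producers.round(2))), dict(zip(classes, users.round(2))))
    ```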

  20. Creation of Novel Protein Variants with CRISPR/Cas9-Mediated Mutagenesis: Turning a Screening By-Product into a Discovery Tool.

    Directory of Open Access Journals (Sweden)

    Katherine F Donovan

    Full Text Available CRISPR/Cas9 screening has proven to be a versatile tool for genomics research. Based on unexpected results from a genome-wide screen, we developed a CRISPR/Cas9-mediated approach to mutagenesis, exploiting the allelic diversity generated by error-prone non-homologous end-joining (NHEJ to identify novel gain-of-function and drug resistant alleles of the MAPK signaling pathway genes MEK1 and BRAF. We define the parameters of a scalable technique to easily generate cell populations containing thousands of endogenous allelic variants to map gene functions. Further, these results highlight an unexpected but important phenomenon, that Cas9-induced gain-of-function alleles are an inherent by-product of normal Cas9 loss-of-function screens and should be investigated during analysis of data from large-scale positive selection screens.

  1. Managing research and surveillance projects in real-time with a novel open-source eManagement tool designed for under-resourced countries.

    Science.gov (United States)

    Steiner, Andreas; Hella, Jerry; Grüninger, Servan; Mhalu, Grace; Mhimbira, Francis; Cercamondi, Colin I; Doulla, Basra; Maire, Nicolas; Fenner, Lukas

    2016-09-01

    A software tool is developed to facilitate data entry and to monitor research projects in under-resourced countries in real-time. The eManagement tool "odk_planner" is written in the scripting languages PHP and Python. The odk_planner is lightweight and uses minimal internet resources. It was designed to be used with the open source software Open Data Kit (ODK). The users can easily configure odk_planner to meet their needs, and the online interface displays data collected from ODK forms in a graphically informative way. The odk_planner also allows users to upload pictures and laboratory results and sends text messages automatically. User-defined access rights protect data and privacy. We present examples from four field applications in Tanzania successfully using the eManagement tool: 1) clinical trial; 2) longitudinal Tuberculosis (TB) Cohort Study with a complex visit schedule, where it was used to graphically display missing case report forms, upload digitalized X-rays, and send text message reminders to patients; 3) intervention study to improve TB case detection, carried out at pharmacies: a tablet-based electronic referral system monitored referred patients, and sent automated messages to remind pharmacy clients to visit a TB Clinic; and 4) TB retreatment case monitoring designed to improve drug resistance surveillance: clinicians at four public TB clinics and lab technicians at the TB reference laboratory used a smartphone-based application that tracked sputum samples, and collected clinical and laboratory data. The user friendly, open source odk_planner is a simple, but multi-functional, Web-based eManagement tool with add-ons that helps researchers conduct studies in under-resourced countries. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  2. Relationships demand-supply of water and the rate of water shortage as tools for evaluating water resources in Colombia

    International Nuclear Information System (INIS)

    Dominguez Calle, Efrain Antonio; Gonzalo Rivera, Hebert; Vanegas, Sarmiento Raquel; Moreno, Pedro

    2008-01-01

    This paper shows updated results on Colombia's water resources and the requirements placed on them by the economic sectors. The relationship between water demand and water availability is used as an index of pressure on the water resource. This relationship is expressed through the water scarcity index, which applies constraints to water availability to account for the temporal variability of runoff and for the low water levels during the dry season of each year, and which is computed for each geographic region to characterize years of average and low runoff. Different water availability scenarios were built: one for modal runoff values and another for the 95-percent level; scenarios for 2025 were also prepared. The results call attention to problems caused by the concentration of high-density settlements and of economic sectors in regions with low water availability. The lag in infrastructure for managing a scarce, highly variable and over-pressured resource emerges as a key factor in avoiding a looming crisis in the process of water management.
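
    A minimal sketch of the demand-availability ratio behind a scarcity index of this kind follows; the figures and the pressure-category thresholds are assumptions for illustration, not the values used in the Colombian assessment.

    ```python
    # Hypothetical annual figures for one region (million cubic metres).
    water_demand = 420.0
    water_availability = 1200.0   # assumed already reduced for runoff variability / dry season

    scarcity_index = water_demand / water_availability   # fraction of supply that is used

    # Illustrative pressure categories (the thresholds are assumptions).
    if scarcity_index < 0.10:
        category = "low pressure"
    elif scarcity_index < 0.20:
        category = "moderate pressure"
    elif scarcity_index < 0.40:
        category = "high pressure"
    else:
        category = "very high pressure"

    print(f"scarcity index = {scarcity_index:.2f} ({category})")
    ```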

  3. Discovery, SAR, and Radiolabeling of Halogenated Benzimidazole Carboxamide Antagonists as Useful Tools for α4β1 Integrin Expressed on T- and B-cell Lymphomas

    Energy Technology Data Exchange (ETDEWEB)

    Carpenter, R D; Natarajan, A; Lau, E Y; Andrei, M; Solano, D M; Lightstone, F C; DeNardo, S J; Lam, K S; Kurth, M J

    2010-02-08

    The cell surface receptor α4β1 integrin is an attractive yet poorly understood target for selective diagnosis and treatment of T- and B-cell lymphomas. This report focuses on the rapid microwave preparation of medicinally pertinent benzimidazole heterocycles, structure-activity relationships (SAR) of novel halobenzimidazole carboxamide antagonists 3-6, and preliminary biological evaluation of radioiodinated agents 7, 8, and 18. The I-125 derivative 18 had good tumor uptake (12 ± 1% ID/g at 24 h; 4.5 ± 1% ID/g at 48 h) and tumor:kidney ratio (≈4:1 at 24 h; 2.5:1 at 48 h) in xenograft murine models of B-cell lymphoma. Molecular homology models of α4β1 integrin have predicted that docked halobenzimidazole carboxamides have the halogen atom in a suitable orientation for halogen-hydrogen bonding. These high affinity (≈pM binding) halogenated ligands are attractive tools for medicinal and biological use; the fluoro and iodo derivatives are potential radiodiagnostic (¹⁸F) or radiotherapeutic (¹³¹I) agents, whereas the chloro and bromo analogues could provide structural insight into integrin-ligand interactions through photoaffinity cross-linking/mass spectroscopy experiments, as well as co-crystallization X-ray studies.

  4. The changing roles of natural resource professionals: providing tools to students to teach the public about fire

    Science.gov (United States)

    Pat Stephens Williams; Brian P. Oswald; Karen Stafford; Justice Jones; David. Kulhavy

    2011-01-01

    The Arthur Temple College of Forestry and Agriculture (ATCOFA) at Stephen F. Austin State University is taking a proactive stance toward preparing forestry students to work closely with the public on fire planning in wildland-urban interface areas. ATCOFA's incorporation of the "Changing Roles" curriculum provides lessons on how natural resource managers...

  5. Using analytical tools for decision-making and program planning in natural resources: breaking the fear barrier

    Science.gov (United States)

    David L. Peterson; Daniel L. Schmoldt

    1999-01-01

    The National Park Service and other public agencies are increasing their emphasis on inventory and monitoring (I&M) programs to obtain the information needed to infer changes in resource conditions and trigger management responses. A few individuals on a planning team can develop I&M programs, although a focused workshop is more effective. Workshops are...

  6. A collaborative filtering-based approach to biomedical knowledge discovery.

    Science.gov (United States)

    Lever, Jake; Gakkhar, Sitanshu; Gottlieb, Michael; Rashnavadi, Tahereh; Lin, Santina; Siu, Celia; Smith, Maia; Jones, Martin R; Krzywinski, Martin; Jones, Steven J M; Wren, Jonathan

    2018-02-15

    The increase in publication rates makes it challenging for an individual researcher to stay abreast of all relevant research in order to find novel research hypotheses. Literature-based discovery methods make use of knowledge graphs built using text mining and can infer future associations between biomedical concepts that will likely occur in new publications. These predictions are a valuable resource for researchers to explore a research topic. Current methods for prediction are based on the local structure of the knowledge graph. A method that uses global knowledge from across the knowledge graph needs to be developed in order to make knowledge discovery a frequently used tool by researchers. We propose an approach based on the singular value decomposition (SVD) that is able to combine data from across the knowledge graph through a reduced representation. Using cooccurrence data extracted from published literature, we show that SVD performs better than the leading methods for scoring discoveries. We also show the diminishing predictive power of knowledge discovery as we compare our predictions with real associations that appear further into the future. Finally, we examine the strengths and weaknesses of the SVD approach against another well-performing system using several predicted associations. All code and results files for this analysis can be accessed at https://github.com/jakelever/knowledgediscovery. sjones@bcgsc.ca. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
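
    A minimal sketch of the SVD-based scoring idea described above: take a truncated SVD of a concept-concept co-occurrence matrix and use the low-rank reconstruction to score concept pairs that have never co-occurred. The toy matrix and the chosen rank are illustrative only; the paper builds its matrix by text mining the published literature.

```python
import numpy as np

# Toy concept-concept co-occurrence matrix (rows/columns are biomedical concepts).
cooc = np.array([
    [0, 4, 1, 0, 0],
    [4, 0, 3, 1, 0],
    [1, 3, 0, 2, 0],
    [0, 1, 2, 0, 1],
    [0, 0, 0, 1, 0],
], dtype=float)

rank = 2  # reduced representation; chosen here for illustration only
U, s, Vt = np.linalg.svd(cooc, full_matrices=False)
scores = (U[:, :rank] * s[:rank]) @ Vt[:rank, :]   # low-rank reconstruction

# Score every pair never seen together: high reconstructed values suggest
# associations that are more likely to appear in future publications.
candidates = [(i, j, scores[i, j])
              for i in range(len(cooc)) for j in range(i + 1, len(cooc))
              if cooc[i, j] == 0]
for i, j, score in sorted(candidates, key=lambda t: -t[2]):
    print(f"concepts {i}-{j}: predicted association score {score:.2f}")
```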

  7. mHealth Visual Discovery Dashboard.

    Science.gov (United States)

    Fang, Dezhi; Hohman, Fred; Polack, Peter; Sarker, Hillol; Kahng, Minsuk; Sharmin, Moushumi; al'Absi, Mustafa; Chau, Duen Horng

    2017-09-01

    We present Discovery Dashboard, a visual analytics system for exploring large volumes of time series data from mobile medical field studies. Discovery Dashboard offers interactive exploration tools and a data mining motif discovery algorithm to help researchers formulate hypotheses, discover trends and patterns, and ultimately gain a deeper understanding of their data. Discovery Dashboard emphasizes user freedom and flexibility during the data exploration process and enables researchers to do things previously challenging or impossible to do - in the web-browser and in real time. We demonstrate our system visualizing data from a mobile sensor study conducted at the University of Minnesota that included 52 participants who were trying to quit smoking.
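
    The abstract mentions a motif discovery algorithm for time series but does not specify it; the following is a generic brute-force sketch of time-series motif discovery (finding the closest pair of z-normalized, non-overlapping subsequences), included only to illustrate the concept, not Discovery Dashboard's implementation.

```python
import numpy as np

def znorm(x):
    sd = x.std()
    return (x - x.mean()) / sd if sd > 0 else x - x.mean()

def find_motif(series, window):
    """Return (distance, i, j) for the closest pair of non-overlapping subsequences."""
    n = len(series) - window + 1
    subs = [znorm(series[i:i + window]) for i in range(n)]
    best = (np.inf, None, None)
    for i in range(n):
        for j in range(i + window, n):        # skip trivial (overlapping) matches
            d = np.linalg.norm(subs[i] - subs[j])
            if d < best[0]:
                best = (d, i, j)
    return best

rng = np.random.default_rng(0)
signal = rng.normal(size=300)
signal[40:60] += 3 * np.sin(np.linspace(0, 2 * np.pi, 20))   # plant a repeated pattern
signal[200:220] += 3 * np.sin(np.linspace(0, 2 * np.pi, 20))
dist, i, j = find_motif(signal, window=20)
print(f"motif pair at {i} and {j}, distance {dist:.2f}")
```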

  8. Multiscale guidance and tools for implementing a landscape approach to resource management in the Bureau of Land Management

    Science.gov (United States)

    Carter, Sarah K.; Carr, Natasha B.; Miller, Kevin H.; Wood, David J.A.

    2017-01-19

    The Bureau of Land Management (BLM) is implementing a landscape approach to resource management (hereafter, landscape approach) to more effectively work with partners and understand the effects of management decisions. A landscape approach is a set of concepts and principles used to guide resource management when multiple stakeholders are involved and goals include diverse and sustainable social, environmental, and economic outcomes. Core principles of a landscape approach include seeking meaningful participation of diverse stakeholders, considering diverse resource values in multifunctional landscapes, acknowledging the tradeoffs needed to meet diverse objectives in the context of sustainable resource management, and addressing the complexity of social and ecological processes by embracing interdisciplinarity and considering multiple and broad spatial and temporal perspectives. In chapter 1, we outline the overall goal of this report: to provide a conceptual foundation and framework for implementing a landscape approach to resource management in the BLM, focusing on the role of multiscale natural resource monitoring and assessment information. In chapter 2, we describe a landscape approach to resource management. BLM actions taken to implement a landscape approach include a major effort to compile broad-scale data on natural resource status and condition across much of the West. These broad-scale data now provide a regional context for interpreting monitoring data collected at individual sites and informing decisions made for local projects. We also illustrate the utility of using multiscale data to understand potential effects of different resource management decisions, define relevant terms in landscape ecology, and identify spatial scales at which planning and management decisions may be evaluated. In chapter 3, we describe how the BLM Rapid Ecoregional Assessment program and Assessment, Inventory and Monitoring program may be integrated to provide the multiscale

  9. Development of a 2nd Generation Decision Support Tool to Optimize Resource and Energy Recovery for Municipal Solid Waste

    Science.gov (United States)

    In 2012, EPA’s Office of Research and Development released the MSW decision support tool (MSW-DST) to help identify strategies for more sustainable MSW management. Depending upon local infrastructure, energy grid mix, population density, and waste composition and quantity, the m...

  10. Experimental resource pulses influence social-network dynamics and the potential for information flow in tool-using crows.

    Science.gov (United States)

    St Clair, James J H; Burns, Zackory T; Bettaney, Elaine M; Morrissey, Michael B; Otis, Brian; Ryder, Thomas B; Fleischer, Robert C; James, Richard; Rutz, Christian

    2015-11-03

    Social-network dynamics have profound consequences for biological processes such as information flow, but are notoriously difficult to measure in the wild. We used novel transceiver technology to chart association patterns across 19 days in a wild population of the New Caledonian crow--a tool-using species that may socially learn, and culturally accumulate, tool-related information. To examine the causes and consequences of changing network topology, we manipulated the environmental availability of the crows' preferred tool-extracted prey, and simulated, in silico, the diffusion of information across field-recorded time-ordered networks. Here we show that network structure responds quickly to environmental change and that novel information can potentially spread rapidly within multi-family communities, especially when tool-use opportunities are plentiful. At the same time, we report surprisingly limited social contact between neighbouring crow communities. Such scale dependence in information-flow dynamics is likely to influence the evolution and maintenance of material cultures.

  11. Using the iPlant collaborative discovery environment.

    Science.gov (United States)

    Oliver, Shannon L; Lenards, Andrew J; Barthelson, Roger A; Merchant, Nirav; McKay, Sheldon J

    2013-06-01

    The iPlant Collaborative is an academic consortium whose mission is to develop an informatics and social infrastructure to address the "grand challenges" in plant biology. Its cyberinfrastructure supports the computational needs of the research community and facilitates solving major challenges in plant science. The Discovery Environment provides a powerful and rich graphical interface to the iPlant Collaborative cyberinfrastructure by creating an accessible virtual workbench that enables all levels of expertise, ranging from students to traditional biology researchers and computational experts, to explore, analyze, and share their data. By providing access to iPlant's robust data-management system and high-performance computing resources, the Discovery Environment also creates a unified space in which researchers can access scalable tools. Researchers can use available Applications (Apps) to execute analyses on their data, as well as customize or integrate their own tools to better meet the specific needs of their research. These Apps can also be used in workflows that automate more complicated analyses. This module describes how to use the main features of the Discovery Environment, using bioinformatics workflows for high-throughput sequence data as examples. © 2013 by John Wiley & Sons, Inc.

  12. Bioinformatics in translational drug discovery.

    Science.gov (United States)

    Wooller, Sarah K; Benstead-Hume, Graeme; Chen, Xiangrong; Ali, Yusuf; Pearl, Frances M G

    2017-08-31

    Bioinformatics approaches are becoming ever more essential in translational drug discovery both in academia and within the pharmaceutical industry. Computational exploitation of the increasing volumes of data generated during all phases of drug discovery is enabling key challenges of the process to be addressed. Here, we highlight some of the areas in which bioinformatics resources and methods are being developed to support the drug discovery pipeline. These include the creation of large data warehouses, bioinformatics algorithms to analyse 'big data' that identify novel drug targets and/or biomarkers, programs to assess the tractability of targets, and prediction of repositioning opportunities that use licensed drugs to treat additional indications. © 2017 The Author(s).

  13. Application of conceptual maps as didactic and pedagogical tools in the area of resources and information services

    Directory of Open Access Journals (Sweden)

    Maria Giovanna Guedes Farias

    2016-07-01

    Full Text Available Objective. Analyze the use of concept maps as didactic and pedagogical tools, based on an experiment conducted in the classroom in the disciplines Specialized Sources and Information Services of the undergraduate program in Librarianship. Method. The methodological approach is characterized by its applied, exploratory and observational nature. Data were collected from the students by applying a questionnaire, and the analysis was carried out with the help of content analysis techniques. Results. Most students see concept maps as facilitators of knowledge construction and envision their use in different contexts, such as librarian activities. Conclusions. The use of concept maps as didactic and pedagogical tools can foster in students the enhancement of learning and reflection on establishing a route for the shared construction of new knowledge, adding prior knowledge and transforming it to fill cognitive gaps.

  14. Water resources planning and modelling tools for the assessment of land use change in the Luvuvhu Catchment, South Africa

    Science.gov (United States)

    Jewitt, G. P. W.; Garratt, J. A.; Calder, I. R.; Fuller, L.

    In arid and semi-arid areas, total evaporation is a major component of the hydrological cycle, and seasonal water shortages and drought are common. In these areas, the role of land use and land use change is particularly important, and it is imperative that land and water resources are well managed. To aid efficient water management, it is useful to demonstrate how changing land use affects water resources. A convenient framework to consider this is the 'blue-water' and 'green-water' classification of Falkenmark, where green water represents water use by land and blue water represents runoff. In this study the hydrological response of nine land-use scenarios was simulated for the upper reaches of the Mutale River, an important tributary of the Luvuvhu River in South Africa. The ACRU and HYLUC land-use-sensitive hydrological models were used to investigate the change in blue and green water under the various land-use scenarios. The GIS software ArcGIS (8.3) was used to analyse available spatial data to generate the inputs required by the hydrological models. The scenarios investigated included the current land use in the catchment, an increase or decrease in forest cover, and an increase or decrease in the area irrigated. Both models predict that increasing either forestry or irrigation significantly reduces the proportion of blue water in the catchment. The predictions from the models were combined with maps of catchment land use to illustrate the changes in the distribution of green and blue water in a user-friendly manner. The use of GIS in this way is designed to enable policy-makers and managers to quickly assimilate the water resource implications of land use change.
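
    ACRU and HYLUC are full hydrological models; the sketch below only illustrates the blue/green water bookkeeping that the abstract describes, applied to per-scenario water-balance totals. The scenario names and the precipitation and evaporation figures are made up for illustration, not model output.

```python
# Illustrative blue/green water partition for land-use scenarios.
# Numbers are placeholders, not ACRU or HYLUC results.
scenarios = {
    # scenario: (annual precipitation, total evaporation) in mm
    "current":         (800.0, 560.0),
    "more_forestry":   (800.0, 640.0),
    "more_irrigation": (800.0, 620.0),
}

for name, (precip, evap) in scenarios.items():
    green = evap             # green water: evaporative use by the land cover
    blue = precip - evap     # blue water: what remains as runoff/streamflow
    print(f"{name:16s} green {green:5.0f} mm  blue {blue:5.0f} mm "
          f"({100 * blue / precip:.0f}% of rainfall)")
```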

  15. A decision-making tool for exchange transfusions in infants with severe hyperbilirubinemia in resource-limited settings.

    Science.gov (United States)

    Olusanya, B O; Iskander, I F; Slusher, T M; Wennberg, R P

    2016-05-01

    Late presentation and ineffective phototherapy account for excessive rates of avoidable exchange transfusions (ETs) in many low- and middle-income countries. Several system-based constraints sometimes limit the ability to provide timely ETs for all infants at risk of kernicterus, thus necessitating a treatment triage to optimize available resources. This article proposes a practical priority-setting model for term and near-term infants requiring ET after the first 48 h of life. The proposed model combines plasma/serum bilirubin estimation, clinical signs of acute bilirubin encephalopathy and neurotoxicity risk factors for predicting the risk of kernicterus based on available evidence in the literature.

  16. Could self-measured office blood pressure be a hypertension screening tool for limited-resources settings?

    Science.gov (United States)

    Salazar, Martin R; Espeche, Walter G; Stavile, Rodolfo N; Balbín, Eduardo; Leiva Sisnieguez, Betty C; Leiva Sisnieguez, Carlos E; March, Carlos E; Cor, Susana; Eugenio Acero, Irma; Carbajal, Horacio A

    2018-05-01

    Blood pressure (BP) was assessed by patients themselves in recently published trials. Self-measured office blood pressure (SMOBP) seems particularly interesting for regions with limited health resources. The aim of our study was to evaluate the relationship between SMOBP values and those estimated by ambulatory blood pressure monitoring (ABPM). Six hundred seventy-seven patients were evaluated using both SMOBP and ABPM. The differences between SMOBP and daytime ABPM were evaluated with a paired t-test. The correlations between SMOBP and ABPM were estimated using Pearson's r. The accuracy of SMOBP to identify abnormal ABPM was determined using the area under the ROC curve (AUC). Sensitivity, specificity, and positive and negative predictive values were calculated for different SMOBP cut-points. Using the average of three readings, systolic SMOBP was higher (3.7 (14.2) mmHg, p 95%) to identify individuals with hypertension in the ABPM; SMOBP < 130/80 mmHg reasonably discarded abnormal ABPM. In conclusion, a high proportion of individuals could be classified adequately using SMOBP, reducing the necessity of healthcare resources and supporting its utility for screening purposes.
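
    A sketch of the screening-accuracy calculations the abstract describes: sensitivity, specificity and predictive values at an SMOBP cut-point, plus AUC against the ABPM reference. The data here are synthetic and the 140 mmHg cut-point is an example, not the study's reported threshold.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 677
abnormal_abpm = rng.random(n) < 0.45                  # reference: abnormal daytime ABPM (synthetic)
# Synthetic systolic SMOBP readings, shifted upward for abnormal-ABPM patients.
smobp = rng.normal(128, 14, n) + 12 * abnormal_abpm

cutoff = 140.0                                        # example SMOBP cut-point (mmHg)
positive = smobp >= cutoff
tp = np.sum(positive & abnormal_abpm)
fp = np.sum(positive & ~abnormal_abpm)
fn = np.sum(~positive & abnormal_abpm)
tn = np.sum(~positive & ~abnormal_abpm)

print(f"sensitivity {tp / (tp + fn):.2f}  specificity {tn / (tn + fp):.2f}")
print(f"PPV {tp / (tp + fp):.2f}  NPV {tn / (tn + fn):.2f}")
print(f"AUC {roc_auc_score(abnormal_abpm, smobp):.2f}")
```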

  17. Local breast cancer spatial patterning: a tool for community health resource allocation to address local disparities in breast cancer mortality.

    Directory of Open Access Journals (Sweden)

    Dana M Brantley-Sieders

    Full Text Available Despite available demographic data on the factors that contribute to breast cancer mortality in large population datasets, local patterns are often overlooked. Such local information could provide a valuable metric by which regional community health resources can be allocated to reduce breast cancer mortality. We used national and statewide datasets to assess geographical distribution of breast cancer mortality rates and known risk factors influencing breast cancer mortality in middle Tennessee. Each county in middle Tennessee, and each ZIP code within metropolitan Davidson County, was scored for risk factor prevalence and assigned quartile scores that were used as a metric to identify geographic areas of need. While breast cancer mortality often correlated with age and incidence, geographic areas were identified in which breast cancer mortality rates did not correlate with age and incidence, but correlated with additional risk factors, such as mammography screening and socioeconomic status. Geographical variability in specific risk factors was evident, demonstrating the utility of this approach to identify local areas of risk. This method revealed local patterns in breast cancer mortality that might otherwise be overlooked in a more broadly based analysis. Our data suggest that understanding the geographic distribution of breast cancer mortality, and the distribution of risk factors that contribute to breast cancer mortality, will not only identify communities with the greatest need of support, but will identify the types of resources that would provide the most benefit to reduce breast cancer mortality in the community.

  18. Using the open Web as an information resource and scholarly Web search engines as retrieval tools for academic and research purposes

    Directory of Open Access Journals (Sweden)

    Filistea Naude

    2010-08-01

    Full Text Available This study provided insight into the significance of the open Web as an information resource and Web search engines as research tools amongst academics. The academic staff establishment of the University of South Africa (Unisa was invited to participate in a questionnaire survey and included 1188 staff members from five colleges. This study culminated in a PhD dissertation in 2008. One hundred and eighty seven respondents participated in the survey which gave a response rate of 15.7%. The results of this study show that academics have indeed accepted the open Web as a useful information resource and Web search engines as retrieval tools when seeking information for academic and research work. The majority of respondents used the open Web and Web search engines on a daily or weekly basis to source academic and research information. The main obstacles presented by using the open Web and Web search engines included lack of time to search and browse the Web, information overload, poor network speed and the slow downloading speed of webpages.

  19. Using the open Web as an information resource and scholarly Web search engines as retrieval tools for academic and research purposes

    Directory of Open Access Journals (Sweden)

    Filistea Naude

    2010-12-01

    Full Text Available This study provided insight into the significance of the open Web as an information resource and Web search engines as research tools amongst academics. The academic staff establishment of the University of South Africa (Unisa was invited to participate in a questionnaire survey and included 1188 staff members from five colleges. This study culminated in a PhD dissertation in 2008. One hundred and eighty seven respondents participated in the survey which gave a response rate of 15.7%. The results of this study show that academics have indeed accepted the open Web as a useful information resource and Web search engines as retrieval tools when seeking information for academic and research work. The majority of respondents used the open Web and Web search engines on a daily or weekly basis to source academic and research information. The main obstacles presented by using the open Web and Web search engines included lack of time to search and browse the Web, information overload, poor network speed and the slow downloading speed of webpages.

  20. Gas reserves, discoveries and production

    International Nuclear Information System (INIS)

    Saniere, A.

    2006-01-01

    Between 2000 and 2004, new discoveries, located mostly in the Asia/Pacific region, permitted a 71% replacement rate of produced reserves. The Middle East and the offshore sector represent a growing proportion of world gas production. Non-conventional gas resources are substantial but are not exploited to any significant extent, except in the United States, where they account for 30% of U.S. gas production. (author)

  1. Measurement tools of resource use and quality of life in clinical trials for dementia or cognitive impairment interventions: protocol for a scoping review.

    Science.gov (United States)

    Yang, Fan; Dawes, Piers; Leroi, Iracema; Gannon, Brenda

    2017-01-26

    Dementia and cognitive impairment can severely impact patients' lives and place a heavy burden on patients, caregivers and societies. Some interventions are suggested for older patients with these conditions to help them live well, but economic evaluation is needed to assess the cost-effectiveness of these interventions. Trial-based economic evaluation is an ideal method; however, little is known about the tools used to collect data on resource use and quality of life alongside the trials. Therefore, the aim of this review is to identify and describe the resource use and quality of life instruments used in clinical trials of interventions for older patients with dementia or cognitive impairment. We will perform a search in the main electronic databases (Ovid MEDLINE, PsycINFO, EMBASE, CINAHL, Cochrane Database of Systematic Reviews, Web of Science and Scopus) using the key terms or their synonyms: older, dementia, cognitive impairment, cost, quality of life, intervention and tools. After removing duplicates, two independent reviewers will screen each entry for eligibility, initially by title and abstract, then by full text. A hand search of the references of included articles and a general search, e.g. Google Scholar, will also be conducted to identify potentially relevant studies. All disagreements will be resolved by discussion or consultation with a third reviewer if necessary. Data analysis will be completed and reported in a narrative review. This review will identify the instruments used in clinical trials to collect resource use and quality of life data for dementia or cognitive impairment interventions. This will help to guide the study design of future trial-based economic evaluations of these interventions. PROSPERO CRD42016038495.

  2. Computational tools and resources for metabolism-related property predictions. 1. Overview of publicly available (free and commercial) databases and software.

    Science.gov (United States)

    Peach, Megan L; Zakharov, Alexey V; Liu, Ruifeng; Pugliese, Angelo; Tawa, Gregory; Wallqvist, Anders; Nicklaus, Marc C

    2012-10-01

    Metabolism has been identified as a defining factor in drug development success or failure because of its impact on many aspects of drug pharmacology, including bioavailability, half-life and toxicity. In this article, we provide an outline and descriptions of the resources for metabolism-related property predictions that are currently either freely or commercially available to the public. These resources include databases with data on, and software for prediction of, several end points: metabolite formation, sites of metabolic transformation, binding to metabolizing enzymes and metabolic stability. We attempt to place each tool in historical context and describe, wherever possible, the data it was based on. For predictions of interactions with metabolizing enzymes, we show a typical set of results for a small test set of compounds. Our aim is to give a clear overview of the areas and aspects of metabolism prediction in which the currently available resources are useful and accurate, and the areas in which they are inadequate or missing entirely.

  3. Comparison of three web-scale discovery services for health sciences research*

    Science.gov (United States)

    Hanneke, Rosie; O'Brien, Kelly K.

    2016-01-01

    Objective The purpose of this study was to investigate the relative effectiveness of three web-scale discovery (WSD) tools in answering health sciences search queries. Methods Simple keyword searches, based on topics from six health sciences disciplines, were run at multiple real-world implementations of EBSCO Discovery Service (EDS), Ex Libris's Primo, and ProQuest's Summon. Each WSD tool was evaluated in its ability to retrieve relevant results and in its coverage of MEDLINE content. Results All WSD tools returned between 50%–60% relevant results. Primo returned a higher number of duplicate results than the other 2 WSD products. Summon results were more relevant when search terms were automatically mapped to controlled vocabulary. EDS indexed the largest number of MEDLINE citations, followed closely by Summon. Additionally, keyword searches in all 3 WSD tools retrieved relevant material that was not found with precision (Medical Subject Headings) searches in MEDLINE. Conclusions None of the 3 WSD products studied was overwhelmingly more effective in returning relevant results. While difficult to place the figure of 50%–60% relevance in context, it implies a strong likelihood that the average user would be able to find satisfactory sources on the first page of search results using a rudimentary keyword search. The discovery of additional relevant material beyond that retrieved from MEDLINE indicates WSD tools' value as a supplement to traditional resources for health sciences researchers. PMID:27076797

  4. Comparison of three web-scale discovery services for health sciences research.

    Science.gov (United States)

    Hanneke, Rosie; O'Brien, Kelly K

    2016-04-01

    The purpose of this study was to investigate the relative effectiveness of three web-scale discovery (WSD) tools in answering health sciences search queries. Simple keyword searches, based on topics from six health sciences disciplines, were run at multiple real-world implementations of EBSCO Discovery Service (EDS), Ex Libris's Primo, and ProQuest's Summon. Each WSD tool was evaluated in its ability to retrieve relevant results and in its coverage of MEDLINE content. All WSD tools returned between 50%-60% relevant results. Primo returned a higher number of duplicate results than the other 2 WSD products. Summon results were more relevant when search terms were automatically mapped to controlled vocabulary. EDS indexed the largest number of MEDLINE citations, followed closely by Summon. Additionally, keyword searches in all 3 WSD tools retrieved relevant material that was not found with precision (Medical Subject Headings) searches in MEDLINE. None of the 3 WSD products studied was overwhelmingly more effective in returning relevant results. While difficult to place the figure of 50%-60% relevance in context, it implies a strong likelihood that the average user would be able to find satisfactory sources on the first page of search results using a rudimentary keyword search. The discovery of additional relevant material beyond that retrieved from MEDLINE indicates WSD tools' value as a supplement to traditional resources for health sciences researchers.

  5. Utility of a dermatology interest group blog: the impact of medical student interest groups and Web 2.0 tools as educational resources.

    Science.gov (United States)

    Jalalat, Sheila Z; Wagner, Richard F

    2014-01-01

    The open access University of Texas Dermatology Interest Group blog was established in 2004 for the purposes of increasing communication and collaboration between medical students and dermatology faculty, residents, and alumni, as well as to promote educational opportunities and the missions for which the interest group was created. This blog is unique because of its longevity and continuous postings directed toward the educational and professional needs of medical students and residents. A blog user survey was performed to assess viewers' thoughts, purpose of viewing, demographic profile, subscriber status, usage of the blog and other Web 2.0 tools (forums, Facebook, blogs, Twitter, podcasts), and perceived usefulness. Sixty-one anonymous online surveys were completed during a 1-month period. Statistical analyses of the responses demonstrated that the utilization of web-based tools and the blog were valuable resources for students, especially for blog subscribers, those more involved in an interest group, and those reading the blog for a longer period of time. The usefulness and impact of this method of communication and dissemination of information in medical education may encourage other student groups, faculty advisors, and educators to implement similar educational tools at their institutions.

  6. Utility of a dermatology interest group blog: the impact of medical student interest groups and Web 2.0 tools as educational resources

    Directory of Open Access Journals (Sweden)

    Jalalat SZ

    2014-09-01

    Full Text Available Sheila Z Jalalat, Richard F Wagner Jr Department of Dermatology, University of Texas Medical Branch, Galveston, TX, USA Abstract: The open access University of Texas Dermatology Interest Group blog was established in 2004 for the purposes of increasing communication and collaboration between medical students and dermatology faculty, residents, and alumni, as well as to promote educational opportunities and the missions for which the interest group was created. This blog is unique because of its longevity and continuous postings directed toward the educational and professional needs of medical students and residents. A blog user survey was performed to assess viewers' thoughts, purpose of viewing, demographic profile, subscriber status, usage of the blog and other Web 2.0 tools (forums, Facebook, blogs, Twitter, podcasts), and perceived usefulness. Sixty-one anonymous online surveys were completed during a 1-month period. Statistical analyses of the responses demonstrated that the utilization of web-based tools and the blog were valuable resources for students, especially for blog subscribers, those more involved in an interest group, and those reading the blog for a longer period of time. The usefulness and impact of this method of communication and dissemination of information in medical education may encourage other student groups, faculty advisors, and educators to implement similar educational tools at their institutions. Keywords: education, medical student, dermatology, blog

  7. Regional scenario building as a tool to support vulnerability assessment of food & water security and livelihood conditions under varying natural resources managements

    Science.gov (United States)

    Reinhardt, Julia; Liersch, Stefan; Dickens, Chris; Kabaseke, Clovis; Mulugeta Lemenih, Kassaye; Sghaier, Mongi; Hattermann, Fred

    2013-04-01

    state and availability of natural resources. Major concerns in all case studies (CS) are the fast-growing populations and natural resource degradation caused by unsustainable natural resource management. Land use and resource competition are a consequence of unclear land tenure systems and limited resource availability. Scarce rainfall with high annual variability causes food insecurity if yield failures cannot be compensated, e.g. because of a lack of financial resources. In all case studies, critical uncertainties were identified to be more or less related to "poor governance". A lack of governmental and political stability and effectiveness, as well as corruption, hampers the implementation of laws and policies related to natural resource management. Other critical uncertainties lie in the social domain. They are either related to demographic patterns, such as emigration or immigration varying the pressure on natural resource use, or to society in general, such as the evolution of people's environmental awareness or voice and accountability. A methodological outcome of the scenario building was that the complexity of the process requires reliable and powerful tools to support the communication process. Concept maps were found to be a useful tool in this regard.

  8. ONLINE SOCIAL NETWORKS AS A TOOL FOR THE PROMOTION OF PHYSICAL ACTIVITY AND HEALTH: A RESOURCE SCIENTIFICALLY FEW EXPLORED

    Directory of Open Access Journals (Sweden)

    Arían Ramón Aladro Gonzalvo

    2015-06-01

    Full Text Available Due to the great impact that online networks exert on society, it is crucial to know the features that distinguish online social networks that bring together users interested in receiving information and resources to improve or maintain physical fitness. This article comments on the limited research devoted to studying the features and particularities of online communities that provide information, advice and support in the execution, performance and promotion of health and fitness activities. In particular, it underlines the need to understand network structure, user profiles and peer-to-peer interaction, types of membership, mechanisms of communication, representation of body image and patterns of association, as well as the size of support networks, telepresence, technology acceptance and perceived risk on the network. We also recommend exploring two fitness-related online social networks. Finally, recurring problems are identified in analyses that characterize the psychosocial and communicative aspects of users in the virtual environment.

  9. Assessing the Financial Value of Human Resource Management Programs and Employee Behaviors: A Critical Tool Still Coming of Age

    Directory of Open Access Journals (Sweden)

    Aharon Tziner

    2015-11-01

    Full Text Available This paper highlights investigations into several aspects of the field of economic assessment of human resource management strategies and worker organizational behaviors, both classic and recent. We present the reader with both an historical overview and a review of conceptual and practical developments in this field. It is important to emphasize the influence of the early studies since later financial assessment models were built on the earlier paradigms. The basic thrust of this effort is to encourage the greater employment by managers of quantitative models that allow decision makers to generate all the factors needed to estimate real financial gains and/or losses before any intervention strategy is implemented in the workplace. As indicated, the use of these quantitative models to estimate the net financial gains of using particular intervention strategies or the value of certain types of worker behaviors, can ultimately save companies from making gross tactical errors and, more positively, can assist management in promoting the organization’s long-term economic goals with all the incumbent rewards.

  10. A plant resource and experiment management system based on the Golm Plant Database as a basic tool for omics research

    Directory of Open Access Journals (Sweden)

    Selbig Joachim

    2008-05-01

    Full Text Available Abstract Background For omics experiments, detailed characterisation of experimental material with respect to its genetic features, its cultivation history and its treatment history is a requirement for analyses by bioinformatics tools and for publication needs. Furthermore, meta-analysis of several experiments in systems biology based approaches make it necessary to store this information in a standardised manner, preferentially in relational databases. In the Golm Plant Database System, we devised a data management system based on a classical Laboratory Information Management System combined with web-based user interfaces for data entry and retrieval to collect this information in an academic environment. Results The database system contains modules representing the genetic features of the germplasm, the experimental conditions and the sampling details. In the germplasm module, genetically identical lines of biological material are generated by defined workflows, starting with the import workflow, followed by further workflows like genetic modification (transformation, vegetative or sexual reproduction. The latter workflows link lines and thus create pedigrees. For experiments, plant objects are generated from plant lines and united in so-called cultures, to which the cultivation conditions are linked. Materials and methods for each cultivation step are stored in a separate ACCESS database of the plant cultivation unit. For all cultures and thus every plant object, each cultivation site and the culture's arrival time at a site are logged by a barcode-scanner based system. Thus, for each plant object, all site-related parameters, e.g. automatically logged climate data, are available. These life history data and genetic information for the plant objects are linked to analytical results by the sampling module, which links sample components to plant object identifiers. This workflow uses controlled vocabulary for organs and treatments. Unique

  11. Evaluation of a Broad-Spectrum Partially Automated Adverse Event Surveillance System: A Potential Tool for Patient Safety Improvement in Hospitals With Limited Resources.

    Science.gov (United States)

    Saikali, Melody; Tanios, Alain; Saab, Antoine

    2017-11-21

    The aim of the study was to evaluate the sensitivity and resource efficiency of a partially automated adverse event (AE) surveillance system for routine patient safety efforts in hospitals with limited resources. Twenty-eight automated triggers from the hospital information system's clinical and administrative databases identified cases that were then filtered by exclusion criteria per trigger and then reviewed by an interdisciplinary team. The system, developed and implemented using in-house resources, was applied for 45 days of surveillance, for all hospital inpatient admissions (N = 1107). Each trigger was evaluated for its positive predictive value (PPV). Furthermore, the sensitivity of the surveillance system (overall and by AE category) was estimated relative to incidence ranges in the literature. The surveillance system identified a total of 123 AEs among 283 reviewed medical records, yielding an overall PPV of 52%. The tool showed variable levels of sensitivity across and within AE categories when compared with the literature, with a relatively low overall sensitivity estimated between 21% and 44%. Adverse events were detected in 23 of the 36 AE categories defined by an established harm classification system. Furthermore, none of the detected AEs were voluntarily reported. The surveillance system showed variable sensitivity levels across a broad range of AE categories with an acceptable PPV, overcoming certain limitations associated with other harm detection methods. The number of cases captured was substantial, and none had been previously detected or voluntarily reported. For hospitals with limited resources, this methodology provides valuable safety information from which interventions for quality improvement can be formulated.
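
    A sketch of the per-trigger positive predictive value arithmetic that such an evaluation relies on (confirmed adverse events divided by records flagged by the trigger after exclusion criteria). The trigger names and counts below are invented for illustration and are not the study's data.

```python
# Per-trigger positive predictive value: confirmed AEs / records flagged by the trigger.
triggers = {
    # trigger: (records flagged after exclusion criteria, confirmed adverse events)
    "naloxone_administered":    (14, 9),
    "return_to_operating_room": (8, 5),
    "inr_above_6":              (21, 7),
}

total_flagged = total_confirmed = 0
for name, (flagged, confirmed) in triggers.items():
    total_flagged += flagged
    total_confirmed += confirmed
    print(f"{name:25s} PPV = {confirmed}/{flagged} = {confirmed / flagged:.0%}")

print(f"overall PPV = {total_confirmed}/{total_flagged} = {total_confirmed / total_flagged:.0%}")
```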

  12. International use of an academic nephrology World Wide Web site: from medical information resource to business tool.

    Science.gov (United States)

    Abbott, Kevin C; Oliver, David K; Boal, Thomas R; Gadiyak, Grigorii; Boocks, Carl; Yuan, Christina M; Welch, Paul G; Poropatich, Ronald K

    2002-04-01

    Studies of the use of the World Wide Web to obtain medical knowledge have largely focused on patients. In particular, neither the international use of academic nephrology World Wide Web sites (websites) as primary information sources nor the use of search engines (and search strategies) to obtain medical information have been described. Visits ("hits") to the Walter Reed Army Medical Center (WRAMC) Nephrology Service website from April 30, 2000, to March 14, 2001, were analyzed for the location of originating source using Webtrends, and search engines (Google, Lycos, etc.) were analyzed manually for search strategies used. From April 30, 2000 to March 14, 2001, the WRAMC Nephrology Service website received 1,007,103 hits and 12,175 visits. These visits were from 33 different countries, and the most frequent regions were Western Europe, Asia, Australia, the Middle East, Pacific Islands, and South America. The most frequent organization using the site was the military Internet system, followed by America Online and automated search programs of online search engines, most commonly Google. The online lecture series was the most frequently visited section of the website. Search strategies used in search engines were extremely technical. The use of "robots" by standard Internet search engines to locate websites, which may be blocked by mandatory registration, has allowed users worldwide to access the WRAMC Nephrology Service website to answer very technical questions. This suggests that it is being used as an alternative to other primary sources of medical information and that the use of mandatory registration may hinder users from finding valuable sites. With current Internet technology, even a single service can become a worldwide information resource without sacrificing its primary customers.

  13. WEB-QUESTS IN THE ENGLISH LANGUAGE STUDYING AND TEACHING AS A VALUABLE RESOURCE AND EFFECTIVE TOOL

    Directory of Open Access Journals (Sweden)

    K. M. Pererva

    2015-05-01

    Full Text Available Purpose. This paper is a study of innovative methods of learning and teaching English with the help of Internet resources and students motivation to seek the necessary information at homework. Methodology. The main principle of the Web-Quest as a type of English language teaching is to motivate students. For example, by participation in the Web-Quest students, who were unsure of their knowledge, become more confident. Having clear goals and objectives, using computer skills, motivated young people more actively acts as a confident user of English. Findings. According to the technology of We-Quests students were asked to create one or more projects directly related to the successful execution of the work. It is a significant result of all the hard work of students, and it is the subject of evaluation. Evaluation is an essential component of Web-Quest or any other project, and from this point of view, the criteria should be clear and accessible to students from the very beginning. These instructions can and should be changed in order to differentiate and provide an oral presentation and written work. Originality. Basically, Web-Quests are mini-projects in which a higher percentage of the material obtained from the Internet. They can be created by teachers or students, depending on the type of training work. The author detailed the increase of possibilities in the search of Internet projects with other creative types of student work. They may include: review of the literature, essay writing, discussion of read works and other. Practical value. The paper confirmed that the roles and tasks, reflecting the real world, invites to cooperate, stimulate and train the thinking process at a higher level. That is why the use of Web-Quests can improve the language skills of the educational process (reading for information extraction, detailed reading, negotiations, oral and written communication, and other.

  14. Developing Process Maps as a Tool for a Surgical Infection Prevention Quality Improvement Initiative in Resource-Constrained Settings.

    Science.gov (United States)

    Forrester, Jared A; Koritsanszky, Luca A; Amenu, Demisew; Haynes, Alex B; Berry, William R; Alemu, Seifu; Jiru, Fekadu; Weiser, Thomas G

    2018-06-01

    Surgical infections cause substantial morbidity and mortality in low-and middle-income countries (LMICs). To improve adherence to critical perioperative infection prevention standards, we developed Clean Cut, a checklist-based quality improvement program to improve compliance with best practices. We hypothesized that process mapping infection prevention activities can help clinicians identify strategies for improving surgical safety. We introduced Clean Cut at a tertiary hospital in Ethiopia. Infection prevention standards included skin antisepsis, ensuring a sterile field, instrument decontamination/sterilization, prophylactic antibiotic administration, routine swab/gauze counting, and use of a surgical safety checklist. Processes were mapped by a visiting surgical fellow and local operating theater staff to facilitate the development of contextually relevant solutions; processes were reassessed for improvements. Process mapping helped identify barriers to using alcohol-based hand solution due to skin irritation, inconsistent administration of prophylactic antibiotics due to variable delivery outside of the operating theater, inefficiencies in assuring sterility of surgical instruments through lack of confirmatory measures, and occurrences of retained surgical items through inappropriate guidelines, staffing, and training in proper routine gauze counting. Compliance with most processes improved significantly following organizational changes to align tasks with specific process goals. Enumerating the steps involved in surgical infection prevention using a process mapping technique helped identify opportunities for improving adherence and plotting contextually relevant solutions, resulting in superior compliance with antiseptic standards. Simplifying these process maps into an adaptable tool could be a powerful strategy for improving safe surgery delivery in LMICs. Copyright © 2018 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  15. Applying genetics in inflammatory disease drug discovery

    DEFF Research Database (Denmark)

    Folkersen, Lasse; Biswas, Shameek; Frederiksen, Klaus Stensgaard

    2015-01-01

    , with several notable exceptions, the journey from a small-effect genetic variant to a functional drug has proven arduous, and few examples of actual contributions to drug discovery exist. Here, we discuss novel approaches of overcoming this hurdle by using instead public genetics resources as a pragmatic guide...... alongside existing drug discovery methods. Our aim is to evaluate human genetic confidence as a rationale for drug target selection....

  16. Technical summaries of Scotian Shelf - significant and commercial discoveries

    International Nuclear Information System (INIS)

    Dickey, J.E.; Bigelow, S.F.; Edens, J.A.; Brown, D.E.; Smith, B.; Makrides, C.; Mader, R.

    1997-03-01

    An independent assessment of the recoverable hydrocarbon resource currently held under 'Significant and Commercial Discovery' status offshore Nova Scotia was presented. A generalized description of the regulatory issues regarding the discovered resources within the Scotian Basin was included. Twenty discoveries have been declared significant and two have been declared commercial, pursuant to the Canada-Nova Scotia Offshore Petroleum Resources Accord Implementation Acts. Salient facts about each discovery were documented. The information included the wells drilled within the structure, significant flow tests, geological and geophysical attributes, structural cross-section and areal extent, petrophysical parameters, hydrocarbons in place and anticipated hydrocarbon recoverable resource. tabs., figs

  17. Usability of Discovery Portals

    OpenAIRE

    Bulens, J.D.; Vullings, L.A.E.; Houtkamp, J.M.; Vanmeulebrouk, B.

    2013-01-01

    As INSPIRE progresses to be implemented in the EU, many new discovery portals are built to facilitate finding spatial data. Currently the structure of the discovery portals is determined by the way spatial data experts like to work. However, we argue that the main target group for discovery portals are not spatial data experts but professionals with limited spatial knowledge, and a focus outside the spatial domain. An exploratory usability experiment was carried out in which three discovery p...

  18. Usability of Discovery Portals

    NARCIS (Netherlands)

    Bulens, J.D.; Vullings, L.A.E.; Houtkamp, J.M.; Vanmeulebrouk, B.

    2013-01-01

    As INSPIRE progresses to be implemented in the EU, many new discovery portals are built to facilitate finding spatial data. Currently the structure of the discovery portals is determined by the way spatial data experts like to work. However, we argue that the main target group for discovery portals

  19. Discovery and the atom

    International Nuclear Information System (INIS)

    1989-01-01

    ''Discovery and the Atom'' tells the story of the founding of nuclear physics. This programme looks at nuclear physics up to the discovery of the neutron in 1932. Animation explains the science of the classic experiments, such as the scattering of alpha particles by Rutherford and the discovery of the nucleus. Archive film shows the people: Lord Rutherford, James Chadwick, Marie Curie. (author)

  20. The power tool

    International Nuclear Information System (INIS)

    HAYFIELD, J.P.

    1999-01-01

    POWER Tool--Planning, Optimization, Waste Estimating and Resourcing tool, a hand-held field estimating unit and relational database software tool for optimizing disassembly and final waste form of contaminated systems and equipment

  1. Web Resources and Tools for Slovenian with a Focus on the Slovenian-English Language Infrastructure: Dictionaries in the Digital Age

    Directory of Open Access Journals (Sweden)

    Mojca Šorli

    2017-12-01

    Full Text Available The article begins with a presentation of a selection of electronic monolingual and bi/multilingual lexicographic resources and corpora available today to contemporary users of Slovene. The focus is on works combined with English and designed for translation purposes which provide information on the meaning of words and wider lexical units, i.e., e-dictionaries, lexical databases, web translation tools and various corpora. In a separate sub-section the most common translation technologies are presented, together with an evaluation of their role in the modern translation process. Sections 2 and 3 provide a brief outline of the changes that have affected classical dictionary planning, compilation and use in the new digital environment, as well as of the relationship between dictionaries and related resources, such as lexical databases. Some stereotypes regarding dictionary use are identified and, in conclusion, the existing corpus-based databases for the Slovenian-English pair are presented, with a view to determining priorities for the future interlingual infrastructure action plans in Slovenia.

  2. Applied metabolomics in drug discovery.

    Science.gov (United States)

    Cuperlovic-Culf, M; Culf, A S

    2016-08-01

    The metabolic profile is a direct signature of phenotype and biochemical activity following any perturbation. Metabolites are small molecules present in a biological system including natural products as well as drugs and their metabolism by-products depending on the biological system studied. Metabolomics can provide activity information about possible novel drugs and drug scaffolds, indicate interesting targets for drug development and suggest binding partners of compounds. Furthermore, metabolomics can be used for the discovery of novel natural products and in drug development. Metabolomics can enhance the discovery and testing of new drugs and provide insight into the on- and off-target effects of drugs. This review focuses primarily on the application of metabolomics in the discovery of active drugs from natural products and the analysis of chemical libraries and the computational analysis of metabolic networks. Metabolomics methodology, both experimental and analytical is fast developing. At the same time, databases of compounds are ever growing with the inclusion of more molecular and spectral information. An increasing number of systems are being represented by very detailed metabolic network models. Combining these experimental and computational tools with high throughput drug testing and drug discovery techniques can provide new promising compounds and leads.

  3. Maximum Entropy in Drug Discovery

    Directory of Open Access Journals (Sweden)

    Chih-Yuan Tseng

    2014-07-01

    Full Text Available Drug discovery applies multidisciplinary approaches either experimentally, computationally, or both to identify lead compounds to treat various diseases. While conventional approaches have yielded many US Food and Drug Administration (FDA)-approved drugs, researchers continue investigating and designing better approaches to increase the success rate in the discovery process. In this article, we provide an overview of the current strategies and point out where and how the method of maximum entropy has been introduced in this area. The maximum entropy principle has its roots in thermodynamics, yet since Jaynes’ pioneering work in the 1950s, the maximum entropy principle has not only been used as a physics law, but also as a reasoning tool that allows us to process information in hand with the least bias. Its applicability in various disciplines has been abundantly demonstrated. We give several examples of applications of maximum entropy in different stages of drug discovery. Finally, we discuss a promising new direction in drug discovery that is likely to hinge on the ways of utilizing maximum entropy.
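
    A small numeric illustration of the maximum entropy principle referred to above: among all distributions on a finite support with a fixed mean, the entropy-maximizing one is an exponential-family distribution whose multiplier can be found by root finding. The support and target mean below (the classic biased-die setup) are illustrative, not an example from the article.

```python
import numpy as np
from scipy.optimize import brentq

x = np.arange(1, 7)          # support, e.g. faces of a die
target_mean = 4.5            # the only constraint besides normalization

def mean_given_lambda(lam):
    w = np.exp(lam * x)
    p = w / w.sum()
    return p @ x

# The maximum entropy distribution with a fixed mean is p_i proportional to exp(lambda * x_i);
# solve for the multiplier lambda that reproduces the target mean.
lam = brentq(lambda l: mean_given_lambda(l) - target_mean, -10, 10)
p = np.exp(lam * x)
p /= p.sum()
entropy = -(p * np.log(p)).sum()
print("p =", np.round(p, 3), f" mean = {p @ x:.2f}  entropy = {entropy:.3f} nats")
```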

  4. Comparison of three web-scale discovery services for health sciences research*

    Directory of Open Access Journals (Sweden)

    Rosie Hanneke, MLS

    2016-11-01

    Full Text Available Objective: The purpose of this study was to investigate the relative effectiveness of three web-scale discovery (WSD) tools in answering health sciences search queries. Methods: Simple keyword searches, based on topics from six health sciences disciplines, were run at multiple real-world implementations of EBSCO Discovery Service (EDS), Ex Libris’s Primo, and ProQuest’s Summon. Each WSD tool was evaluated in its ability to retrieve relevant results and in its coverage of MEDLINE content. Results: All WSD tools returned between 50%–60% relevant results. Primo returned a higher number of duplicate results than the other 2 WSD products. Summon results were more relevant when search terms were automatically mapped to controlled vocabulary. EDS indexed the largest number of MEDLINE citations, followed closely by Summon. Additionally, keyword searches in all 3 WSD tools retrieved relevant material that was not found with precision (Medical Subject Headings) searches in MEDLINE. Conclusions: None of the 3 WSD products studied was overwhelmingly more effective in returning relevant results. While difficult to place the figure of 50%–60% relevance in context, it implies a strong likelihood that the average user would be able to find satisfactory sources on the first page of search results using a rudimentary keyword search. The discovery of additional relevant material beyond that retrieved from MEDLINE indicates WSD tools’ value as a supplement to traditional resources for health sciences researchers.

  5. Development of new exploration tools for seabed mineral resources - Result of R/V YOKOSUKA research cruise YK09-09 -

    Science.gov (United States)

    Harada, M.; Sayanagi, K.; Kasaya, T.; Sawa, T.; Goto, T.; Tada, N.; Ichihara, H.; Asada, M.; Nakajima, T.; Isezaki, N.

    2009-12-01

    Detailed information on subsurface structure under the seafloor is necessary for the estimation of seabed resources such as hydrothermal deposits and methane hydrate. Although geophysical exploration near the seafloor is expected to offer advantages for seabed resource surveys, an efficient method has not been well established. The authors started a project to develop exploration tools for seabed resources under the financial support of MEXT-Japan. We carry out research and development mainly on measurement of the magnetic field with high resolution and high sampling rate, and on electric exploration devices with accurately controlled active source signals. The developed tools will be mounted on underwater platforms such as a deep-tow system, ROVs (remotely operated vehicles), and AUVs (autonomous undersea vehicles). We carried out a research cruise (vessel: JAMSTEC R/V YOKOSUKA YK09-09, cruise period: 19-29 July 2009, area surveyed: Kumano-nada, off Kii Peninsula, Japan) to investigate the performance of the developed equipment for magnetic exploration. We mounted an Overhauser and two flux-gate magnetometers on the deep-tow and the AUV URASHIMA. To inspect the efficiency of the equipment, it is better to measure the magnetic anomaly caused by a known magnetic source. Therefore, we made a magnetic target consisting of 50 neodymium magnets. Before the navigation, the magnetic target was placed under water and its position was measured by the acoustic method. The depth of the target is about 2,050 meters, and the measurement was performed within a circle of radius about 300 meters. The vehicles were navigated at heights of 25 meters for the AUV and about 15 meters for the deep-tow. Each underwater navigation was performed twice. Both runs were carried out successfully, which means that we detected the significant magnetic anomalies caused by the target. We will be able to estimate the three-dimensional distribution of the anomalous magnetic field, and the source property of
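
    For context on what "the magnetic anomaly caused by a known magnetic source" looks like, the sketch below evaluates the standard point-dipole field formula along a horizontal survey line above a target. The dipole moment and geometry are placeholders chosen for illustration; this is not the cruise's data processing.

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability (T*m/A)

def dipole_field(moment, r):
    """Magnetic field (T) of a point dipole with moment `moment` at offset vector r (m)."""
    rmag = np.linalg.norm(r)
    rhat = r / rmag
    return MU0 / (4 * np.pi) * (3 * np.dot(moment, rhat) * rhat - moment) / rmag**3

# Target approximated as a vertical dipole; the moment value is a placeholder,
# not a measured property of the 50-magnet target used on the cruise.
m = np.array([0.0, 0.0, 50.0])          # A*m^2
height = 15.0                            # sensor height above the target (m), as for the deep-tow

for x in np.linspace(-50, 50, 11):       # along-track offset (m)
    b = dipole_field(m, np.array([x, 0.0, height]))
    print(f"x = {x:6.1f} m  |B| = {np.linalg.norm(b) * 1e9:8.3f} nT")
```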

  6. Semantic Service Discovery Techniques for the composable web

    OpenAIRE

    Fernández Villamor, José Ignacio

    2013-01-01

    This PhD thesis contributes to the problem of resource and service discovery in the context of the composable web. In the current web, mashup technologies allow developers to reuse services and contents to build new web applications. However, developers face a problem of information flood when searching for appropriate services or resources to combine. To contribute to overcoming this problem, a framework is defined for the discovery of services and resources. In this framework, thr...

  7. Statistical prediction of seasonal discharge in Central Asia for water resources management: development of a generic (pre-)operational modeling tool

    Science.gov (United States)

    Apel, Heiko; Baimaganbetov, Azamat; Kalashnikova, Olga; Gavrilenko, Nadejda; Abdykerimova, Zharkinay; Agalhanova, Marina; Gerlitz, Lars; Unger-Shayesteh, Katy; Vorogushyn, Sergiy; Gafurov, Abror

    2017-04-01

    The semi-arid regions of Central Asia crucially depend on the water resources supplied by the mountainous areas of the Tien-Shan and Pamirs. During the summer months the snow- and glacier-melt-dominated river discharge originating in the mountains provides the main water resource available for agricultural production, but also for storage in reservoirs for energy generation during the winter months. Thus a reliable seasonal forecast of the water resources is crucial for sustainable management and planning of water resources. In fact, seasonal forecasts are mandatory tasks of all national hydro-meteorological services in the region. In order to support the operational seasonal forecast procedures of the hydromet services, this study aims at the development of a generic tool for deriving statistical forecast models of seasonal river discharge. The generic model is kept as simple as possible in order to be driven by available hydrological and meteorological data, and to be applicable for all catchments in the region despite their often limited data availability. As snowmelt dominates summer runoff, the main meteorological predictors for the forecast models are monthly values of winter precipitation and temperature as recorded by climatological stations in the catchments. These data sets are accompanied by snow cover predictors derived from the operational ModSnow tool, which provides cloud-free snow cover data for the selected catchments based on MODIS satellite images. In addition to the meteorological data, antecedent streamflow is used as a predictor variable. This basic predictor set was further extended by multi-monthly means of the individual predictors, as well as composites of the predictors. Forecast models are derived based on these predictors as linear combinations of up to 3 or 4 predictors. A user-selectable number of best models according to pre-defined performance criteria is extracted automatically by the developed model fitting algorithm, which includes a test

  8. Topology Discovery Using Cisco Discovery Protocol

    OpenAIRE

    Rodriguez, Sergio R.

    2009-01-01

    In this paper we address the problem of discovering network topology in proprietary networks. Namely, we investigate topology discovery in Cisco-based networks. Cisco devices run Cisco Discovery Protocol (CDP) which holds information about these devices. We first compare properties of topologies that can be obtained from networks deploying CDP versus Spanning Tree Protocol (STP) and Management Information Base (MIB) Forwarding Database (FDB). Then we describe a method of discovering topology ...
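
    As an illustration of the neighbor-merging step such a method relies on, the following sketch collapses per-device CDP neighbor tables into an undirected link list; the device names, ports and table format are invented for the example.

```python
from collections import defaultdict

# Hypothetical CDP neighbor tables: device -> list of (local_port, neighbor, neighbor_port).
cdp_tables = {
    "SW1": [("Gi0/1", "SW2", "Gi0/24"), ("Gi0/2", "R1", "Gi0/0")],
    "SW2": [("Gi0/24", "SW1", "Gi0/1"), ("Gi0/3", "R1", "Gi0/1")],
    "R1":  [("Gi0/0", "SW1", "Gi0/2"), ("Gi0/1", "SW2", "Gi0/3")],
}

def build_topology(tables):
    """Merge per-device neighbor tables into an undirected edge set."""
    edges = set()
    for device, neighbors in tables.items():
        for local_port, neighbor, neighbor_port in neighbors:
            # Normalise endpoint order so (A, B) and (B, A) collapse into one link.
            a, b = sorted([(device, local_port), (neighbor, neighbor_port)])
            edges.add((a, b))
    return edges

for (dev_a, port_a), (dev_b, port_b) in sorted(build_topology(cdp_tables)):
    print(f"{dev_a}:{port_a} <-> {dev_b}:{port_b}")
```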

  9. Sea Level Rise Data Discovery

    Science.gov (United States)

    Quach, N.; Huang, T.; Boening, C.; Gill, K. M.

    2016-12-01

    Research related to sea level rise crosses multiple disciplines from sea ice to land hydrology. The NASA Sea Level Change Portal (SLCP) is a one-stop source for current sea level change information and data, including interactive tools for accessing and viewing regional data, a virtual dashboard of sea level indicators, and ongoing updates through a suite of editorial products that include content articles, graphics, videos, and animations. The architecture behind the SLCP makes it possible to integrate web content and data relevant to sea level change that are archived across various data centers as well as new data generated by sea level change principal investigators. The Extensible Data Gateway Environment (EDGE) is incorporated into the SLCP architecture to provide a unified platform for web content and science data discovery. EDGE is a data integration platform designed to facilitate high-performance geospatial data discovery and access with the ability to support multi-metadata standard specifications. EDGE has the capability to retrieve data from one or more sources and package the resulting sets into a single response to the requestor. With this unified endpoint, the Data Analysis Tool that is available on the SLCP can retrieve dataset and granule level metadata as well as perform geospatial search on the data. This talk focuses on the architecture that makes it possible to seamlessly integrate and enable discovery of disparate data relevant to sea level rise.
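
    The sketch below illustrates, in schematic form only, the aggregation pattern described for EDGE: query several endpoints, normalise minimal metadata, and return one merged response. The endpoint URLs and response fields are placeholders, not the actual EDGE interfaces.

```python
import json
from urllib.request import urlopen
from urllib.parse import urlencode

# Placeholder endpoints standing in for the archives an aggregator might query;
# these URLs and their response schemas are illustrative assumptions only.
SOURCES = {
    "archive_a": "https://example.org/archive-a/search",
    "archive_b": "https://example.org/archive-b/search",
}

def federated_search(keyword, bbox=None):
    """Query each source, normalise minimal metadata, and return one merged list."""
    merged = []
    for name, base_url in SOURCES.items():
        params = {"q": keyword}
        if bbox:                      # optional geospatial constraint: (west, south, east, north)
            params["bbox"] = ",".join(str(v) for v in bbox)
        try:
            with urlopen(f"{base_url}?{urlencode(params)}", timeout=10) as resp:
                hits = json.load(resp).get("results", [])
        except OSError:
            hits = []                 # a failing source should not break the merged response
        for hit in hits:
            merged.append({"source": name, "id": hit.get("id"), "title": hit.get("title")})
    return merged

print(federated_search("sea level rise", bbox=(-180, -90, 180, 90)))
```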

  10. A Standardized Needs Assessment Tool to Inform the Curriculum Development Process for Pediatric Resuscitation Simulation-Based Education in Resource-Limited Settings

    Directory of Open Access Journals (Sweden)

    Nicole Shilkofski

    2018-02-01

    burden reported by respondents was relatively consistent with WHO country-specific UFMR statistics in each setting. Results of the needs assessment survey were subsequently used to refine goals and objectives for the simulation curriculum and to ensure delivery of pragmatic educational content with recommendations that were contextualized for local capacity and resource availability. Effective use of the tool in two different settings increases its potential generalizability.

  11. Cyber-Enabled Scientific Discovery

    International Nuclear Information System (INIS)

    Chan, Tony; Jameson, Leland

    2007-01-01

    It is often said that numerical simulation is third in the group of three ways to explore modern science: theory, experiment and simulation. Carefully executed modern numerical simulations can, however, be considered at least as relevant as experiment and theory. In comparison to physical experimentation, with numerical simulation one has the numerically simulated values of every field variable at every grid point in space and time. In comparison to theory, with numerical simulation one can explore sets of very complex non-linear equations such as the Einstein equations that are very difficult to investigate theoretically. Cyber-enabled scientific discovery is not just about numerical simulation but about every possible issue related to scientific discovery by utilizing cyberinfrastructure such as the analysis and storage of large data sets, the creation of tools that can be used by broad classes of researchers and, above all, the education and training of a cyber-literate workforce

  12. An online conserved SSR discovery through cross-species comparison

    Directory of Open Access Journals (Sweden)

    Tun-Wen Pai

    2009-02-01

    Full Text Available Tun-Wen Pai, Chien-Ming Chen, Meng-Chang Hsiao, Ronshan Cheng, Wen-Shyong Tzou, Chin-Hua Hu (Department of Computer Science and Engineering, Department of Aquaculture, and Institute of Bioscience and Biotechnology, National Taiwan Ocean University, Keelung, Taiwan, Republic of China). Abstract: Simple sequence repeats (SSRs) play important roles in gene regulation and genome evolution. Although several online resources exist for SSR mining, most of them only extract general SSR patterns without providing functional information. Here, an online search tool, CG-SSR (Comparative Genomics SSR discovery), has been developed for discovering potential functional SSRs from vertebrate genomes through cross-species comparison. In addition to revealing SSR candidates in conserved regions among various species, it also combines accurate coordinate and functional genomics information. CG-SSR is the first comprehensive and efficient online tool for conserved SSR discovery. Keywords: microsatellites, genome, comparative genomics, functional SSR, gene ontology, conserved region
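
    A minimal sketch of the repeat-finding step that underlies any SSR mining tool is shown below; the motif-length and repeat-count thresholds are illustrative, and the cross-species conservation filtering that distinguishes CG-SSR is not reproduced here.

```python
import re

# Motifs of 2-6 bp repeated at least four times in tandem; thresholds are illustrative.
SSR_PATTERN = re.compile(r"(([ACGT]{2,6}?)\2{3,})")

def find_ssrs(sequence):
    """Return (start, motif, repeat_count) for each simple sequence repeat found."""
    ssrs = []
    for match in SSR_PATTERN.finditer(sequence.upper()):
        whole, motif = match.group(1), match.group(2)
        ssrs.append((match.start(), motif, len(whole) // len(motif)))
    return ssrs

demo = "TTGACACACACACACAGGTCAGTAGTAGTAGTAGCCT"
for start, motif, count in find_ssrs(demo):
    print(f"pos {start}: ({motif}) x {count}")
```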

  13. Statistical forecast of seasonal discharge in Central Asia using observational records: development of a generic linear modelling tool for operational water resource management

    Directory of Open Access Journals (Sweden)

    H. Apel

    2018-04-01

    Full Text Available The semi-arid regions of Central Asia crucially depend on the water resources supplied by the mountainous areas of the Tien Shan and Pamir and Altai mountains. During the summer months the snow-melt- and glacier-melt-dominated river discharge originating in the mountains provides the main water resource available for agricultural production, but also for storage in reservoirs for energy generation during the winter months. Thus a reliable seasonal forecast of the water resources is crucial for sustainable management and planning of water resources. In fact, seasonal forecasts are mandatory tasks of all national hydro-meteorological services in the region. In order to support the operational seasonal forecast procedures of hydro-meteorological services, this study aims to develop a generic tool for deriving statistical forecast models of seasonal river discharge based solely on observational records. The generic model structure is kept as simple as possible in order to be driven by meteorological and hydrological data readily available at the hydro-meteorological services, and to be applicable for all catchments in the region. As snow melt dominates summer runoff, the main meteorological predictors for the forecast models are monthly values of winter precipitation and temperature, satellite-based snow cover data, and antecedent discharge. This basic predictor set was further extended by multi-monthly means of the individual predictors, as well as composites of the predictors. Forecast models are derived based on these predictors as linear combinations of up to four predictors. A user-selectable number of the best models is extracted automatically by the developed model fitting algorithm, which includes a test for robustness by a leave-one-out cross-validation. Based on the cross-validation the predictive uncertainty was quantified for every prediction model. Forecasts of the mean seasonal discharge of the period April to September are derived

  14. Statistical forecast of seasonal discharge in Central Asia using observational records: development of a generic linear modelling tool for operational water resource management

    Science.gov (United States)

    Apel, Heiko; Abdykerimova, Zharkinay; Agalhanova, Marina; Baimaganbetov, Azamat; Gavrilenko, Nadejda; Gerlitz, Lars; Kalashnikova, Olga; Unger-Shayesteh, Katy; Vorogushyn, Sergiy; Gafurov, Abror

    2018-04-01

    The semi-arid regions of Central Asia crucially depend on the water resources supplied by the mountainous areas of the Tien Shan and Pamir and Altai mountains. During the summer months the snow-melt- and glacier-melt-dominated river discharge originating in the mountains provides the main water resource available for agricultural production, but also for storage in reservoirs for energy generation during the winter months. Thus a reliable seasonal forecast of the water resources is crucial for sustainable management and planning of water resources. In fact, seasonal forecasts are mandatory tasks of all national hydro-meteorological services in the region. In order to support the operational seasonal forecast procedures of hydro-meteorological services, this study aims to develop a generic tool for deriving statistical forecast models of seasonal river discharge based solely on observational records. The generic model structure is kept as simple as possible in order to be driven by meteorological and hydrological data readily available at the hydro-meteorological services, and to be applicable for all catchments in the region. As snow melt dominates summer runoff, the main meteorological predictors for the forecast models are monthly values of winter precipitation and temperature, satellite-based snow cover data, and antecedent discharge. This basic predictor set was further extended by multi-monthly means of the individual predictors, as well as composites of the predictors. Forecast models are derived based on these predictors as linear combinations of up to four predictors. A user-selectable number of the best models is extracted automatically by the developed model fitting algorithm, which includes a test for robustness by a leave-one-out cross-validation. Based on the cross-validation the predictive uncertainty was quantified for every prediction model. Forecasts of the mean seasonal discharge of the period April to September are derived every month from
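
    The model-selection procedure described above (all linear combinations of up to four predictors, ranked by leave-one-out cross-validation) can be sketched as follows; the predictor names and data are synthetic stand-ins, not the study's records.

```python
import itertools
import numpy as np

def loocv_rmse(X, y):
    """Leave-one-out RMSE of an ordinary least-squares model with intercept."""
    errors = []
    for i in range(len(y)):
        keep = np.arange(len(y)) != i
        A = np.column_stack([np.ones(keep.sum()), X[keep]])
        coef, *_ = np.linalg.lstsq(A, y[keep], rcond=None)
        pred = np.concatenate([[1.0], X[i]]) @ coef
        errors.append(y[i] - pred)
    return float(np.sqrt(np.mean(np.square(errors))))

def rank_models(predictors, y, max_terms=4, n_best=5):
    """Score every linear combination of up to `max_terms` predictors by LOOCV RMSE."""
    names = list(predictors)
    data = np.column_stack([predictors[n] for n in names])
    scored = []
    for k in range(1, max_terms + 1):
        for combo in itertools.combinations(range(len(names)), k):
            rmse = loocv_rmse(data[:, list(combo)], y)
            scored.append((rmse, [names[i] for i in combo]))
    return sorted(scored)[:n_best]

# Synthetic example standing in for winter precipitation/temperature, snow cover
# and antecedent discharge predictors (values are invented for illustration).
rng = np.random.default_rng(1)
n = 30
predictors = {"P_winter": rng.normal(size=n), "T_winter": rng.normal(size=n),
              "snow_cover": rng.normal(size=n), "Q_march": rng.normal(size=n)}
y = 0.8 * predictors["P_winter"] + 0.5 * predictors["snow_cover"] + rng.normal(0, 0.3, n)
for rmse, combo in rank_models(predictors, y):
    print(f"RMSE={rmse:.2f}  predictors={combo}")
```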

  15. Automated discovery systems and the inductivist controversy

    Science.gov (United States)

    Giza, Piotr

    2017-09-01

    The paper explores possible influences that some developments in branches of AI called automated discovery and machine learning systems might have upon some aspects of the old debate between Francis Bacon's inductivism and Karl Popper's falsificationism. Donald Gillies facetiously calls this controversy 'the duel of two English knights', and claims, after some analysis of historical cases of discovery, that Baconian induction had been used in science very rarely, or not at all, although he argues that the situation has changed with the advent of machine learning systems. (Some clarification of the terms machine learning and automated discovery is required here. The key idea of machine learning is that, given data with associated outcomes, software can be trained to make those associations in future cases, which typically amounts to inducing some rules from individual cases classified by the experts. Automated discovery (also called machine discovery) deals with uncovering new knowledge that is valuable for human beings, and its key idea is that discovery is like other intellectual tasks and that the general idea of heuristic search in problem spaces applies also to discovery tasks. However, since machine learning systems discover (very low-level) regularities in data, throughout this paper I use the generic term automated discovery for both kinds of systems. I will elaborate on this later on). Gillies's line of argument can be generalised: thanks to automated discovery systems, philosophers of science have at their disposal a new tool for empirically testing their philosophical hypotheses. Accordingly, in the paper, I will address the question of which of the two philosophical conceptions of scientific method is better vindicated in view of the successes and failures of systems developed within three major research programmes in the field: machine learning systems in the Turing tradition, the normative theory of scientific discovery formulated by Herbert Simon

  16. Interactive Electronic Decision Trees for the Integrated Primary Care Management of Febrile Children in Low Resource Settings - Review of existing tools.

    Science.gov (United States)

    Keitel, Kristina; D'Acremont, Valérie

    2018-04-20

    The lack of effective, integrated diagnostic tools poses a major challenge to the primary care management of febrile childhood illnesses. These limitations are especially evident in low-resource settings and are often inappropriately compensated for by antimicrobial over-prescription. Interactive electronic decision trees (IEDTs) have the potential to close these gaps: guiding antibiotic use and better identifying serious disease. This narrative review summarizes existing IEDTs to provide an overview of their degree of validation and to identify gaps in current knowledge and prospects for future innovation. Structured literature review in PubMed and Embase complemented by Google search and contact with developers. Ten integrated IEDTs were identified: three (eIMCI, REC, and Bangladesh digital IMCI) based on Integrated Management of Childhood Illnesses (IMCI); four (SL eCCM, MEDSINC, e-iCCM, and D-Tree eCCM) on Integrated Community Case Management (iCCM); two (ALMANACH, MSFeCARE) with modified IMCI content; and one (ePOCT) that integrates novel content with biomarker testing. The types of publications and evaluation studies varied greatly: the content and evidence base were published for two (ALMANACH and ePOCT), and these two were also validated in efficacy studies. Other types of evaluations, such as compliance and acceptability, were available for D-Tree eCCM, eIMCI, and ALMANACH. Several evaluations are still ongoing. Future prospects include conducting effectiveness and impact studies, using data gathered through larger studies to adapt the medical content to local epidemiology, improving the software and sensors, and assessing factors that influence compliance and scale-up. IEDTs are valuable tools that have the potential to improve management of febrile children in primary care and increase the rational use of diagnostics and antimicrobials. Next steps in the evidence pathway should be larger effectiveness and impact studies (including cost analysis) and

  17. Management Tools

    Science.gov (United States)

    1987-01-01

    Manugistics, Inc. (formerly AVYX, Inc.) has introduced a new programming language for IBM and IBM-compatible computers called TREES-pls. It is a resource management tool originating from the space shuttle that can be used in such applications as scheduling, resource allocation, project control, information management, and artificial intelligence. Manugistics, Inc. was looking for a flexible tool that can be applied to many problems with minimal adaptation. Among the non-government markets are aerospace, other manufacturing, transportation, health care, food and beverage, and professional services.

  18. Bioinformatics resource manager v2.3: an integrated software environment for systems biology with microRNA and cross-species analysis tools

    Directory of Open Access Journals (Sweden)

    Tilton Susan C

    2012-11-01

    Full Text Available Abstract Background MicroRNAs (miRNAs) are noncoding RNAs that direct post-transcriptional regulation of protein coding genes. Recent studies have shown miRNAs are important for controlling many biological processes, including nervous system development, and are highly conserved across species. Given their importance, computational tools are necessary for analysis, interpretation and integration of high-throughput (HTP) miRNA data in an increasing number of model species. The Bioinformatics Resource Manager (BRM) v2.3 is a software environment for data management, mining, integration and functional annotation of HTP biological data. In this study, we report recent updates to BRM for miRNA data analysis and cross-species comparisons across datasets. Results BRM v2.3 has the capability to query predicted miRNA targets from multiple databases, retrieve potential regulatory miRNAs for known genes, integrate experimentally derived miRNA and mRNA datasets, perform ortholog mapping across species, and retrieve annotation and cross-reference identifiers for an expanded number of species. Here we use BRM to show that developmental exposure of zebrafish to 30 uM nicotine from 6–48 hours post fertilization (hpf) results in behavioral hyperactivity in larval zebrafish and alteration of putative miRNA gene targets in whole embryos at developmental stages that encompass early neurogenesis. We show typical workflows for using BRM to integrate experimental zebrafish miRNA and mRNA microarray datasets with example retrievals for zebrafish, including pathway annotation and mapping to human orthologs. Functional analysis of differentially regulated (p<0.05) gene targets in BRM indicates that nicotine exposure disrupts genes involved in neurogenesis, possibly through misregulation of nicotine-sensitive miRNAs. Conclusions BRM provides the ability to mine complex data for identification of candidate miRNAs or pathways that drive phenotypic outcome and, therefore, is a useful hypothesis generation tool for systems biology. The miRNA workflow in BRM allows for efficient processing of multiple miRNA and mRNA datasets in a single

  19. Bioinformatics resource manager v2.3: an integrated software environment for systems biology with microRNA and cross-species analysis tools

    Science.gov (United States)

    2012-01-01

    Background MicroRNAs (miRNAs) are noncoding RNAs that direct post-transcriptional regulation of protein coding genes. Recent studies have shown miRNAs are important for controlling many biological processes, including nervous system development, and are highly conserved across species. Given their importance, computational tools are necessary for analysis, interpretation and integration of high-throughput (HTP) miRNA data in an increasing number of model species. The Bioinformatics Resource Manager (BRM) v2.3 is a software environment for data management, mining, integration and functional annotation of HTP biological data. In this study, we report recent updates to BRM for miRNA data analysis and cross-species comparisons across datasets. Results BRM v2.3 has the capability to query predicted miRNA targets from multiple databases, retrieve potential regulatory miRNAs for known genes, integrate experimentally derived miRNA and mRNA datasets, perform ortholog mapping across species, and retrieve annotation and cross-reference identifiers for an expanded number of species. Here we use BRM to show that developmental exposure of zebrafish to 30 uM nicotine from 6–48 hours post fertilization (hpf) results in behavioral hyperactivity in larval zebrafish and alteration of putative miRNA gene targets in whole embryos at developmental stages that encompass early neurogenesis. We show typical workflows for using BRM to integrate experimental zebrafish miRNA and mRNA microarray datasets with example retrievals for zebrafish, including pathway annotation and mapping to human ortholog. Functional analysis of differentially regulated (p<0.05) gene targets in BRM indicates that nicotine exposure disrupts genes involved in neurogenesis, possibly through misregulation of nicotine-sensitive miRNAs. Conclusions BRM provides the ability to mine complex data for identification of candidate miRNAs or pathways that drive phenotypic outcome and, therefore, is a useful hypothesis
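
    The following toy sketch illustrates the kind of cross-dataset intersection such a workflow performs (differentially expressed miRNAs joined to differentially expressed mRNAs via predicted targets); it is not BRM code, and all identifiers and fold changes are invented.

```python
# Toy integration step in the spirit of the BRM workflow described above.
mirna_targets = {                                  # predicted target genes per miRNA (assumed)
    "miR-9": {"neurod1", "ascl1", "sox2"},
    "miR-132": {"ascl1", "gap43"},
}
de_mirnas = {"miR-9": -1.8, "miR-132": 2.1}        # log2 fold change of miRNAs (invented)
de_genes = {"neurod1": 1.4, "gap43": -2.0, "pax6": 0.9}  # log2 fold change of mRNAs (invented)

candidates = []
for mirna, lfc_mirna in de_mirnas.items():
    for gene in mirna_targets.get(mirna, set()):
        if gene in de_genes and lfc_mirna * de_genes[gene] < 0:
            # Opposite directions of change are consistent with miRNA-mediated repression.
            candidates.append((mirna, gene, lfc_mirna, de_genes[gene]))

for mirna, gene, lfc_m, lfc_g in candidates:
    print(f"{mirna} (log2FC {lfc_m:+.1f}) may repress {gene} (log2FC {lfc_g:+.1f})")
```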

  20. A hybrid human and machine resource curation pipeline for the Neuroscience Information Framework.

    Science.gov (United States)

    Bandrowski, A E; Cachat, J; Li, Y; Müller, H M; Sternberg, P W; Ciccarese, P; Clark, T; Marenco, L; Wang, R; Astakhov, V; Grethe, J S; Martone, M E

    2012-01-01

    The breadth of information resources available to researchers on the Internet continues to expand, particularly in light of recently implemented data-sharing policies required by funding agencies. However, the nature of dense, multifaceted neuroscience data and the design of contemporary search engine systems make efficient, reliable and relevant discovery of such information a significant challenge. This challenge is specifically pertinent for online databases, whose dynamic content is 'hidden' from search engines. The Neuroscience Information Framework (NIF; http://www.neuinfo.org) was funded by the NIH Blueprint for Neuroscience Research to address the problem of finding and utilizing neuroscience-relevant resources such as software tools, data sets, experimental animals and antibodies across the Internet. From the outset, NIF sought to provide an accounting of available resources, while developing technical solutions to finding, accessing and utilizing them. The curators, therefore, are tasked with identifying and registering resources, examining data, writing configuration files to index and display data and keeping the contents current. In the initial phases of the project, all aspects of the registration and curation processes were manual. However, as the number of resources grew, manual curation became impractical. This report describes our experiences and successes with developing automated resource discovery and semiautomated type characterization with text-mining scripts that facilitate curation team efforts to discover, integrate and display new content. We also describe the DISCO framework, a suite of automated web services that significantly reduce manual curation efforts to periodically check for resource updates. Lastly, we discuss DOMEO, a semi-automated annotation tool that improves the discovery and curation of resources that are not necessarily website-based (i.e. reagents, software tools). Although the ultimate goal of automation was to
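
    A minimal sketch of the update-checking idea behind such services is shown below: fingerprint each registered resource and report those whose content has changed since the last run. It is an illustration only, not the DISCO implementation, and the example URLs are placeholders.

```python
import hashlib
import json
from pathlib import Path
from urllib.request import urlopen

STATE_FILE = Path("resource_fingerprints.json")   # local cache of last-seen content hashes

def check_for_updates(resource_urls):
    """Fetch each registered resource and report those whose content hash changed."""
    state = json.loads(STATE_FILE.read_text()) if STATE_FILE.exists() else {}
    changed = []
    for url in resource_urls:
        try:
            with urlopen(url, timeout=15) as resp:
                digest = hashlib.sha256(resp.read()).hexdigest()
        except OSError:
            continue                              # unreachable resources are skipped this round
        if state.get(url) != digest:
            changed.append(url)
            state[url] = digest
    STATE_FILE.write_text(json.dumps(state, indent=2))
    return changed

# Example registry entries (any list of URLs a curation team tracks would do):
print(check_for_updates(["https://example.org/database-a", "https://example.org/tool-b"]))
```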

  1. Assessing the quality of infertility resources on the World Wide Web: tools to guide clients through the maze of fact and fiction.

    Science.gov (United States)

    Okamura, Kyoko; Bernstein, Judith; Fidler, Anne T

    2002-01-01

    The Internet has become a major source of health information for women, but information placed on the World Wide Web does not routinely undergo a peer review process before dissemination. In this study, we present an analysis of 197 infertility-related Web sites for quality and accountability, using JAMA's minimal core standards for responsible print. Only 2% of the web sites analyzed met all four recommended standards, and 50.8% failed to report any of the four. Commercial web sites were more likely to fail to meet minimum standards (71.2%) than those with educational (46.8%) or supportive (29.8%) elements. Web sites with educational and informational components were most common (70.6%), followed by commercial sites (52.8%) and sites that offered a forum for infertility support and activism (28.9%). Internet resources available to infertile patients are at best variable. The current state of infertility-related materials on the World Wide Web offers unprecedented opportunities to improve services to a growing number of e-health users. Because of variations in quality of site content, women's health clinicians must assume responsibility for a new role as information monitor. This study provides assessment tools clinicians can apply and share with clients.

  2. Utility of the heteroduplex assay (HDA) as a simple and cost-effective tool for the identification of HIV type 1 dual infections in resource-limited settings.

    Science.gov (United States)

    Powell, Rebecca L R; Urbanski, Mateusz M; Burda, Sherri; Nanfack, Aubin; Kinge, Thompson; Nyambi, Phillipe N

    2008-01-01

    The predominance of unique recombinant forms (URFs) of HIV-1 in Cameroon suggests that dual infection, the concomitant or sequential infection with genetically distinct HIV-1 strains, occurs frequently in this region; yet, identifying dual infection among large HIV cohorts in local, resource-limited settings is uncommon, since this generally relies on labor-intensive and costly sequencing methods. Consequently, there is a need to develop an effective, cost-efficient method appropriate to the developing world to identify these infections. In the present study, the heteroduplex assay (HDA) was used to verify dual or single infection status, as shown by traditional sequence analysis, for 15 longitudinally sampled study subjects from Cameroon. Heteroduplex formation, indicative of a dual infection, was identified for all five study subjects shown by sequence analysis to be dually infected. Conversely, heteroduplex formation was not detectable for all 10 HDA reactions of the singly infected study subjects. These results suggest that the HDA is a simple yet powerful and inexpensive tool for the detection of both intersubtype and intrasubtype dual infections, and that the HDA harbors significant potential for reliable, high-throughput screening for dual infection. As these infections and the recombinants they generate facilitate leaps in HIV-1 evolution, and may present major challenges for treatment and vaccine design, this assay will be critical for monitoring the continuing pandemic in regions of the world where HIV-1 viral diversity is broad.

  3. Enhancing knowledge discovery from cancer genomics data with Galaxy.

    Science.gov (United States)

    Albuquerque, Marco A; Grande, Bruno M; Ritch, Elie J; Pararajalingam, Prasath; Jessa, Selin; Krzywinski, Martin; Grewal, Jasleen K; Shah, Sohrab P; Boutros, Paul C; Morin, Ryan D

    2017-05-01

    The field of cancer genomics has demonstrated the power of massively parallel sequencing techniques to inform on the genes and specific alterations that drive tumor onset and progression. Although large comprehensive sequence data sets continue to be made increasingly available, data analysis remains an ongoing challenge, particularly for laboratories lacking dedicated resources and bioinformatics expertise. To address this, we have produced a collection of Galaxy tools that represent many popular algorithms for detecting somatic genetic alterations from cancer genome and exome data. We developed new methods for parallelization of these tools within Galaxy to accelerate runtime and have demonstrated their usability and summarized their runtimes on multiple cloud service providers. Some tools represent extensions or refinement of existing toolkits to yield visualizations suited to cohort-wide cancer genomic analysis. For example, we present Oncocircos and Oncoprintplus, which generate data-rich summaries of exome-derived somatic mutation. Workflows that integrate these to achieve data integration and visualizations are demonstrated on a cohort of 96 diffuse large B-cell lymphomas and enabled the discovery of multiple candidate lymphoma-related genes. Our toolkit is available from our GitHub repository as Galaxy tool and dependency definitions and has been deployed using virtualization on multiple platforms including Docker. © The Author 2017. Published by Oxford University Press.

  4. Beyond information retrieval: information discovery and multimedia information retrieval

    OpenAIRE

    Roberto Raieli

    2017-01-01

    The paper compares the current methodologies for search and discovery of information and information resources: terminological search and term-based language, typical of information retrieval (IR); semantic search and information discovery, being developed mainly through the language of linked data; and semiotic search and content-based language, as practised in multimedia information retrieval (MIR). MIR's semiotic methodology is then detailed.

  5. Service Discovery At Home

    NARCIS (Netherlands)

    Sundramoorthy, V.; Scholten, Johan; Jansen, P.G.; Hartel, Pieter H.

    Service discovery is a fairly new field that kicked off since the advent of ubiquitous computing and has been found essential in the making of intelligent networks by implementing automated discovery and remote control between devices. This paper provides an overview and comparison of several prominent

  6. Academic Drug Discovery Centres

    DEFF Research Database (Denmark)

    Kirkegaard, Henriette Schultz; Valentin, Finn

    2014-01-01

    Academic drug discovery centres (ADDCs) are seen as one of the solutions to fill the innovation gap in early drug discovery, which has proven challenging for previous organisational models. Prior studies of ADDCs have identified the need to analyse them from the angle of their economic...

  7. Decades of Discovery

    Science.gov (United States)

    2011-06-01

    For the past two-and-a-half decades, the Office of Science at the U.S. Department of Energy has been at the forefront of scientific discovery. Over 100 important discoveries supported by the Office of Science are represented in this document.

  8. Service discovery at home

    NARCIS (Netherlands)

    Sundramoorthy, V.; Scholten, Johan; Jansen, P.G.; Hartel, Pieter H.

    2003-01-01

    Service discovery is a fairly new field that kicked off since the advent of ubiquitous computing and has been found essential in the making of intelligent networks by implementing automated discovery and remote control between devices. This paper provides an overview and comparison of several

  9. Using the "No Child Left Behind Act" To Improve Schools in Your State: A Tool Kit for Business Leaders. Information Resources for Business Leadership To Increase Student Achivement under the "No Child Left Behind Act of 2001."

    Science.gov (United States)

    Business Roundtable, Washington, DC.

    This tool kit is intended to help business leaders seize specific opportunities to partner with educators and political leaders in the next year to implement reforms called for by the No Child Left Behind Act of 2001, which provides new accountability measures and resources to raise the achievement of students throughout the United States. The…

  10. Bioenergy Knowledge Discovery Framework Fact Sheet

    Energy Technology Data Exchange (ETDEWEB)

    None

    2017-07-01

    The Bioenergy Knowledge Discovery Framework (KDF) supports the development of a sustainable bioenergy industry by providing access to a variety of data sets, publications, and collaboration and mapping tools that support bioenergy research, analysis, and decision making. In the KDF, users can search for information, contribute data, and use the tools and map interface to synthesize, analyze, and visualize information in a spatially integrated manner.

  11. "Eureka, Eureka!" Discoveries in Science

    Science.gov (United States)

    Agarwal, Pankaj

    2011-01-01

    Accidental discoveries have been of significant value in the progress of science. Although accidental discoveries are more common in pharmacology and chemistry, other branches of science have also benefited from such discoveries. While most discoveries are the result of persistent research, famous accidental discoveries provide a fascinating…

  12. (Self-) Discovery Service: Helping Students Help Themselves

    Science.gov (United States)

    Debonis, Rocco; O'Donnell, Edward; Thomes, Cynthia

    2012-01-01

    EBSCO Discovery Service (EDS) has been heavily used by UMUC students since its implementation in fall 2011, but experience has shown that it is not always the most appropriate source for satisfying students' information needs and that they often need assistance in understanding how the tool works and how to use it effectively. UMUC librarians have…

  13. Text mining resources for the life sciences.

    Science.gov (United States)

    Przybyła, Piotr; Shardlow, Matthew; Aubin, Sophie; Bossy, Robert; Eckart de Castilho, Richard; Piperidis, Stelios; McNaught, John; Ananiadou, Sophia

    2016-01-01

    Text mining is a powerful technology for quickly distilling key information from vast quantities of biomedical literature. However, to harness this power the researcher must be well versed in the availability, suitability, adaptability, interoperability and comparative accuracy of current text mining resources. In this survey, we give an overview of the text mining resources that exist in the life sciences to help researchers, especially those employed in biocuration, to engage with text mining in their own work. We categorize the various resources under three sections: Content Discovery looks at where and how to find biomedical publications for text mining; Knowledge Encoding describes the formats used to represent the different levels of information associated with content that enable text mining, including those formats used to carry such information between processes; Tools and Services gives an overview of workflow management systems that can be used to rapidly configure and compare domain- and task-specific processes, via access to a wide range of pre-built tools. We also provide links to relevant repositories in each section to enable the reader to find resources relevant to their own area of interest. Throughout this work we give a special focus to resources that are interoperable-those that have the crucial ability to share information, enabling smooth integration and reusability. © The Author(s) 2016. Published by Oxford University Press.

  14. Text mining resources for the life sciences

    Science.gov (United States)

    Shardlow, Matthew; Aubin, Sophie; Bossy, Robert; Eckart de Castilho, Richard; Piperidis, Stelios; McNaught, John; Ananiadou, Sophia

    2016-01-01

    Text mining is a powerful technology for quickly distilling key information from vast quantities of biomedical literature. However, to harness this power the researcher must be well versed in the availability, suitability, adaptability, interoperability and comparative accuracy of current text mining resources. In this survey, we give an overview of the text mining resources that exist in the life sciences to help researchers, especially those employed in biocuration, to engage with text mining in their own work. We categorize the various resources under three sections: Content Discovery looks at where and how to find biomedical publications for text mining; Knowledge Encoding describes the formats used to represent the different levels of information associated with content that enable text mining, including those formats used to carry such information between processes; Tools and Services gives an overview of workflow management systems that can be used to rapidly configure and compare domain- and task-specific processes, via access to a wide range of pre-built tools. We also provide links to relevant repositories in each section to enable the reader to find resources relevant to their own area of interest. Throughout this work we give a special focus to resources that are interoperable—those that have the crucial ability to share information, enabling smooth integration and reusability. PMID:27888231

  15. Planetary Sciences Literature - Access and Discovery

    Science.gov (United States)

    Henneken, Edwin A.; ADS Team

    2017-10-01

    The NASA Astrophysics Data System (ADS) has been around for over 2 decades, helping professional astronomers and planetary scientists navigate, without charge, through the increasingly complex environment of scholarly publications. As boundaries between disciplines dissolve and expand, the ADS provides powerful tools to help researchers discover useful information efficiently. In its new form, code-named ADS Bumblebee (https://ui.adsabs.harvard.edu), it may very well answer questions you didn't know you had! While the classic ADS (http://ads.harvard.edu) focuses mostly on searching basic metadata (author, title and abstract), today's ADS is best described as an "aggregator" of scholarly resources relevant to the needs of researchers in astronomy and planetary sciences, providing a discovery environment on top of this. In addition to indexing content from a variety of publishers, data and software archives, the ADS enriches its records by text-mining and indexing the full-text articles (about 4.7 million in total, with 130,000 from planetary science journals), enriching its metadata through the extraction of citations and acknowledgments. Recent technology developments include a new Application Programming Interface (API), a new user interface featuring a variety of visualizations and bibliometric analysis, and integration with ORCID services to support paper claiming. The new ADS provides powerful tools to help you find review papers on a given subject, prolific authors working on a subject and who they are collaborating with (within and outside their group), and papers most read by people who read recent papers on the topic of your interest. These are just a couple of examples of the capabilities of the new ADS. We currently index most journals covering the planetary sciences and we are striving to include those journals most frequently cited by planetary science publications. The ADS is operated by the Smithsonian Astrophysical Observatory under NASA
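
    As a brief illustration, the sketch below queries the ADS search API for highly cited articles on a topic; the query string and returned fields are only examples, and a personal API token (here assumed to be in the ADS_API_TOKEN environment variable) is required.

```python
import os
import requests

def ads_search(query, rows=5):
    """Search the ADS API; requires a personal token in ADS_API_TOKEN."""
    resp = requests.get(
        "https://api.adsabs.harvard.edu/v1/search/query",
        params={"q": query, "fl": "bibcode,title,citation_count",
                "rows": rows, "sort": "citation_count desc"},
        headers={"Authorization": f"Bearer {os.environ['ADS_API_TOKEN']}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["response"]["docs"]

# Example query: planetary-science articles mentioning 'ocean worlds' in the abstract.
for doc in ads_search('abs:"ocean worlds" doctype:article'):
    print(doc["bibcode"], doc.get("citation_count"), doc["title"][0])
```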

  16. Promise Fulfilled? An EBSCO Discovery Service Usability Study

    Science.gov (United States)

    Williams, Sarah C.; Foster, Anita K.

    2011-01-01

    Discovery tools are the next phase of library search systems. Illinois State University's Milner Library implemented EBSCO Discovery Service in August 2010. The authors conducted usability studies on the system in the fall of 2010. The aims of the study were twofold: first, to determine how Milner users set about using the system in order to…

  17. Too New for Textbooks: The Biotechnology Discoveries & Applications Guidebook

    Science.gov (United States)

    Loftin, Madelene; Lamb, Neil E.

    2013-01-01

    The "Biotechnology Discoveries and Applications" guidebook aims to provide teachers with an overview of the recent advances in genetics and biotechnology, allowing them to share these findings with their students. The annual guidebook introduces a wealth of modern genomic discoveries and provides teachers with tools to integrate exciting…

  18. Green Power Partner Resources

    Science.gov (United States)

    EPA Green Power Partners can access tools and resources to help promote their green power commitments. Partners use these tools to communicate the benefits of their green power use to their customers, stakeholders, and the general public.

  19. The effect of pharmacogenetic profiling with a clinical decision support tool on healthcare resource utilization and estimated costs in the elderly exposed to polypharmacy.

    Science.gov (United States)

    Brixner, D; Biltaji, E; Bress, A; Unni, S; Ye, X; Mamiya, T; Ashcraft, K; Biskupiak, J

    2016-01-01

    To compare healthcare resource utilization (HRU) and clinical decision-making for elderly patients who received cytochrome P450 (CYP) pharmacogenetic testing and a comprehensive medication management clinical decision support tool (CDST) with those of a cohort of similar non-tested patients. An observational study compared a prospective cohort of patients ≥65 years subjected to pharmacogenetic testing to a propensity score (PS) matched historical cohort of untested patients in a claims database. Patients had a prescribed medication or dose change of at least one of 61 oral drugs or combinations of ≥3 drugs at enrollment. Four-month HRU outcomes examined included hospitalizations, emergency department (ED) and outpatient visits and provider acceptance of test recommendations. Costs were estimated using national data sources. There were 205 tested patients PS matched to 820 untested patients. The hospitalization rate was 9.8% in the tested group vs. 16.1% in the untested group (RR = 0.61, 95% CI = 0.39-0.95, p = 0.027), the ED visit rate was 4.4% in the tested group vs. 15.4% in the untested group (RR = 0.29, 95% CI = 0.15-0.55, p = 0.0002) and the outpatient visit rate was 71.7% in the tested group vs. 36.5% in the untested group (RR = 1.97, 95% CI = 1.74-2.23, p < 0.0001). The majority of providers (95%) considered the test helpful and 46% followed CDST-provided recommendations. Patients who were CYP DNA tested and treated according to the personalized prescribing system had a significant decrease in hospitalizations and emergency department visits, resulting in potential cost savings. Providers had a high satisfaction rate with the clinical utility of the system and followed recommendations when appropriate.
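
    As a quick check of the reported effect sizes, the sketch below recomputes the hospitalization relative risk and its approximate 95% confidence interval from event counts reconstructed from the quoted percentages and group sizes.

```python
import math

def relative_risk(events_a, n_a, events_b, n_b, z=1.96):
    """Relative risk of group A vs. group B with an approximate 95% CI (log-normal method)."""
    risk_a, risk_b = events_a / n_a, events_b / n_b
    rr = risk_a / risk_b
    se_log_rr = math.sqrt(1/events_a - 1/n_a + 1/events_b - 1/n_b)
    lo, hi = (math.exp(math.log(rr) + s * z * se_log_rr) for s in (-1, 1))
    return rr, lo, hi

# Hospitalization example from the abstract: 9.8% of 205 tested vs. 16.1% of 820 untested.
rr, lo, hi = relative_risk(round(0.098 * 205), 205, round(0.161 * 820), 820)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")   # reproduces roughly RR 0.61 (0.39-0.95)
```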

  20. User manuals for the Delaware River Basin Water Availability Tool for Environmental Resources (DRB–WATER) and associated WATER application utilities

    Science.gov (United States)

    Williamson, Tanja N.; Lant, Jeremiah G.

    2015-11-18

    The Water Availability Tool for Environmental Resources (WATER) is a decision support system (DSS) for the nontidal part of the Delaware River Basin (DRB) that provides a consistent and objective method of simulating streamflow under historical, forecasted, and managed conditions. WATER integrates geospatial sampling of landscape characteristics, including topographic and soil properties, with a regionally calibrated hillslope-hydrology model, an impervious-surface model, and hydroclimatic models that have been parameterized using three hydrologic response units—forested, agricultural, and developed land cover. It is this integration that enables the regional hydrologic-modeling approach used in WATER without requiring site-specific optimization or the stationary conditions inferred when using a statistical model. The DSS provides a "historical" database, ideal for simulating streamflow for 2001–11, in addition to land-cover forecasts that focus on 2030 and 2060. The WATER Application Utilities are provided with the DSS and apply change factors for precipitation, temperature, and potential evapotranspiration to a 1981–2011 climatic record provided with the DSS. These change factors were derived from a suite of general circulation models (GCMs) and representative concentration pathway (RCP) emission scenarios and are based on 25-year monthly averages (normals) centered on 2030 and 2060. The WATER Application Utilities also can be used to apply a 2010 snapshot of water use for the DRB; a factorial approach enables scenario testing of increased or decreased water use for each simulation. Finally, the WATER Application Utilities can be used to reformat streamflow time series for input to statistical or reservoir management software.
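
    The change-factor step can be illustrated with the following sketch, in which precipitation is scaled multiplicatively and temperature shifted additively, month by month; the factor values and baseline records are invented and are not the WATER utilities' data.

```python
# Illustrative monthly change factors (remaining months default to "no change").
precip_factor = {1: 1.05, 2: 1.02, 7: 0.97}     # ratio applied to precipitation per month
temp_delta_c  = {1: 1.8, 2: 1.6, 7: 2.4}        # additive change in deg C per month

def apply_change_factors(records):
    """records: iterable of (year, month, precip_mm, temp_c) tuples."""
    adjusted = []
    for year, month, precip_mm, temp_c in records:
        adjusted.append((year, month,
                         precip_mm * precip_factor.get(month, 1.0),
                         temp_c + temp_delta_c.get(month, 0.0)))
    return adjusted

baseline = [(1995, 1, 80.0, -2.0), (1995, 7, 110.0, 21.5)]   # invented baseline record
for row in apply_change_factors(baseline):
    print(row)
```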

  1. The Greatest Mathematical Discovery?

    Energy Technology Data Exchange (ETDEWEB)

    Bailey, David H.; Borwein, Jonathan M.

    2010-05-12

    What mathematical discovery more than 1500 years ago: (1) Is one of the greatest, if not the greatest, single discovery in the field of mathematics? (2) Involved three subtle ideas that eluded the greatest minds of antiquity, even geniuses such as Archimedes? (3) Was fiercely resisted in Europe for hundreds of years after its discovery? (4) Even today, in historical treatments of mathematics, is often dismissed with scant mention, or else is ascribed to the wrong source? Answer: Our modern system of positional decimal notation with zero, together with the basic arithmetic computational schemes, which were discovered in India about 500 CE.

  2. Perspectives of biomolecular NMR in drug discovery: the blessing and curse of versatility

    International Nuclear Information System (INIS)

    Jahnke, Wolfgang

    2007-01-01

    The versatility of NMR and its broad applicability to several stages in the drug discovery process is well known and generally considered one of the major strengths of NMR (Pellecchia et al., Nature Rev Drug Discov 1:211-219, 2002; Stockman and Dalvit, Prog Nucl Magn Reson Spectrosc 41:187-231, 2002; Lepre et al., Comb Chem High throughput screen 5:583-590, 2002; Wyss et al., Curr Opin Drug Discov Devel 5:630-647, 2002; Jahnke and Widmer, Cell Mol Life Sci 61:580-599, 2004; Huth et al., Methods Enzymol 394:549-571, 2005b; Klages et al., Mol Biosyst 2:318-332, 2006; Takeuchi and Wagner, Curr Opin Struct Biol 16:109-117, 2006; Zartler and Shapiro, Curr Pharm Des 12:3963-3972, 2006). Indeed, NMR is the only biophysical technique which can detect and quantify molecular interactions, and at the same time provide detailed structural information with atomic level resolution. NMR should therefore be ideally suited and widely requested as a tool for drug discovery research, and numerous examples of drug discovery projects which have substantially benefited from NMR contributions or were even driven by NMR have been described in the literature. However, not all pharmaceutical companies have rigorously implemented NMR as an integral tool of their research processes. Some companies invest with limited resources, and others do not use biomolecular NMR at all. This discrepancy in assessing the value of a technology is striking, and calls for clarification: under which circumstances can NMR provide added value to the drug discovery process? What kind of contributions can NMR make, and how is it implemented and integrated for maximum impact? This perspectives article suggests key areas of impact for NMR, and a model of integrating NMR with other technologies to realize synergies and maximize their value for drug discovery

  3. Tools and data services registry

    DEFF Research Database (Denmark)

    Ison, Jon; Rapacki, Kristoffer; Ménager, Hervé

    2016-01-01

    Life sciences are yielding huge data sets that underpin scientific discoveries fundamental to improvement in human health, agriculture and the environment. In support of these discoveries, a plethora of databases and tools are deployed, in technically complex and diverse implementations, across a...

  4. Multidimensional process discovery

    NARCIS (Netherlands)

    Ribeiro, J.T.S.

    2013-01-01

    Typically represented in event logs, business process data describe the execution of process events over time. Business process intelligence (BPI) techniques such as process mining can be applied to get strategic insight into business processes. Process discovery, conformance checking and

  5. Fateful discovery almost forgotten

    CERN Multimedia

    1989-01-01

    "The discovery of the fission of uranium exactly half a century ago is at risk of passing unremarked because of the general ambivalence towards the consequences of this development. Can that be wise?" (4 pages)

  6. Defining Creativity with Discovery

    OpenAIRE

    Wilson, Nicholas Charles; Martin, Lee

    2017-01-01

    The standard definition of creativity has enabled significant empirical and theoretical advances, yet contains philosophical conundrums concerning the nature of novelty and the role of recognition and values. In this work we offer an act of conceptual valeting that addresses these issues and in doing so, argue that creativity definitions can be extended through the use of discovery. Drawing on dispositional realist philosophy we outline why adding the discovery and bringing into being of new ...

  7. On the antiproton discovery

    International Nuclear Information System (INIS)

    Piccioni, O.

    1989-01-01

    The author of this article describes his own role in the discovery of the antiproton. Although Segre and Chamberlain received the Nobel Prize in 1959 for its discovery, the author claims that their experimental method was his idea which he communicated to them informally in December 1954. He describes how his application for citizenship (he was Italian), and other scientists' manipulation, prevented him from being at Berkeley to work on the experiment himself. (UK)

  8. Discovery Driven Growth

    DEFF Research Database (Denmark)

    Bukh, Per Nikolaj

    2009-01-01

    Review of Discovery Driven Growth: A Breakthrough Process to Reduce Risk and Seize Opportunity, by Rita G. McGrath & Ian C. MacMillan, Boston: Harvard Business Press. Publication date: 14 August...

  9. The π discovery

    International Nuclear Information System (INIS)

    Fowler, P.H.

    1988-01-01

    The paper traces the discovery of the Π meson. The discovery was made by exposure of nuclear emulsions to cosmic radiation at high altitudes, with subsequent scanning of the emulsions for meson tracks. Disintegration of nuclei by a negative meson, and the decay of a Π meson were both observed. Further measurements revealed the mass of the meson. The studies carried out on the origin of the Π-mesons, and their mode of decay, are both described. (U.K.)

  10. Computational neuropharmacology: dynamical approaches in drug discovery.

    Science.gov (United States)

    Aradi, Ildiko; Erdi, Péter

    2006-05-01

    Computational approaches that adopt dynamical models are widely accepted in basic and clinical neuroscience research as indispensable tools with which to understand normal and pathological neuronal mechanisms. Although computer-aided techniques have been used in pharmaceutical research (e.g. in structure- and ligand-based drug design), the power of dynamical models has not yet been exploited in drug discovery. We suggest that dynamical system theory and computational neuroscience--integrated with well-established, conventional molecular and electrophysiological methods--offer a broad perspective in drug discovery and in the search for novel targets and strategies for the treatment of neurological and psychiatric diseases.
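
    As a generic example of the dynamical models such approaches build on (not taken from the cited review), the sketch below integrates the FitzHugh-Nagumo neuron model and compares the response under two input currents, the lower one standing in loosely for a hypothetical pharmacological effect.

```python
import numpy as np
from scipy.integrate import odeint

def fitzhugh_nagumo(state, t, I_ext, a=0.7, b=0.8, tau=12.5):
    """FitzHugh-Nagumo model: a standard two-variable description of neuronal excitability."""
    v, w = state
    dv = v - v**3 / 3 - w + I_ext
    dw = (v + a - b * w) / tau
    return [dv, dw]

t = np.linspace(0, 200, 2000)
baseline = odeint(fitzhugh_nagumo, [-1.0, 1.0], t, args=(0.5,))
# A hypothetical drug effect is represented here simply as a reduced input current.
treated = odeint(fitzhugh_nagumo, [-1.0, 1.0], t, args=(0.2,))
print("baseline voltage range:", baseline[:, 0].max() - baseline[:, 0].min())
print("treated  voltage range:", treated[:, 0].max() - treated[:, 0].min())
```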

  11. Diagnostic accuracy of touch imprint cytology for head and neck malignancies: a useful intra-operative tool in resource limited countries.

    Science.gov (United States)

    Naveed, Hania; Abid, Mariam; Hashmi, Atif Ali; Edhi, Muhammad Muzammamil; Sheikh, Ahmareen Khalid; Mudassir, Ghazala; Khan, Amir

    2017-01-01

    Intraoperative consultation is an important tool for the evaluation of upper aerodigestive tract (UAT) malignancies. Although frozen section analysis is a preferred method of intra-operative consultation, in resource-limited countries like Pakistan this facility is not available in most institutes; therefore, we aimed to evaluate the diagnostic accuracy of touch imprint cytology for UAT malignancies using histopathology of the same tissue as the gold standard. The study involved 70 cases of UAT lesions operated during the study period. Intraoperatively, after obtaining the fresh biopsy specimen and prior to placing it in fixative, each specimen was imprinted on 4-6 glass slides, fixed immediately in 95% alcohol and stained with Hematoxylin and Eosin stain. After completion of the cytological procedure, the surgical biopsy specimen was processed. The slides of both touch imprint cytology and histopathology were examined by two consultant histopathologists. Touch imprint cytology was diagnostic in 68 cases (97.1%): 55 (78.6%) were malignant, 2 cases (2.9%) were suspicious for malignancy, and 11 cases (15.7%) were negative for malignancy, while 2 cases (2.9%) were false negative. Amongst the 70 cases, 55 (78.6%) were malignant, showing squamous cell carcinoma in 49 cases (70%), adenoid cystic carcinoma in 2 cases (2.9%), non-Hodgkin lymphoma in 2 cases (2.9%), mucoepidermoid carcinoma in 1 case (1.4%), and spindle cell sarcoma in 1 case (1.4%). Two cases (2.9%) were suspicious of malignancy, showing atypical squamoid cells on touch imprint cytology, while 13 cases (18.6%) were negative for malignancy, which also included the 2 false negative cases. The overall diagnostic accuracy of touch imprint cytology was 96.7%, with a sensitivity and specificity of 96% and 100%, respectively, while the PPV and NPV of touch imprint cytology were 100% and 84%, respectively. Our experience in this study has demonstrated
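
    The accuracy figures above follow from standard 2x2 arithmetic, sketched below; the counts used are reconstructed approximately from the reported percentages, since the abstract does not give the full contingency table.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard 2x2 accuracy measures for a diagnostic test."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# Approximate counts consistent with the reported percentages (illustrative only).
for name, value in diagnostic_metrics(tp=55, fp=0, fn=2, tn=11).items():
    print(f"{name}: {value:.1%}")
```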

  12. Cardio-Thoracic Ratio Is Stable, Reproducible and Has Potential as a Screening Tool for HIV-1 Related Cardiac Disorders in Resource Poor Settings.

    Directory of Open Access Journals (Sweden)

    Hanif Esmail

    Full Text Available Cardiovascular disorders are common in HIV-1 infected persons in Africa and presentation is often insidious. Development of screening algorithms for cardiovascular disorders appropriate to a resource-constrained setting could facilitate timely referral. Cardiothoracic ratio (CTR) on chest radiograph (CXR) has been suggested as a potential screening tool, but little is known about its reproducibility and stability. Our primary aim was to evaluate the stability and the inter-observer variability of CTR in HIV-1 infected outpatients. We further evaluated the prevalence of cardiomegaly (CTR≥0.5) and its relationship with other risk factors in this population. HIV-1 infected participants were identified during screening for a tuberculosis vaccine trial in Khayelitsha, South Africa between August 2011 and April 2012. Participants had a digital posterior-anterior CXR performed as well as history, examination and baseline observations. CXRs were viewed using OsiriX software and CTR was calculated using digital callipers. 450 HIV-1-infected adults were evaluated, median age 34 years (IQR 30-40), with a CD4 count of 566/mm3 (IQR 443-724); 70% were on antiretroviral therapy (ART). The prevalence of cardiomegaly was 12.7% (95% C.I. 9.6%-15.8%). CTR was calculated by a 2nd reader for 113 participants; measurements were highly correlated, r = 0.95 (95% C.I. 0.93-0.97), and agreement on cardiomegaly was substantial, κ = 0.78 (95% C.I. 0.61-0.95). CXR was repeated in 51 participants at 4-12 weeks; CTR measurements between the 2 time points were highly correlated, r = 0.77 (95% C.I. 0.68-0.88), and agreement on cardiomegaly was excellent, κ = 0.92 (95% C.I. 0.77-1). Participants with cardiomegaly had a higher median BMI (31.3; IQR 27.4-37.4 versus 26.9; IQR 23.2-32.4; p<0.0001) and median systolic blood pressure (130; IQR 121-141 versus 125; IQR 117-135; p = 0.01). CTR is a robust measurement, stable over time with substantial inter-observer agreement. A prospective study evaluating the utility of CXR to
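
    Inter-observer agreement of the kind reported here is typically summarised with Cohen's kappa; the sketch below computes it for two readers' binary cardiomegaly calls on an invented set of radiographs.

```python
def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two readers' binary calls (e.g. cardiomegaly yes/no)."""
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    p_a_yes, p_b_yes = sum(labels_a) / n, sum(labels_b) / n
    expected = p_a_yes * p_b_yes + (1 - p_a_yes) * (1 - p_b_yes)
    return (observed - expected) / (1 - expected)

# Invented example: CTR >= 0.5 calls by two readers on ten radiographs.
reader_1 = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]
reader_2 = [1, 0, 0, 1, 0, 0, 0, 0, 0, 0]
print(f"kappa = {cohens_kappa(reader_1, reader_2):.2f}")
```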

  13. Assessment of fuelwood resources in acacia woodlands in the Rift Valley of Ethiopia. Towards the development of planning tools for sustainable management

    Energy Technology Data Exchange (ETDEWEB)

    Eshete, Getachew [Swedish Univ. of Agricultural Sciences, Umeaa (Sweden). Dept. of Forest Resource Management and Geomatics

    1999-07-01

    The subjects addressed in this thesis are the development and description of methods for acquiring information on the state and change and for forecasting the potential future state in the acacia woodlands in the Rift Valley of Ethiopia. Since issues of data collection and prediction are core components of planning, it is believed that the methods developed and described will be useful in future integrated planning tools for the sustainable management of acacia woodlands. The thesis is composed of five studies that are reported separately. In the first study, the focus was on describing the population structure and regeneration of the main tree species in the study area. In the second study, biomass functions suitable for application in multiphase inventory designs were developed. In the third study, the application of satellite image data in the assessment of the acacia woodlands, partly using the functions developed in the second study, is demonstrated. Different methods were applied for assessing the biomass and crown closure of acacias in the study area. Results using the different estimators revealed that the average woody biomass is in the range of 10 to 20 tons ha⁻¹ and the crown cover was reduced by approximately 7 to 12% between the years 1972 and 1995 in the study area. Furthermore, the advantages and difficulties of using different designs were investigated. Maps indicating the patterns of the state and change of the woodlands in relation to the landscape features of the study site were also produced. In the fourth study, the aim was to find out whether or not tree-rings in acacias can be used as indicators of growth periodicity. The results indicated that the acacias form one ring per year in the study area, although this was not the case in surrounding, more humid, areas. Following this outcome, in the fifth study, single tree growth functions were developed and were utilized to project the biomass growth of acacias in the study area. One area

  14. Developing integrated crop knowledge networks to advance candidate gene discovery.

    Science.gov (United States)

    Hassani-Pak, Keywan; Castellote, Martin; Esch, Maria; Hindle, Matthew; Lysenko, Artem; Taubert, Jan; Rawlings, Christopher

    2016-12-01

    The chances of raising crop productivity to enhance global food security would be greatly improved if we had a complete understanding of all the biological mechanisms that underpin traits such as crop yield, disease resistance or nutrient and water use efficiency. With more crop genomes emerging all the time, we are nearer to having the basic information, at the gene level, to begin assembling crop gene catalogues and using data from other plant species to understand how the genes function and how their interactions govern crop development and physiology. Unfortunately, the task of creating such a complete knowledge base of gene functions, interaction networks and trait biology is technically challenging because the relevant data are dispersed in myriad databases in a variety of data formats with variable quality and coverage. In this paper we present a general approach for building genome-scale knowledge networks that provide a unified representation of heterogeneous but interconnected datasets to enable effective knowledge mining and gene discovery. We describe the datasets and outline the methods, workflows and tools that we have developed for creating and visualising these networks for the major crop species, wheat and barley. We present the global characteristics of such knowledge networks and, with an example linking a seed-size phenotype to a barley WRKY transcription factor orthologous to TTG2 from Arabidopsis, we illustrate the value of integrated data in biological knowledge discovery. The software we have developed (www.ondex.org) and the knowledge resources (http://knetminer.rothamsted.ac.uk) we have created are all open-source and provide a first step towards systematic and evidence-based gene discovery in order to facilitate crop improvement.
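
    The kind of integrated gene-trait network described above can be pictured, in miniature, as a typed property graph. The sketch below uses networkx with invented node names (the barley gene identifier, publication ID and evidence labels are placeholders, not KnetMiner/Ondex data or API calls); it only illustrates how heterogeneous evidence becomes traversable once it is represented as one graph.

```python
# Minimal sketch of a gene-trait knowledge network; node and edge names are
# hypothetical, and real Ondex/KnetMiner graphs are far richer.
import networkx as nx

g = nx.MultiDiGraph()

# Heterogeneous nodes: a crop gene, an Arabidopsis orthologue, a trait, a paper.
g.add_node("HvWRKY_example", type="Gene", taxon="barley")   # hypothetical barley gene
g.add_node("TTG2", type="Gene", taxon="Arabidopsis")
g.add_node("seed size", type="Trait")
g.add_node("PMID:0000000", type="Publication")               # placeholder identifier

# Typed edges integrate evidence from different source datasets.
g.add_edge("HvWRKY_example", "TTG2", relation="orthologue", source="sequence similarity")
g.add_edge("TTG2", "seed size", relation="associated_with", source="curated literature")
g.add_edge("TTG2", "PMID:0000000", relation="mentioned_in", source="text mining")

# Knowledge mining then reduces to graph traversal: which traits are reachable
# from a candidate crop gene through any chain of evidence?
for trait in [n for n, d in g.nodes(data=True) if d["type"] == "Trait"]:
    if nx.has_path(g, "HvWRKY_example", trait):
        print("HvWRKY_example --(evidence path)-->", trait)
```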

  15. Qualitative and quantitative characterization of plasma proteins when incorporating traveling wave ion mobility into a liquid chromatography-mass spectrometry workflow for biomarker discovery: use of product ion quantitation as an alternative data analysis tool for label free quantitation.

    Science.gov (United States)

    Daly, Charlotte E; Ng, Leong L; Hakimi, Amirmansoor; Willingale, Richard; Jones, Donald J L

    2014-02-18

    Discovery of protein biomarkers in clinical samples necessitates significant prefractionation prior to liquid chromatography-mass spectrometry (LC-MS) analysis. Integrating traveling wave ion mobility spectrometry (TWIMS) enables in-line gas-phase separation which, when coupled with nanoflow liquid chromatography and data-independent acquisition tandem mass spectrometry, confers significant advantages to the discovery of protein biomarkers by improving separation and inherent sensitivity. Incorporation of TWIMS leads to a packet of concentrated ions which ultimately provides a significant improvement in sensitivity. As a consequence of this ion packeting, accurate quantitation of proteins present at high concentrations can be affected by detector saturation effects. Human plasma was analyzed in triplicate using liquid-chromatography data independent acquisition mass spectrometry (LC-DIA-MS) and using liquid-chromatography ion-mobility data independent acquisition mass spectrometry (LC-IM-DIA-MS). The inclusion of TWIMS was assessed for the effect on sample throughput, data integrity, confidence of protein and peptide identification, and dynamic range. The number of identified proteins is significantly increased by an average of 84% while both the precursor and product mass accuracies are maintained between the modalities. Sample dynamic range is also maintained while quantitation is achieved for all but the most abundant proteins by incorporating a novel data interpretation method that allows accurate quantitation to occur. This additional separation is all achieved within a workflow with no discernible deleterious effect on throughput. Consequently, TWIMS greatly enhances proteome coverage and can be reliably used for quantification when using an alternative product ion quantification strategy. Using TWIMS in biomarker discovery in human plasma is thus recommended.
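
    One way to read the "alternative product ion quantification strategy" mentioned above is as a simple fallback rule: use the precursor signal until it saturates, then switch to the summed product-ion intensities. The sketch below illustrates that idea only; the threshold, the data and the function are hypothetical and are not the authors' published algorithm.

```python
# Illustrative fallback from precursor to product-ion quantitation when the
# detector saturates. The threshold and intensities are invented.
SATURATION_THRESHOLD = 5.0e5   # hypothetical counts at which precursor response flattens

def peptide_abundance(precursor_intensity, product_ion_intensities):
    """Prefer the precursor signal, but fall back to summed product-ion
    intensities once the precursor is likely to be saturated."""
    if precursor_intensity < SATURATION_THRESHOLD:
        return precursor_intensity
    # Product ions are spread over several channels, so each is less likely to
    # saturate; their sum can track abundance for high-level peptides.
    return sum(product_ion_intensities)

print(peptide_abundance(2.0e5, [1.1e4, 9.5e3, 7.2e3]))   # below threshold: use precursor
print(peptide_abundance(8.0e5, [6.3e4, 5.1e4, 4.4e4]))   # saturated: use product ions
```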

  16. Bioinformatics for discovery of microbiome variation

    DEFF Research Database (Denmark)

    Brejnrod, Asker Daniel

    Sequencing-based tools have revolutionized microbiology in recent years. High-throughput DNA sequencing has allowed high-resolution studies of microbial life in many different environments at unprecedented low cost. These culture-independent methods have helped the discovery of novel bacteria ... The introduction is a broad overview of the field of microbiome research, with a focus on the technologies that enable these discoveries and on how some of the broader issues relate to this thesis, which uses various molecular methods to build hypotheses about the impact of a copper-contaminated soil ... Chapter 1, "Large-scale benchmarking reveals false discoveries and count transformation sensitivity in 16S rRNA gene amplicon data analysis methods used in microbiome studies", benchmarked the performance of a variety of popular statistical methods for discovering differentially abundant bacteria between ...

  17. Drug target ontology to classify and integrate drug discovery data.

    Science.gov (United States)

    Lin, Yu; Mehta, Saurabh; Küçük-McGinty, Hande; Turner, John Paul; Vidovic, Dusica; Forlin, Michele; Koleti, Amar; Nguyen, Dac-Trung; Jensen, Lars Juhl; Guha, Rajarshi; Mathias, Stephen L; Ursu, Oleg; Stathias, Vasileios; Duan, Jianbin; Nabizadeh, Nooshin; Chung, Caty; Mader, Christopher; Visser, Ubbo; Yang, Jeremy J; Bologa, Cristian G; Oprea, Tudor I; Schürer, Stephan C

    2017-11-09

    One of the most successful approaches to develop new small molecule therapeutics has been to start from a validated druggable protein target. However, only a small subset of potentially druggable targets has attracted significant research and development resources. The Illuminating the Druggable Genome (IDG) project develops resources to catalyze the development of likely targetable, yet currently understudied, prospective drug targets. A central component of the IDG program is a comprehensive knowledge resource of the druggable genome. As part of that effort, we have developed a framework to integrate, navigate, and analyze drug discovery data based on formalized and standardized classifications and annotations of druggable protein targets, the Drug Target Ontology (DTO). DTO was constructed by extensive curation and consolidation of various resources. DTO classifies the four major drug target protein families, GPCRs, kinases, ion channels and nuclear receptors, based on phylogeny, function, target development level, disease association, tissue expression, chemical ligand and substrate characteristics, and target-family specific characteristics. The formal ontology was built using a new software tool to auto-generate most axioms from a database while supporting manual knowledge acquisition. A modular, hierarchical implementation facilitates ontology development and maintenance and makes use of various external ontologies, thus integrating the DTO into the ecosystem of biomedical ontologies. As a formal OWL-DL ontology, DTO contains asserted and inferred axioms. Modeling data from the Library of Integrated Network-based Cellular Signatures (LINCS) program illustrates the potential of DTO for contextual data integration and nuanced definition of important drug target characteristics. DTO has been implemented in the IDG user interface portal, Pharos, and the TIN-X explorer of protein target disease relationships. DTO was built based on the need for a formal semantic
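
    Because DTO is distributed as an OWL ontology, it can be interrogated with standard RDF tooling. The following is a minimal sketch using rdflib; the file name, serialization format and the class label queried for are assumptions made for illustration and should be checked against the released DTO artefacts.

```python
# Minimal sketch of interrogating an OWL ontology such as DTO with rdflib.
# The file name and the "kinase" label are assumptions for illustration.
from rdflib import Graph

g = Graph()
g.parse("dto_core.owl", format="xml")   # hypothetical local copy, RDF/XML assumed

# Find asserted subclasses of any class labelled "kinase" (case-insensitive).
query = """
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT ?sub ?subLabel WHERE {
    ?parent rdfs:label ?parentLabel .
    FILTER(LCASE(STR(?parentLabel)) = "kinase")
    ?sub rdfs:subClassOf ?parent ;
         rdfs:label ?subLabel .
}
"""
for row in g.query(query):
    print(row.subLabel)
```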

  18. Introduction to fragment-based drug discovery.

    Science.gov (United States)

    Erlanson, Daniel A

    2012-01-01

    Fragment-based drug discovery (FBDD) has emerged in the past decade as a powerful tool for discovering drug leads. The approach first identifies starting points: very small molecules (fragments) that are about half the size of typical drugs. These fragments are then expanded or linked together to generate drug leads. Although the origins of the technique date back some 30 years, it was only in the mid-1990s that experimental techniques became sufficiently sensitive and rapid for the concept to become practical. Since that time, the field has exploded: FBDD has played a role in the discovery of at least 18 drugs that have entered the clinic, and practitioners of FBDD can be found throughout the world in both academia and industry. Literally dozens of reviews have been published on various aspects of FBDD or on the field as a whole, as have three books (Jahnke and Erlanson, Fragment-based approaches in drug discovery, 2006; Zartler and Shapiro, Fragment-based drug discovery: a practical approach, 2008; Kuo, Fragment based drug design: tools, practical approaches, and examples, 2011). However, this chapter will assume that the reader is approaching the field with little prior knowledge. It will introduce some of the key concepts, set the stage for the chapters to follow, and demonstrate how X-ray crystallography plays a central role in fragment identification and advancement.

  19. Discovery of charm

    International Nuclear Information System (INIS)

    Goldhaber, G.

    1984-11-01

    In my talk I will cover the period 1973 to 1976, which saw the discoveries of the J/psi and psi' resonances and most of the psion spectroscopy, the tau lepton and the D⁰, D⁺ charmed meson doublet. Occasionally I will refer briefly to more recent results. Since this conference is on the history of the weak interactions, I will deal primarily with the properties of naked charm and in particular the weakly decaying doublet of charmed mesons. Most of the discoveries I will mention were made with the SLAC-LBL Magnetic Detector, or MARK I, which we operated at SPEAR from 1973 to 1976. 27 references

  20. Proxy support for service discovery using mDNS/DNS-SD in low power networks

    NARCIS (Netherlands)

    Stolikj, M.; Verhoeven, R.; Cuijpers, P.J.L.; Lukkien, J.J.

    2014-01-01

    We present a solution for service discovery of resource constrained devices based on mDNS/DNS-SD. We extend the mDNS/DNS-SD service discovery protocol with support for proxy servers. Proxy servers temporarily store information about services offered on resource constrained devices and respond on
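
    As a rough illustration of the DNS-SD service registration that such a proxy could answer on behalf of a sleeping node, the sketch below uses the third-party python-zeroconf package (an assumption; this is not the implementation described in the paper, and the service type, name and address are invented). Argument names can differ slightly between zeroconf versions.

```python
# A minimal mDNS/DNS-SD advertisement using the third-party `zeroconf` package.
import socket
from zeroconf import Zeroconf, ServiceInfo

# A constrained device (or a proxy acting on its behalf) advertises a service.
info = ServiceInfo(
    type_="_coap._udp.local.",                       # hypothetical service type
    name="sensor-42._coap._udp.local.",
    addresses=[socket.inet_aton("192.168.1.50")],    # illustrative address
    port=5683,
    properties={"rt": "temperature"},
)

zc = Zeroconf()
zc.register_service(info)     # a proxy can keep answering mDNS queries with this
                              # record while the constrained node itself sleeps
# ... later, on shutdown ...
zc.unregister_service(info)
zc.close()
```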

  1. A study of the discovery process in 802.11 networks

    OpenAIRE

    Castignani, German; Arcia Moret, Andres Emilio; Montavont, Nicolas

    2011-01-01

    International audience; Today, wireless communications are synonymous with mobility and resource sharing. These characteristics, common to both infrastructure and ad hoc networks, rely heavily on a general resource discovery process. The discovery process, being an unavoidable procedure, has to be fast and reliable to mitigate the effect of network disruptions. In this article, by means of simulations and a real testbed, our contribution is twofold. First we assess the discovery process focusin...

  2. Hydrochemistry and isotope geochemistry as management tools for groundwater resources in multilayer aquifers: A study case from the Po plain (Lomellina, South-Western Lombardy, Italy)

    Energy Technology Data Exchange (ETDEWEB)

    Pilla, G; Sacchi, E; Ciancetti, G; Braga, G [Dipartimento di Scienze della Terra, Universita di Pavia, Pavia (Italy); Zuppi, G M [Dipartimento di Scienze Ambientali, Universita Ca' Foscari di Venezia, Venice (Italy)

    2003-07-01

    Full text: The Po plain, located in Northern Italy, hosts a multi-layer alluvial aquifer of Quaternary age consisting of sands interbedded with clays. The plain supports most of the agricultural and industrial activities of Northern Italy, which are associated with groundwater pollution in the shallower portions of the aquifer. The increasing demand for water for industrial and domestic use has led to the exploitation of deeper layers of the aquifer, without rational management of the resource. Only in the last decade have government agencies started a global evaluation of the quality of pumped groundwater, urged by the increasing need for clean water for domestic use. The task is particularly difficult because of missing or approximate well logs and the presence of multi-filter wells tapping different aquifers. In this case the chemical and isotopic characterisation of groundwaters is the only reliable tool to reconstruct the geometry, the interconnections and the characteristics of the aquifers. This study, promoted by the local agency for groundwater management and protection (Amministrazione Provinciale di Pavia, settore tutela e valorizzazione ambientale - U.O.C. Acqua), focused on a limited portion of the Po plain, the Lomellina region, of approximately 900 km². The region is bounded to the south by the Po river, to the east and west by the Sesia and Ticino rivers respectively, and to the north by the administrative boundary. The study aimed at the hydrogeological, hydrochemical and isotopic characterisation of the aquifers, to serve as a basis for the sound management of the groundwater resource. A preliminary reconstruction of the hydrogeological setting of the Lomellina plain was performed through analysis of stratigraphic data from 102 municipal wells. On this basis, a shallow phreatic aquifer, reaching depths of about 50-60 m from the surface, and two groups of aquifers containing confined groundwater, were

  3. Scheme Program Documentation Tools

    DEFF Research Database (Denmark)

    Nørmark, Kurt

    2004-01-01

    ...are separate and intended for different documentation purposes, they are related to each other in several ways. Both tools are based on XML languages for tool setup and for documentation authoring. In addition, both tools rely on the LAML framework which, in a systematic way, makes an XML language available ... as named functions in Scheme. Finally, the Scheme Elucidator is able to integrate SchemeDoc resources as part of an internal documentation resource.

  4. Open Drug Discovery Toolkit (ODDT): a new open-source player in the drug discovery field.

    Science.gov (United States)

    Wójcikowski, Maciej; Zielenkiewicz, Piotr; Siedlecki, Pawel

    2015-01-01

    There has been huge progress in the open cheminformatics field in both methods and software development. Unfortunately, there has been little effort to unite those methods and software into one package. We here describe the Open Drug Discovery Toolkit (ODDT), which aims to fulfill the need for comprehensive and open source drug discovery software. The Open Drug Discovery Toolkit was developed as a free and open source tool for both computer aided drug discovery (CADD) developers and researchers. ODDT reimplements many state-of-the-art methods, such as machine learning scoring functions (RF-Score and NNScore) and wraps other external software to ease the process of developing CADD pipelines. ODDT is an out-of-the-box solution designed to be easily customizable and extensible. Therefore, users are strongly encouraged to extend it and develop new methods. We here present three use cases for ODDT in common tasks in computer-aided drug discovery. Open Drug Discovery Toolkit is released on a permissive 3-clause BSD license for both academic and industrial use. ODDT's source code, additional examples and documentation are available on GitHub (https://github.com/oddt/oddt).
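
    For readers unfamiliar with what a single CADD pipeline step looks like, the sketch below shows a generic Lipinski-style pre-filter of an SDF compound library. It deliberately uses RDKit rather than ODDT's own API (which should be taken from the ODDT documentation on GitHub), and the file name is illustrative.

```python
# Generic sketch of a Lipinski-style pre-filter, the kind of step toolkits like
# ODDT chain into larger pipelines. This is plain RDKit, not ODDT's API.
from rdkit import Chem
from rdkit.Chem import Descriptors

def passes_rule_of_five(mol):
    """Return True if the molecule satisfies a simple Lipinski-type filter."""
    return (Descriptors.MolWt(mol) <= 500
            and Descriptors.MolLogP(mol) <= 5
            and Descriptors.NumHDonors(mol) <= 5
            and Descriptors.NumHAcceptors(mol) <= 10)

suppl = Chem.SDMolSupplier("library.sdf")          # illustrative input path
leads = [m for m in suppl if m is not None and passes_rule_of_five(m)]
print(f"{len(leads)} molecules pass the filter")
```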

  5. Discovery: Pile Patterns

    Science.gov (United States)

    de Mestre, Neville

    2017-01-01

    Earlier "Discovery" articles (de Mestre, 1999, 2003, 2006, 2010, 2011) considered patterns from many mathematical situations. This article presents a group of patterns used in 19th century mathematical textbooks. In the days of earlier warfare, cannon balls were stacked in various arrangements depending on the shape of the pile base…

  6. Discovery and Innovation

    African Journals Online (AJOL)

    Discovery and Innovation is a journal of the African Academy of Sciences (AAS) ... World (TWAS) meant to focus attention on science and technology in Africa and the ... of Non-wood Forest Products: Potential Impacts and Challenges in Africa ...

  7. Discovery of TUG-770

    DEFF Research Database (Denmark)

    Christiansen, Elisabeth; Hansen, Steffen V F; Urban, Christian

    2013-01-01

    Free fatty acid receptor 1 (FFA1 or GPR40) enhances glucose-stimulated insulin secretion from pancreatic β-cells and currently attracts high interest as a new target for the treatment of type 2 diabetes. We here report the discovery of a highly potent FFA1 agonist with favorable physicochemical...

  8. The discovery of fission

    International Nuclear Information System (INIS)

    McKay, H.A.C.

    1978-01-01

    In this article by the retired head of the Separation Processes Group of the Chemistry Division, Atomic Energy Research Establishment, Harwell, U.K., the author recalls what he terms 'an exciting drama, the unravelling of the nature of the atomic nucleus' in the years before the Second World War, including the discovery of fission. 12 references. (author)

  9. The Discovery of America

    Science.gov (United States)

    Martin, Paul S.

    1973-01-01

    Discusses a model for explaining the explosive spread of the human population across the North American continent since its discovery 12,000 years ago. The model may help to map the spread of Homo sapiens throughout the New World by using the extinction chronology of the Pleistocene megafauna. (Author/PS)

  10. Novel approaches to develop community-built biological network models for potential drug discovery.

    Science.gov (United States)

    Talikka, Marja; Bukharov, Natalia; Hayes, William S; Hofmann-Apitius, Martin; Alexopoulos, Leonidas; Peitsch, Manuel C; Hoeng, Julia

    2017-08-01

    Hundreds of thousands of data points are now routinely generated in clinical trials by molecular profiling and NGS technologies. A true translation of these data into knowledge is not possible without analysis and interpretation in a well-defined biological context. Currently, there are many public and commercial pathway tools and network models that can facilitate such analysis. At the same time, the insights and knowledge that can be gained are highly dependent on the underlying biological content of these resources. Crowdsourcing can be employed to guarantee the accuracy and transparency of the biological content underlying the tools used to interpret rich molecular data. Areas covered: In this review, the authors describe crowdsourcing in drug discovery. The focus is on efforts that have successfully used the crowdsourcing approach to verify and augment pathway tools and biological network models. Technologies that enable the building of biological networks with the community are also described. Expert opinion: A crowd of experts can be leveraged for the entire development process of biological network models, from ontologies to the evaluation of their mechanistic completeness. The ultimate goal is to facilitate biomarker discovery and personalized medicine by mechanistically explaining patients' differences with respect to disease prevention, diagnosis, and therapy outcome.

  11. The Petroleum resources on the Norwegian Continental Shelf. 2011

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2011-07-01

    This resource report provides a survey of petroleum resources on the NCS. Content: Resource account; Unconventional oil and gas resources; Future oil and gas production; Challenges for producing fields; Discoveries; Undiscovered resources; Curbing greenhouse gas emissions; Technology and talent; Exploration and new areas; How undiscovered resources are calculated; The NPD's project database; Play analysis; Changes to and reductions in estimated undiscovered resources; Unconventional petroleum resources; Many wells, Increased exploration, Every little helps; Varied discovery success; Sub-basalt in the Norwegian Sea; High exploration costs; Profitable exploration; Unopened areas - mostly in the far north; Resource base; Small discoveries; Location; Development solutions, Profitability of discoveries; Things may take time; Area perspective; Development of production; Remaining reserves and resources in fields; Target for reserve growth; Existing technology; Water and gas injection; Drilling and wells; Infrastructure challenges; New methods and technology; Challenges for pilot projects; Long-term thinking and creativity. (eb)

  12. Configurable User Interface Framework for Data Discovery in Cross-Disciplinary and Citizen Science

    Science.gov (United States)

    Rozell, E.; Wang, H.; West, P.; Zednik, S.; Fox, P.

    2012-04-01

    Use cases for data discovery and analysis vary widely when looking across disciplines and levels of expertise. Domain experts across disciplines may have a thorough understanding of self-describing data formats, such as netCDF, and the software packages that are compatible. However, they may be unfamiliar with specific vocabulary terms used to describe the data parameters or instrument packages in someone else's collection, which are often useful in data discovery. Citizen scientists may struggle with both expert vocabularies and knowledge of existing tools for analyzing and visualizing data. There are some solutions for each problem individually. For expert vocabularies, semantic technologies like the Resource Description Framework (RDF) have been used to map terms from an expert vocabulary to layperson terminology. For data analysis and visualization, tools can be mapped to data products using semantic technologies as well. This presentation discusses a solution to both problems based on the S2S Framework, a configurable user interface (UI) framework for Web services. S2S unifies the two solutions previously described using a data service abstraction ("search services") and a UI abstraction ("widgets"). Using the OWL Web Ontology Language, S2S defines a vocabulary for describing search services and their outputs, and the compatibility of those outputs with UI widgets. By linking search service outputs to widgets, S2S can automatically compose UIs for search and analysis of data, making it easier for citizen scientists to manipulate data. We have also created Linked Data widgets for S2S, which can leverage distributed RDF resources to present alternative views of expert vocabularies. This presentation covers some examples where we have applied these solutions to improve data discovery for both cross-disciplinary and non-expert users.
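
    The core S2S idea of declaring service outputs and widget capabilities in RDF and then composing a UI by matching them can be miniaturized as follows. The vocabulary IRIs, service and widget names below are hypothetical, not the actual S2S ontology; the sketch only shows the matching step.

```python
# Toy illustration of matching search-service outputs to compatible UI widgets
# via RDF. The example.org vocabulary is invented for illustration.
from rdflib import Graph, Namespace

EX = Namespace("http://example.org/s2s#")
g = Graph()

# A search service produces an output of type "ParameterList".
g.add((EX.instrumentSearch, EX.hasOutput, EX.parameterListOutput))
g.add((EX.parameterListOutput, EX.outputType, EX.ParameterList))

# Two widgets declare the output types they can consume.
g.add((EX.facetedListWidget, EX.consumes, EX.ParameterList))
g.add((EX.timeSeriesPlotWidget, EX.consumes, EX.TimeSeries))

# UI composition: find widgets compatible with the service's outputs.
compatible = g.query("""
    PREFIX ex: <http://example.org/s2s#>
    SELECT ?widget WHERE {
        ex:instrumentSearch ex:hasOutput ?out .
        ?out ex:outputType ?type .
        ?widget ex:consumes ?type .
    }
""")
for row in compatible:
    print(row.widget)   # -> http://example.org/s2s#facetedListWidget
```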

  13. Two-Year Community: Tools for Success--A Study of the Resources and Study Habits of General Chemistry I Students at Two Community Colleges

    Science.gov (United States)

    Bruck, Laura B.; Bruck, Aaron D.

    2018-01-01

    Recruitment and retention in the sciences is both difficult and crucial, especially in the community college setting. In this study, the resources used by General Chemistry I students at two different public, predominantly two-year colleges in two states were studied via surveys for a semester. Data were analyzed with respect to student attitudes…

  14. Orphan diseases: state of the drug discovery art.

    Science.gov (United States)

    Volmar, Claude-Henry; Wahlestedt, Claes; Brothers, Shaun P

    2017-06-01

    Since 1983 more than 300 drugs have been developed and approved for orphan diseases. However, considering the development of novel diagnosis tools, the number of rare diseases vastly outpaces therapeutic discovery. Academic centers and nonprofit institutes are now at the forefront of rare disease R&D, partnering with pharmaceutical companies when academic researchers discover novel drugs or targets for specific diseases, thus reducing the failure risk and cost for pharmaceutical companies. Considerable progress has occurred in the art of orphan drug discovery, and a symbiotic relationship now exists between pharmaceutical industry, academia, and philanthropists that provides a useful framework for orphan disease therapeutic discovery. Here, the current state-of-the-art of drug discovery for orphan diseases is reviewed. Current technological approaches and challenges for drug discovery are considered, some of which can present somewhat unique challenges and opportunities in orphan diseases, including the potential for personalized medicine, gene therapy, and phenotypic screening.

  15. Using ChEMBL web services for building applications and data processing workflows relevant to drug discovery.

    Science.gov (United States)

    Nowotka, Michał M; Gaulton, Anna; Mendez, David; Bento, A Patricia; Hersey, Anne; Leach, Andrew

    2017-08-01

    ChEMBL is a manually curated database of bioactivity data on small drug-like molecules, used by drug discovery scientists. Among many access methods, a REST API provides programmatic access, allowing the remote retrieval of ChEMBL data and its integration into other applications. This approach allows scientists to move from a world where they go to the ChEMBL web site to search for relevant data, to one where ChEMBL data can be simply integrated into their everyday tools and work environment. Areas covered: This review highlights some of the audiences who may benefit from using the ChEMBL API, and the goals they can address, through the description of several use cases. The examples cover a team communication tool (Slack), a data analytics platform (KNIME), batch job management software (Luigi) and Rich Internet Applications. Expert opinion: The advent of web technologies, cloud computing and microservice-oriented architectures has made REST APIs an essential ingredient of modern software development models. The widespread availability of tools consuming RESTful resources has made them useful for many groups of users. The ChEMBL API is a valuable resource of drug discovery bioactivity data for professional chemists, chemistry students, data scientists, scientific and web developers.
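
    A minimal example of the programmatic access described above is a single HTTP request for one molecule record. The sketch below assumes the public ChEMBL data endpoint and the well-known identifier CHEMBL25 (aspirin); the JSON field names used should be verified against the current API documentation.

```python
# Retrieve one molecule record from the ChEMBL REST API with plain HTTP.
import requests

url = "https://www.ebi.ac.uk/chembl/api/data/molecule/CHEMBL25.json"
record = requests.get(url, timeout=30).json()

print(record.get("pref_name"))                     # expected to be "ASPIRIN"
structures = record.get("molecule_structures") or {}
print(structures.get("canonical_smiles"))          # field name per current docs
```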

  16. IMG-ABC: An Atlas of Biosynthetic Gene Clusters to Fuel the Discovery of Novel Secondary Metabolites

    Energy Technology Data Exchange (ETDEWEB)

    Chen, I-Min; Chu, Ken; Ratner, Anna; Palaniappan, Krishna; Huang, Jinghua; Reddy, T. B.K.; Cimermancic, Peter; Fischbach, Michael; Ivanova, Natalia; Markowitz, Victor; Kyrpides, Nikos; Pati, Amrita

    2014-10-28

    In the discovery of secondary metabolites (SMs), large-scale analysis of sequence data is a promising exploration path that remains largely underutilized due to the lack of relevant computational resources. We present IMG-ABC (https://img.jgi.doe.gov/abc/), an Atlas of Biosynthetic Gene Clusters within the Integrated Microbial Genomes (IMG) system. IMG-ABC is a rich repository of both validated and predicted biosynthetic clusters (BCs) in cultured isolates, single cells and metagenomes, linked with the SM chemicals they produce and enhanced with focused analysis tools within IMG. The underlying scalable framework enables traversal of phylogenetic dark matter and chemical structure space, serving as a doorway to a new era in the discovery of novel molecules.

  17. Evaluation of CYP1A1 and CYP2B1/2 m-RNA induction in rat liver slices using the NanoString technology: a novel tool for drug discovery lead optimization.

    Science.gov (United States)

    Palamanda, Jairam R; Kumari, Pramila; Murgolo, Nicholas; Benbow, Larry; Lin, Xinjie; Nomeir, Amin A

    2009-08-01

    Cytochrome P450 (CYP) induction in rodents and humans is considered a liability for new chemical entities (NCEs) in drug discovery. In particular, CYP1A1 and CYP2B1/2 have been associated with the induction of liver tumors in oncogenicity studies during safety evaluation of potential drugs. In our laboratory, real-time PCR (TaqMan) has been used to quantify the induction of rat hepatic CYP1A1 and CYP2B1/2 in precision-cut rat liver slices. A novel technology that does not require mRNA isolation or RT-PCR (developed by NanoString Technologies) has been investigated to quantify CYP1A1 and CYP2B1/2 induction in rat liver slices. Seventeen commercially available compounds were evaluated using both TaqMan and NanoString technologies. Precision-cut rat liver slices were incubated with individual compounds for 24 hr at 37 °C in a humidified CO₂ incubator, and CYP1A1 and CYP2B1/2 mRNA were quantified. The results from the NanoString technology were similar to those of TaqMan, with a high degree of correlation for both CYP isoforms (r² > 0.85). Therefore, NanoString provides an additional new technology to evaluate the induction of CYP1A1 and 2B1/2, as well as potentially other enzymes or transporters, in rat liver slices.
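
    The cross-platform agreement reported above (r² > 0.85) is simply a correlation of fold-induction values measured by the two assays. The sketch below shows that calculation with numpy on invented, purely illustrative numbers.

```python
# Illustrative cross-platform correlation of fold-induction values.
# The numbers are invented for illustration only.
import numpy as np

taqman_fold     = np.array([1.0, 2.5, 8.0, 30.0, 55.0, 4.2])
nanostring_fold = np.array([1.2, 2.1, 7.4, 28.5, 60.1, 3.9])

r = np.corrcoef(taqman_fold, nanostring_fold)[0, 1]
print(f"r^2 = {r**2:.2f}")   # high values indicate the platforms agree
```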

  18. Bioinformatics tools and database resources for systems genetics analysis in mice-a short review and an evaluation of future needs

    NARCIS (Netherlands)

    Durrant, Caroline; Swertz, Morris A.; Alberts, Rudi; Arends, Danny; Moeller, Steffen; Mott, Richard; Prins, Pjotr; van der Velde, K. Joeri; Jansen, Ritsert C.; Schughart, Klaus

    During a meeting of the SYSGENET working group 'Bioinformatics', currently available software tools and databases for systems genetics in mice were reviewed and the needs for future developments discussed. The group evaluated interoperability and performed initial feasibility studies. To aid future

  19. Bioinformatics tools and database resources for systems genetics analysis in mice-a short review and an evaluation of future needs

    NARCIS (Netherlands)

    Durrant, M.C.; Swertz, M.A.; Alberts, R.; Arends, D.; Möller, S.; Mott, R.; Prins, J.C.P.; Velde, van der K.J.; Jansen, R.C.; Schughart, K.

    2012-01-01

    During a meeting of the SYSGENET working group ‘Bioinformatics’, currently available software tools and databases for systems genetics in mice were reviewed and the needs for future developments discussed. The group evaluated interoperability and performed initial feasibility studies. To aid future

  20. The Use of Online Quizlet.com Resource Tools to Support Native English Speaking Students of Engineering and Medical Departments in Accelerated RFL Teaching and Learning

    Directory of Open Access Journals (Sweden)

    Kh.E. Ismailova

    2016-05-01

    Full Text Available The paper presents the methodology and some results of applying the tools of the language-learning support portal Quizlet.com to improve the effectiveness of accelerated development of basic communicative skills in Russian as a foreign language (RFL) for a group of English-speaking students who arrived in Russia to study engineering, medicine and other areas. The development is applied to teaching and learning basic Russian in the classroom as well as in self-study and extracurricular settings. Special attention is paid to the use of cloud-based tools to organize and conduct extracurricular activities, particularly a promising project on the use of 3D printers to solve engineering problems of prosthetics for animals and birds that have lost body parts, illustrated by the restoration of a toucan's beak. Analysis of the results of using flash cards, tests and group games showed the promise of Quizlet.com tool sets for accelerated acquisition of general and specialized RFL vocabulary by native English-speaking students; the students also showed that, with Quizlet tools used in different modes, they can acquire and develop basic skills of listening, reading and writing in Russian in a short time.

  1. The neutron discovery

    International Nuclear Information System (INIS)

    Six, J.

    1987-01-01

    The neutron: who first had the idea, who discovered it, who established its main properties? To these apparently simple questions, multiple answers exist. The progressive discovery of the neutron is a marvellous illustration of some characteristics of scientific research, where the unforeseen may be combined with the expected. This discovery is set in the context of the scientific effervescence of the 1930s that followed the revolutionary introduction of quantum mechanics. This book describes the work of Bothe, the Joliot-Curies and Chadwick, which led to the neutron in an unexpected way. A historical analysis allows a new interpretation of the hypothesis suggested by the Joliot-Curies. Some texts from that period will help the reader relive this fascinating story [fr]

  2. Atlas of Astronomical Discoveries

    CERN Document Server

    Schilling, Govert

    2011-01-01

    Four hundred years ago in Middelburg, in the Netherlands, the telescope was invented. The invention unleashed a revolution in the exploration of the universe. Galileo Galilei discovered mountains on the Moon, spots on the Sun, and moons around Jupiter. Christiaan Huygens saw details on Mars and rings around Saturn. William Herschel discovered a new planet and mapped binary stars and nebulae. Other astronomers determined the distances to stars, unraveled the structure of the Milky Way, and discovered the expansion of the universe. And, as telescopes became bigger and more powerful, astronomers delved deeper into the mysteries of the cosmos. In his Atlas of Astronomical Discoveries, astronomy journalist Govert Schilling tells the story of 400 years of telescopic astronomy. He looks at the 100 most important discoveries since the invention of the telescope. In his direct and accessible style, the author takes his readers on an exciting journey encompassing the highlights of four centuries of astronomy. Spectacul...

  3. Viral pathogen discovery

    Science.gov (United States)

    Chiu, Charles Y

    2015-01-01

    Viral pathogen discovery is of critical importance to clinical microbiology, infectious diseases, and public health. Genomic approaches for pathogen discovery, including consensus polymerase chain reaction (PCR), microarrays, and unbiased next-generation sequencing (NGS), have the capacity to comprehensively identify novel microbes present in clinical samples. Although numerous challenges remain to be addressed, including the bioinformatics analysis and interpretation of large datasets, these technologies have been successful in rapidly identifying emerging outbreak threats, screening vaccines and other biological products for microbial contamination, and discovering novel viruses associated with both acute and chronic illnesses. Downstream studies such as genome assembly, epidemiologic screening, and a culture system or animal model of infection are necessary to establish an association of a candidate pathogen with disease. PMID:23725672

  4. Harvest: an open platform for developing web-based biomedical data discovery and reporting applications.

    Science.gov (United States)

    Pennington, Jeffrey W; Ruth, Byron; Italia, Michael J; Miller, Jeffrey; Wrazien, Stacey; Loutrel, Jennifer G; Crenshaw, E Bryan; White, Peter S

    2014-01-01

    Biomedical researchers share a common challenge of making complex data understandable and accessible as they seek inherent relationships between attributes in disparate data types. Data discovery in this context is limited by a lack of query systems that efficiently show relationships between individual variables, but without the need to navigate underlying data models. We have addressed this need by developing Harvest, an open-source framework of modular components, and using it for the rapid development and deployment of custom data discovery software applications. Harvest incorporates visualizations of highly dimensional data in a web-based interface that promotes rapid exploration and export of any type of biomedical information, without exposing researchers to underlying data models. We evaluated Harvest with two cases: clinical data from pediatric cardiology and demonstration data from the OpenMRS project. Harvest's architecture and public open-source code offer a set of rapid application development tools to build data discovery applications for domain-specific biomedical data repositories. All resources, including the OpenMRS demonstration, can be found at http://harvest.research.chop.edu.

  5. Fateful discovery almost forgotten

    International Nuclear Information System (INIS)

    Anon.

    1989-01-01

    The paper reviews the discovery of the fission of uranium, which took place fifty years ago. A description is given of the work of Meitner and Frisch in interpreting the Fermi data on the bombardment of uranium nuclei with neutrons, i.e. proposing fission. The historical events associated with the development and exploitation of uranium fission are described, including the Manhattan Project, Hiroshima and Nagasaki, Shippingport, and Chernobyl. (U.K.)

  6. Discovery as a process

    Energy Technology Data Exchange (ETDEWEB)

    Loehle, C.

    1994-05-01

    The three great myths, which form a sort of triumvirate of misunderstanding, are the Eureka! myth, the hypothesis myth, and the measurement myth. These myths are prevalent among scientists as well as among observers of science. The Eureka! myth asserts that discovery occurs as a flash of insight, and as such is not subject to investigation. This leads to the perception that discovery or deriving a hypothesis is a moment or event rather than a process. Events are singular and not subject to description. The hypothesis myth asserts that proper science is motivated by testing hypotheses, and that if something is not experimentally testable then it is not scientific. This myth leads to absurd posturing by some workers conducting empirical descriptive studies, who dress up their study with a "hypothesis" to obtain funding or get it published. Methods papers are often rejected because they do not address a specific scientific problem. The fact is that many of the great breakthroughs in science involve methods and not hypotheses, or arise from largely descriptive studies. Those captured by this myth also try to block funding for those developing methods. The third myth is the measurement myth, which holds that determining what to measure is straightforward, so one doesn't need a lot of introspection to do science. As one ecologist put it to me, "Don't give me any of that philosophy junk, just let me out in the field. I know what to measure." These myths lead to difficulties for scientists who must face peer review to obtain funding and to get published. These myths also inhibit the study of science as a process. Finally, these myths inhibit creativity and suppress innovation. In this paper I first explore these myths in more detail and then propose a new model of discovery that opens the supposedly miraculous process of discovery to closer scrutiny.

  7. An integrative data analysis platform for gene set analysis and knowledge discovery in a data warehouse framework.

    Science.gov (United States)

    Chen, Yi-An; Tripathi, Lokesh P; Mizuguchi, Kenji

    2016-01-01

    Data analysis is one of the most critical and challenging steps in drug discovery and disease biology. A user-friendly resource to visualize and analyse high-throughput data provides a powerful medium for both experimental and computational biologists to understand vastly different biological data types and obtain a concise, simplified and meaningful output for better knowledge discovery. We have previously developed TargetMine, an integrated data warehouse optimized for target prioritization. Here we describe how upgraded and newly modelled data types in TargetMine can now survey the wider biological and chemical data space, relevant to drug discovery and development. To enhance the scope of TargetMine from target prioritization to broad-based knowledge discovery, we have also developed a new auxiliary toolkit to assist with data analysis and visualization in TargetMine. This toolkit features interactive data analysis tools to query and analyse the biological data compiled within the TargetMine data warehouse. The enhanced system enables users to discover new hypotheses interactively by performing complicated searches with no programming and obtaining the results in an easy to comprehend output format. Database URL: http://targetmine.mizuguchilab.org. © The Author(s) 2016. Published by Oxford University Press.

  8. Novel molecular diagnostic tools for malaria elimination: a review of options from the point of view of high-throughput and applicability in resource limited settings.

    Science.gov (United States)

    Britton, Sumudu; Cheng, Qin; McCarthy, James S

    2016-02-16

    As malaria transmission continues to decrease, an increasing number of countries will enter pre-elimination and elimination. To interrupt transmission, changes in control strategies are likely to require more accurate identification of all carriers of Plasmodium parasites, both symptomatic and asymptomatic, using diagnostic tools that are highly sensitive, high throughput and with fast turnaround times preferably performed in local health service settings. Currently available immunochromatographic lateral flow rapid diagnostic tests and field microscopy are unlikely to consistently detect infections at parasite densities less than 100 parasites/µL making them insufficiently sensitive for detecting all carriers. Molecular diagnostic platforms, such as PCR and LAMP, are currently available in reference laboratories, but at a cost both financially and in turnaround time. This review describes the recent progress in developing molecular diagnostic tools in terms of their capacity for high throughput and potential for performance in non-reference laboratories for malaria elimination.

  9. Understanding the Role of Medical Experts during a Public Health Crisis Digital Tools and Library Resources for Research on the 1918 Spanish Influenza.

    Science.gov (United States)

    Ewing, E Thomas; Gad, Samah; Ramakrishnan, Naren; Reznick, Jeffrey S

    2014-10-01

    Humanities scholars, particularly historians of health and disease, can benefit from digitized library collections and tools such as topic modeling. Using a case study from the 1918 Spanish Flu epidemic, this paper explores the application of a big humanities approach to understanding the impact of a public health official on the course of the disease and the response of the public, as documented through digitized newspapers and medical periodicals.
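
    The topic-modelling step behind this kind of "big humanities" analysis can be sketched with scikit-learn's LDA implementation. The three stand-in snippets below are invented for illustration; the actual study worked on large digitized newspaper and periodical corpora, and its modelling choices may differ.

```python
# Bare-bones LDA topic modelling over a few stand-in newspaper snippets.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "influenza cases rise as schools and theaters close",
    "health officer urges masks and reports new influenza deaths",
    "city council debates quarantine order and public gatherings",
]

vectorizer = CountVectorizer(stop_words="english")
dtm = vectorizer.fit_transform(docs)          # document-term matrix

lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(dtm)

terms = vectorizer.get_feature_names_out()    # requires scikit-learn >= 1.0
for i, topic in enumerate(lda.components_):
    top = [terms[j] for j in topic.argsort()[-5:][::-1]]
    print(f"topic {i}: {', '.join(top)}")
```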

  10. The Use of Online Quizlet.com Resource Tools to Support Native English Speaking Students of Engineering and Medical Departments in Accelerated RFL Teaching and Learning

    OpenAIRE

    Kh.E. Ismailova; K. Gleason; E.A. Provotorova; P.G. Matukhin

    2016-01-01

    The paper presents a description of the methodology and some results of the application of tools of the language learning support portal Quizlet.com to improve the effectiveness of the accelerated development of the basic communicative skills in Russian as a foreign language (RFL) for the group of the English-speaking students who arrived to study in Russia engineering, medicine and other areas. The application of the development is the basics of Russian teaching and learning in the classroom...

  11. The Use of Online Quizlet.com Resource Tools to Support Native English Speaking Students of Engineering and Medical Departments in Accelerated RFL Teaching and Learning

    OpenAIRE

    Ismailova, Kh.; Gleason, K.; Provotorova, P.; Matukhin, P.

    2017-01-01

    International audience; The paper presents a description of the methodology and some results of the application of tools of the language learning support portal Quizlet.com to improve the effectiveness of the accelerated development of the basic communicative skills in Russian as a foreign language (RFL) for the group of the English-speaking students who arrived to study in Russia engineering, medicine and other areas. The application of the development is the basics of Russian teaching and l...

  12. Two decision-support tools for assessing the potential effects of energy development on hydrologic resources as part of the Energy and Environment in the Rocky Mountain Area interactive energy atlas

    Science.gov (United States)

    Linard, Joshua I.; Matherne, Anne Marie; Leib, Kenneth J.; Carr, Natasha B.; Diffendorfer, James E.; Hawkins, Sarah J.; Latysh, Natalie; Ignizio, Drew A.; Babel, Nils C.

    2014-01-01

    The U.S. Geological Survey project—Energy and Environment in the Rocky Mountain Area (EERMA)—has developed a set of virtual tools in the form of an online interactive energy atlas for Colorado and New Mexico to facilitate access to geospatial data related to energy resources, energy infrastructure, and natural resources that may be affected by energy development. The interactive energy atlas currently (2014) consists of three components: (1) a series of interactive maps; (2) downloadable geospatial datasets; and (3) decision-support tools, including two maps related to hydrologic resources discussed in this report. The hydrologic-resource maps can be used to examine the potential effects of energy development on hydrologic resources with respect to (1) groundwater vulnerability, by using the depth to water, recharge, aquifer media, soil media, topography, impact of the vadose zone, and hydraulic conductivity of the aquifer (DRASTIC) model, and (2) landscape erosion potential, by using the revised universal soil loss equation (RUSLE). The DRASTIC aquifer vulnerability index value for the two-State area ranges from 48 to 199. Higher values, indicating greater relative aquifer vulnerability, are centered in south-central Colorado, areas in southeastern New Mexico, and along riparian corridors in both States—all areas where the water table is relatively close to the land surface and the aquifer is more susceptible to surface influences. As calculated by the RUSLE model, potential mean annual erosion, as soil loss in units of tons per acre per year, ranges from 0 to 12,576 over the two-State area. The RUSLE model calculated low erosion potential over most of Colorado and New Mexico, with predictions of highest erosion potential largely confined to areas of mountains or escarpments. An example is presented of how a fully interactive RUSLE model could be further used as a decision-support tool to evaluate the potential hydrologic effects of energy development on a
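
    For orientation, the two indices behind these maps can be written out directly. In the sketch below the DRASTIC weights are the commonly cited defaults and the ratings are invented, and RUSLE is expressed as the standard factor product A = R·K·LS·C·P; neither reflects the report's own parameterization or unit calibration.

```python
# Sketches of the DRASTIC and RUSLE calculations behind the decision-support maps.

# DRASTIC: weighted sum of seven hydrogeologic ratings (each rating typically 1-10).
# Weights below are the commonly cited defaults; ratings are invented examples.
DRASTIC_WEIGHTS = {
    "depth_to_water": 5, "recharge": 4, "aquifer_media": 3, "soil_media": 2,
    "topography": 1, "impact_of_vadose_zone": 5, "hydraulic_conductivity": 3,
}

def drastic_index(ratings):
    return sum(DRASTIC_WEIGHTS[k] * ratings[k] for k in DRASTIC_WEIGHTS)

example_ratings = {"depth_to_water": 7, "recharge": 6, "aquifer_media": 8,
                   "soil_media": 5, "topography": 9, "impact_of_vadose_zone": 6,
                   "hydraulic_conductivity": 4}
print("DRASTIC index:", drastic_index(example_ratings))

# RUSLE: mean annual soil loss (e.g. tons/acre/year) as the product of five factors.
def rusle_soil_loss(R, K, LS, C, P):
    return R * K * LS * C * P

print("RUSLE soil loss:", rusle_soil_loss(R=45.0, K=0.3, LS=1.2, C=0.2, P=1.0))
```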

  13. Application of electronic learning tools for training of specialists in the field of information technologies for enterprises of mineral resources sector

    Directory of Open Access Journals (Sweden)

    Е. В. Катунцов

    2017-08-01

    Full Text Available The article shows the advantages of using modern electronic learning tools in the training of specialists for the mineral and raw materials sector and considers the basic principles of organizing training with these tools. The experience of using electronic learning tools based on foreign teaching materials and involving foreign professors is described. Special attention is given to the electronic learning environment of the Cisco Networking Academy, Cisco NetAcad, and the experience of teaching at the Networking Academy of the Saint Petersburg Mining University is described. Details are given of modern virtual environments for laboratory work, such as Cisco Packet Tracer, GNS3 and Emulated Virtual Environment. The experience of using electronic learning technologies at the University of Economics in Bratislava is also considered; it actively cooperates with a number of universities in other countries, such as the University of International Business (Almaty), the Eurasian National University named after L.N. Gumilyov (Astana) and the Institute of Social and Humanitarian Knowledge (Kazan).

  14. Online Resources

    Indian Academy of Sciences (India)

    Journal of Genetics, online resources: volumes 92-97 (2013-2018).

  15. Paediatric musculoskeletal matters (pmm)--collaborative development of an online evidence based interactive learning tool and information resource for education in paediatric musculoskeletal medicine.

    Science.gov (United States)

    Smith, Nicola; Rapley, Tim; Jandial, Sharmila; English, Christine; Davies, Barbara; Wyllie, Ruth; Foster, Helen E

    2016-01-05

    We describe the collaborative development of an evidence-based, free online resource, namely 'paediatric musculoskeletal matters' (pmm). This resource was developed with the aim of reaching a wide range of health professionals to increase awareness, knowledge and skills within paediatric musculoskeletal medicine, thereby facilitating early diagnosis and referral to specialist care. Engagement with stakeholder groups (primary care, paediatrics, musculoskeletal specialties and medical students) informed the essential 'core' learning outcomes from which the content of pmm was derived. Representatives from stakeholder groups, social science and web development experts transformed the learning outcomes into a suitable framework. Target audience representatives reviewed the framework, and their opinions were gathered using an online survey (n = 74) and focus groups (n = 2). Experts in paediatric musculoskeletal medicine peer reviewed the content and design. User preferences informed the design, with mobile, tablet and web compatible versions to facilitate access, various media and formats to engage users, and the content presented in module format (i.e. Clinical assessment, Investigations and management, Limping child, Joint pain by site, Swollen joint(s) and Resources). We propose that our collaborative and evidence-based approach has ensured that pmm is user-friendly, with readily accessible, suitable content, and will help to improve access to education in paediatric musculoskeletal medicine. The content is evidence-based, and the design and functionality of pmm facilitate optimal, 'real life' access to information. pmm is targeted at medical students and the primary care environment, although its messages are transferable to all health care professionals involved in the care of children and young people.

  16. Animal Resource Program | Center for Cancer Research

    Science.gov (United States)

    CCR Animal Resource Program The CCR Animal Resource Program plans, develops, and coordinates laboratory animal resources for CCR’s research programs. We also provide training, imaging, and technology development in support of moving basic discoveries to the clinic. The ARP Manager:

  17. Animal Resource Program | Center for Cancer Research

    Science.gov (United States)

    CCR Animal Resource Program The CCR Animal Resource Program plans, develops, and coordinates laboratory animal resources for CCR’s research programs. We also provide training, imaging, and technology development in support of moving basic discoveries to the clinic. The ARP Office:

  18. BEI Resource Repository

    Data.gov (United States)

    U.S. Department of Health & Human Services — BEI Resources provides reagents, tools and information for studying Category A, B, and C priority pathogens, emerging infectious disease agents, non-pathogenic...

  19. Development of a decision support tool for water and resource management using biotic, abiotic, and hydrological assessments of Topock Marsh, Arizona

    Science.gov (United States)

    Holmquist-Johnson, Christopher; Hanson, Leanne; Daniels, Joan; Talbert, Colin; Haegele, Jeanette

    2016-05-23

    Topock Marsh is a large wetland adjacent to the Colorado River and the main feature of Havasu National Wildlife Refuge (Havasu NWR) in southern Arizona. In 2010, the U.S. Fish and Wildlife Service (FWS) and Bureau of Reclamation began a project to improve water management capabilities at Topock Marsh and protect habitats and species. Initial construction required a drawdown, which caused below-average inflows and water depths in 2010–11. U.S. Geological Survey Fort Collins Science Center (FORT) scientists collected an assemblage of biotic, abiotic, and hydrologic data from Topock Marsh during the drawdown and immediately after, thus obtaining valuable information needed by FWS. Building upon that work, FORT developed a decision support system (DSS) to better understand ecosystem health and function of Topock Marsh under various hydrologic conditions. The DSS was developed using a spatially explicit geographic information system package of historical data, habitat indices, and analytical tools to synthesize outputs for hydrologic time periods. Deliverables include high-resolution orthorectified imagery of Topock Marsh; a DSS tool that can be used by Havasu NWR to compare habitat availability associated with three hydrologic scenarios (dry, average, wet years); and this final report which details study results. This project, therefore, has addressed critical FWS management questions by integrating ecologic and hydrologic information into a DSS framework. This DSS will assist refuge management to make better-informed decisions about refuge operations and better understand the ecological results of those decisions by providing tools to identify the effects of water operations on species-specific habitat and ecological processes. While this approach was developed to help FWS use the best available science to determine more effective water management strategies at Havasu NWR, technologies used in this study could be applied elsewhere within the region.

  20. Using insects for STEM outreach: Development and evaluation of the UA Insect Discovery Program

    Science.gov (United States)

    Beal, Benjamin D.

    Science and technology impact most aspects of modern daily life. It is therefore important to create a scientifically literate society. Since the majority of Americans do not take college-level science courses, strong K-12 science education is essential. At the K-5 level, however, many teachers lack the time, resources and background for effective science teaching. Elementary teachers and students may benefit from scientist-led outreach programs created by Cooperative Extension or other institutions. One example is the University of Arizona Insect Discovery Program, which provides short-duration programming that uses insects to support science content learning, teach critical thinking and spark interest in science. We conducted evaluations of the Insect Discovery programming to determine whether the activities offered were accomplishing program goals. Pre-post tests, post-program questionnaires for teachers, and novel assessments of children's drawings were used as assessment tools. Assessments were complicated by the short duration of the program interactions with the children as well as their limited literacy. In spite of these difficulties, results of the pre-post tests indicated a significant impact on content knowledge and critical thinking skills. Based on post-program teacher questionnaires, positive impacts on interest in science learning were noted as much as a month after the children participated in the program. New programming and resources developed to widen the potential for impact are also described.