WorldWideScience

Sample records for open science grid

  1. The Open Science Grid

    Energy Technology Data Exchange (ETDEWEB)

    Pordes, Ruth; /Fermilab; Kramer, Bill; Olson, Doug; /LBL, Berkeley; Livny, Miron; Roy, Alain; /Wisconsin U., Madison; Avery, Paul; /Florida U.; Blackburn, Kent; /Caltech; Wenaus, Torre; /Brookhaven; Wurthwein, Frank; /UC, San Diego; Gardner, Rob; Wilde, Mike; /Chicago U. /Indiana U.

    2007-06-01

    The Open Science Grid (OSG) provides a distributed facility where the Consortium members provide guaranteed and opportunistic access to shared computing and storage resources. OSG provides support for and evolution of the infrastructure through activities that cover operations, security, software, troubleshooting, addition of new capabilities, and support for existing and engagement with new communities. The OSG SciDAC-2 project provides specific activities to manage and evolve the distributed infrastructure and support its use. The innovative aspects of the project are the maintenance and performance of a collaborative (shared & common) petascale national facility over tens of autonomous computing sites, for many hundreds of users, transferring terabytes of data a day, executing tens of thousands of jobs a day, and providing robust and usable resources for scientific groups of all types and sizes. More information can be found at the OSG web site: www.opensciencegrid.org.

  2. The Open Science Grid

    CERN Document Server

    Pordes, Ruth; Olson, Doug; Livny, Miron; Roy, Alain; Avery, Paul; Blackburn, Kent; Wenaus, Torre; Wuerthwein, Frank K.; Gardner, Rob; Wilde, Mike; Blatecky, Alan; McGee, John; Quick, Rob

    2007-01-01

    The Open Science Grid (OSG) provides a distributed facility where the Consortium members provide guaranteed and opportunistic access to shared computing and storage resources. OSG provides support for and evolution of the infrastructure through activities that cover operations, security, software, troubleshooting, addition of new capabilities, and support for existing and engagement with new communities. The OSG SciDAC-2 project provides specific activities to manage and evolve the distributed infrastructure and support its use. The innovative aspects of the project are the maintenance and performance of a collaborative (shared & common) petascale national facility over tens of autonomous computing sites, for many hundreds of users, transferring terabytes of data a day, executing tens of thousands of jobs a day, and providing robust and usable resources for scientific groups of all types and sizes. More information can be found at the OSG web site: www.opensciencegrid.org.

  3. The open science grid

    Energy Technology Data Exchange (ETDEWEB)

    Pordes, Ruth [Fermi National Accelerator Laboratory (United States); Petravick, Don [Fermi National Accelerator Laboratory (United States); Kramer, Bill [Lawrence Berkeley National Laboratory (United States); Olson, Doug [Lawrence Berkeley National Laboratory (United States); Livny, Miron [University of Wisconsin, Madison (United States); Roy, Alain [University of Wisconsin, Madison (United States); Avery, Paul [University of Florida (United States); Blackburn, Kent [California Institute of Technology (United States); Wenaus, Torre [Brookhaven National Laboratory (United States); Wuerthwein, Frank [University of California, San Diego (United States); Foster, Ian [University of Chicago (United States); Gardner, Rob [University of Chicago (United States); Wilde, Mike [University of Chicago (United States); Blatecky, Alan [Renaissance Computing Institute (United States); McGee, John [Renaissance Computing Institute (United States); Quick, Rob [Indiana University (United States)

    2007-07-15

    The Open Science Grid (OSG) provides a distributed facility where the Consortium members provide guaranteed and opportunistic access to shared computing and storage resources. OSG provides support for and evolution of the infrastructure through activities that cover operations, security, software, troubleshooting, addition of new capabilities, and support for existing and engagement with new communities. The OSG SciDAC-2 project provides specific activities to manage and evolve the distributed infrastructure and support its use. The innovative aspects of the project are the maintenance and performance of a collaborative (shared and common) petascale national facility over tens of autonomous computing sites, for many hundreds of users, transferring terabytes of data a day, executing tens of thousands of jobs a day, and providing robust and usable resources for scientific groups of all types and sizes. More information can be found at the OSG web site: www.opensciencegrid.org.

  4. New Science on the Open Science Grid

    CERN Document Server

    Board, The Open Science Grid Executive; Pordes, Ruth; Altunay, Mine; Avery, Paul; Bejan, Alina; Blackburn, Kent; Blatecky, Alan; Gardner, Rob; Kramer, Bill; Livny, Miron; McGee, John; Potekhin, Maxim; Quick, Rob; Olson, Doug; Roy, Alain; Sehgal, Chander; Wenaus, Torre; Wilde, Mike; Wuerthwein, Frank

    2012-01-01

    The Open Science Grid (OSG) includes work to enable new science, new scientists, and new modalities in support of computationally based research. There are frequently significant sociological and organizational changes required in transformation from the existing to the new. OSG leverages its deliverables to the large scale physics experiment member communities to benefit new communities at all scales through activities in education, engagement and the distributed facility. As a partner to the poster and tutorial at SciDAC 2008, this paper gives both a brief general description and some specific examples of new science enabled on the OSG. More information is available at the OSG web site: (http://www.opensciencegrid.org).

  5. New science on the Open Science Grid

    Energy Technology Data Exchange (ETDEWEB)

    Pordes, R; Altunay, M; Sehgal, C [Fermi National Accelerator Laboratory, Batavia, IL 60510 (United States); Avery, P [University of Florida, Gainesville, FL 32611 (United States); Bejan, A; Gardner, R; Wilde, M [University of Chicago, Chicago, IL 60607 (United States); Blackburn, K [California Institute of Technology, Pasadena, CA 91125 (United States); Blatecky, A; McGee, J [Renaissance Computing Institute, Chapel Hill, NC 27517 (United States); Kramer, B; Olson, D; Roy, A [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Livny, M [University of Wisconsin, Madison, Madison, WI 53706 (United States); Potekhin, M; Quick, R; Wenaus, T [Indiana University, Bloomington, IN 47405 (United States); Wuerthwein, F [University of California, San Diego, La Jolla, CA 92093 (United States)], E-mail: ruth@fnal.gov

    2008-07-15

    The Open Science Grid (OSG) includes work to enable new science, new scientists, and new modalities in support of computationally based research. There are frequently significant sociological and organizational changes required in transformation from the existing to the new. OSG leverages its deliverables to the large-scale physics experiment member communities to benefit new communities at all scales through activities in education, engagement, and the distributed facility. This paper gives both a brief general description and specific examples of new science enabled on the OSG. More information is available at the OSG web site: www.opensciencegrid.org.

  6. New Science on the Open Science Grid

    Energy Technology Data Exchange (ETDEWEB)

    Pordes, Ruth; Altunay, Mine; Avery, Paul; Bejan, Alina; Blackburn, Kent; Blatecky, Alan; Gardner, Rob; Kramer, Bill; Livny, Miron; McGee, John; Potekhin, Maxim; /Fermilab /Florida U. /Chicago U. /Caltech /LBL, Berkeley /Wisconsin U., Madison /Indiana U. /Brookhaven /UC, San Diego

    2008-06-01

    The Open Science Grid (OSG) includes work to enable new science, new scientists, and new modalities in support of computationally based research. There are frequently significant sociological and organizational changes required in transformation from the existing to the new. OSG leverages its deliverables to the large scale physics experiment member communities to benefit new communities at all scales through activities in education, engagement and the distributed facility. As a partner to the poster and tutorial at SciDAC 2008, this paper gives both a brief general description and some specific examples of new science enabled on the OSG. More information is available at the OSG web site: www.opensciencegrid.org.

  7. The Open Science Grid status and architecture

    Energy Technology Data Exchange (ETDEWEB)

    Pordes, R; Petravick, D [Fermi National Accelerator Laboratory (United States); Kramer, B; Olson, D [Lawrence Berkeley National Laboratory (United States); Livny, M; Roy, A [University of Wisconsin, Madison (United States); Avery, P [University of Florida (United States); Blackburn, K [California Institute of Technology (United States); Wenaus, T [Brookhaven National Laboratory (United States); Wuerthwein, F [University of California, San Diego (United States); Foster, I; Gardner, R; Wilde, M [University of Chicago (United States); Blatecky, A; McGee, J [Renaissance Computing Institute (United States); Quick, R [Indiana University (United States)], E-mail: ruth@fnal.gov

    2008-07-15

    The Open Science Grid (OSG) provides a distributed facility where the Consortium members provide guaranteed and opportunistic access to shared computing and storage resources. The OSG project[1] is funded by the National Science Foundation and the Department of Energy Scientific Discovery through Advanced Computing program. The OSG project provides specific activities for the operation and evolution of the common infrastructure. The US ATLAS and US CMS collaborations contribute to and depend on OSG as the US infrastructure contributing to the World Wide LHC Computing Grid on which the LHC experiments distribute and analyze their data. Other stakeholders include the STAR RHIC experiment, the Laser Interferometer Gravitational-Wave Observatory (LIGO), the Dark Energy Survey (DES) and several Fermilab Tevatron experiments (CDF, D0, MiniBooNE, etc.). The OSG implementation architecture brings a pragmatic approach to enabling vertically integrated community specific distributed systems over a common horizontal set of shared resources and services. More information can be found at the OSG web site: www.opensciencegrid.org.

  8. The Open Science Grid status and architecture

    Energy Technology Data Exchange (ETDEWEB)

    Pordes, Ruth; Petravick, Don; /Fermilab; Kramer, Bill; Olsen, James D.; /LBL, Berkeley; Livny, Miron; Roy, Gordon A.; /Wisconsin U., Madison; Avery, Paul Ralph; /Florida U.; Blackburn, Kent; /Caltech; Wenaus, Torre J.; /Brookhaven; Wuerthwein, Frank K.; /UC, San Diego; Foster, Ian; /Chicago U. /Indiana U.

    2007-09-01

    The Open Science Grid (OSG) provides a distributed facility where the Consortium members provide guaranteed and opportunistic access to shared computing and storage resources. The OSG project[1] is funded by the National Science Foundation and the Department of Energy Scientific Discovery through Advanced Computing program. The OSG project provides specific activities for the operation and evolution of the common infrastructure. The US ATLAS and US CMS collaborations contribute to and depend on OSG as the US infrastructure contributing to the World Wide LHC Computing Grid on which the LHC experiments distribute and analyze their data. Other stakeholders include the STAR RHIC experiment, the Laser Interferometer Gravitational-Wave Observatory (LIGO), the Dark Energy Survey (DES) and several Fermilab Tevatron experiments (CDF, D0, MiniBooNE, etc.). The OSG implementation architecture brings a pragmatic approach to enabling vertically integrated community specific distributed systems over a common horizontal set of shared resources and services. More information can be found at the OSG web site: www.opensciencegrid.org.

  9. Public storage for the Open Science Grid

    Science.gov (United States)

    Levshina, T.; Guru, A.

    2014-06-01

    The Open Science Grid infrastructure doesn't provide efficient means to manage public storage offered by participating sites. A Virtual Organization that relies on opportunistic storage has difficulties finding appropriate storage, verifying its availability, and monitoring its utilization. The involvement of the production manager, site administrators and VO support personnel is required to allocate or rescind storage space. One of the main requirements for Public Storage implementation is that it should use SRM or GridFTP protocols to access the Storage Elements provided by the OSG Sites and not put any additional burden on sites. By policy, no new services related to Public Storage can be installed and run on OSG sites. Opportunistic users also have difficulties in accessing the OSG Storage Elements during the execution of jobs. A typical user's data management workflow includes pre-staging common data on sites before a job's execution, then storing the output data produced by a job on a worker node for subsequent download to the user's local institution. When the amount of data is significant, the only means to temporarily store the data is to upload it to one of the Storage Elements. In order to do that, a user's job should be aware of the storage location, availability, and free space. After a successful data upload, users must somehow keep track of the data's location for future access. In this presentation we propose solutions for storage management and data handling issues in the OSG. We are investigating the feasibility of using the integrated Rule-Oriented Data System developed at RENCI as a front-end service to the OSG SEs. The current architecture, state of deployment and performance test results will be discussed. We will also provide examples of current usage of the system by beta-users.
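
    The job-side workflow described above (upload an output file to a Storage Element over GridFTP, then remember where it went) can be illustrated with a minimal sketch. The gsiftp endpoint, the JSON catalogue file and the use of globus-url-copy are illustrative assumptions, not the iRODS-based service proposed in the record.

      #!/usr/bin/env python3
      """Hypothetical sketch of the data-handling workflow described above:
      upload a worker-node output file to an opportunistic Storage Element and
      record its location for later retrieval.  Endpoint and tool choice are
      illustrative assumptions, not OSG-mandated components."""
      import json, subprocess, time
      from pathlib import Path

      CATALOGUE = Path("uploaded_files.json")   # stand-in for a real data catalogue

      def upload_output(local_file: str, se_url: str) -> str:
          """Copy an output file to a Storage Element via GridFTP."""
          dest = se_url.rstrip("/") + "/" + Path(local_file).name
          # globus-url-copy <source> <destination> (standard GridFTP client)
          subprocess.run(["globus-url-copy",
                          f"file://{Path(local_file).resolve()}", dest], check=True)
          return dest

      def record_location(local_file: str, dest: str) -> None:
          """Keep track of where the data went, so it can be downloaded later."""
          entries = json.loads(CATALOGUE.read_text()) if CATALOGUE.exists() else []
          entries.append({"file": local_file, "url": dest, "uploaded": time.time()})
          CATALOGUE.write_text(json.dumps(entries, indent=2))

      if __name__ == "__main__":
          # Hypothetical SE endpoint; a real job would discover this from VO configuration.
          url = upload_output("results.tar.gz", "gsiftp://se.example.org:2811/osg/public")
          record_location("results.tar.gz", url)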

  10. Open computing grid for molecular science and engineering.

    Science.gov (United States)

    Sild, Sulev; Maran, Uko; Lomaka, Andre; Karelson, Mati

    2006-01-01

    Grid is an emerging infrastructure for distributed computing that provides secure and scalable mechanisms for discovering and accessing remote software and data resources. Applications built on this infrastructure have great potential for addressing and solving large scale chemical, pharmaceutical, and material science problems. The article describes the concept behind grid computing and presents the OpenMolGRID system, an open computing grid for molecular science and engineering. This system provides grid enabled components, such as a data warehouse for chemical data, software for building QSPR/QSAR models, and molecular engineering tools for generating compounds with predefined chemical properties or biological activities. The article also provides an overview of the availability of chemical applications on the grid.

  11. Pilot job accounting and auditing in Open Science Grid

    Energy Technology Data Exchange (ETDEWEB)

    Sfiligoi, Igor; Green, Chris; /Fermilab; Quinn, Greg; Thain, Greg; /Wisconsin U., Madison

    2008-06-01

    The Grid accounting and auditing mechanisms were designed under the assumption that users would submit their jobs directly to the Grid gatekeepers. However, many groups are starting to use pilot-based systems, where users submit jobs to a centralized queue from which they are subsequently transferred to the Grid resources by the pilot infrastructure. While this approach greatly improves the user experience, it does disrupt the established accounting and auditing procedures. Open Science Grid deploys gLExec on the worker nodes to keep the pilot-related accounting and auditing information and centralizes the accounting collection with GRATIA.
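
    The pilot pattern described above can be sketched roughly as follows: a pilot lands on a worker node, repeatedly pulls user jobs from the VO's central queue, and wraps each payload in gLExec so the site's accounting sees the real user identity. The queue URL, the job record fields and the environment-variable name used to pass the user's proxy are illustrative assumptions.

      #!/usr/bin/env python3
      """Minimal sketch of a pilot job loop under stated assumptions; not the
      actual OSG pilot infrastructure."""
      import json, os, subprocess, urllib.request

      QUEUE_URL = "https://vo-frontend.example.org/next-job"   # hypothetical central queue

      def fetch_job():
          """Ask the VO's central queue for the next user job (None when drained)."""
          with urllib.request.urlopen(QUEUE_URL) as resp:
              body = resp.read()
          return json.loads(body) if body else None

      def run_under_user_identity(job):
          """Wrap the payload in gLExec so it executes as the submitting user."""
          env = dict(os.environ, GLEXEC_CLIENT_CERT=job["user_proxy"])  # assumed variable name
          subprocess.run(["glexec", job["executable"], *job.get("args", [])],
                         env=env, check=False)

      if __name__ == "__main__":
          while True:
              job = fetch_job()
              if job is None:
                  break          # no more work: the pilot exits and frees the slot
              run_under_user_identity(job)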

  12. DZero data-intensive computing on the Open Science Grid

    Science.gov (United States)

    Abbott, B.; Baranovski, A.; Diesburg, M.; Garzoglio, G.; Kurca, T.; Mhashilkar, P.

    2008-07-01

    High energy physics experiments periodically reprocess data, in order to take advantage of improved understanding of the detector and the data processing code. Between February and May 2007, the DZero experiment reprocessed a substantial fraction of its dataset. This consists of half a billion events, corresponding to about 100 TB of data, organized in 300,000 files. The activity utilized resources from sites around the world, including a dozen sites participating in the Open Science Grid consortium (OSG). About 1,500 jobs were run every day across the OSG, consuming and producing hundreds of Gigabytes of data. Access to OSG computing and storage resources was coordinated by the SAM-Grid system. This system organized job access to a complex topology of data queues and job scheduling to clusters, using a SAM-Grid to OSG job forwarding infrastructure. For the first time in the lifetime of the experiment, a data intensive production activity was managed on a general purpose grid, such as OSG. This paper describes the implications of using OSG, where all resources are granted following an opportunistic model, the challenges of operating a data intensive activity over such a large computing infrastructure, and the lessons learned throughout the project.
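
    A back-of-the-envelope reading of the numbers quoted above (half a billion events, about 100 TB, 300,000 files) gives the per-file averages implied but not stated in the record; the short snippet below just does that arithmetic.

      # Derived averages implied by the figures quoted above (not stated in the paper).
      events = 0.5e9          # half a billion events
      data_tb = 100           # about 100 TB of data
      files = 300_000         # organized in 300,000 files

      print(f"average file size : {data_tb * 1e6 / files:,.0f} MB")   # about 330 MB per file
      print(f"events per file   : {events / files:,.0f}")             # about 1,700 events per file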

  13. Automatic Integration Testbeds validation on Open Science Grid

    Science.gov (United States)

    Caballero, J.; Thapa, S.; Gardner, R.; Potekhin, M.

    2011-12-01

    A recurring challenge in deploying high quality production middleware is the extent to which realistic testing occurs before release of the software into the production environment. We describe here an automated system for validating releases of the Open Science Grid software stack that leverages the (pilot-based) PanDA job management system developed and used by the ATLAS experiment. The system was motivated by a desire to subject the OSG Integration Testbed to more realistic validation tests, in particular tests that resemble as closely as possible the actual job workflows used by the experiments, thus exercising job scheduling at the compute element (CE), the worker node execution environment, transfer of data to/from the local storage element (SE), etc. The context is that candidate releases of OSG compute and storage elements can be tested by injecting large numbers of synthetic jobs varying in complexity and coverage of services tested. The native capabilities of the PanDA system can thus be used to define jobs, monitor their execution, and archive the resulting run statistics including success and failure modes. A repository of generic workflows and job types to measure various metrics of interest has been created. A command-line toolset has been developed so that testbed managers can quickly submit "VO-like" jobs into the system when newly deployed services are ready for testing. A system for automatic submission has been crafted to send jobs to integration testbed sites, collecting the results in a central service and generating regular reports for performance and reliability.
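
    The reporting step mentioned above (archive synthetic-job results, then summarize reliability per site) can be sketched as below. The record format (site, workflow, status) is an assumed stand-in for whatever the PanDA archive actually stores, not its real schema.

      #!/usr/bin/env python3
      """Hedged sketch: aggregate archived test-job records into a per-site
      reliability summary.  Field names are assumptions for illustration."""
      import json, sys
      from collections import defaultdict

      def summarize(records):
          """Return {site: [succeeded, total]} from a list of job records."""
          tally = defaultdict(lambda: [0, 0])
          for rec in records:
              counts = tally[rec["site"]]
              counts[1] += 1
              if rec["status"] == "finished":
                  counts[0] += 1
          return tally

      if __name__ == "__main__":
          # Expects a JSON array of {"site": ..., "workflow": ..., "status": ...} on stdin.
          records = json.load(sys.stdin)
          for site, (ok, total) in sorted(summarize(records).items()):
              print(f"{site:30s} {ok:5d}/{total:<5d} ({100.0 * ok / total:5.1f}% success)")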

  14. Snowmass Energy Frontier Simulations using the Open Science Grid (A Snowmass 2013 whitepaper)

    Energy Technology Data Exchange (ETDEWEB)

    Avetisyan, Aram [Boston Univ., MA (United States); Bhattacharya, Saptaparna [Brown Univ., Providence, RI (United States); Narain, Meenakshi [Brown Univ., Providence, RI (United States); Padhi, Sanjay [Univ. of California, San Diego, CA (United States); Hirschauer, Jim [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Levshina, Tanya [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); McBride, Patricia [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Sehgal, Chander [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Slyz, Marko [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Rynge, Mats [Information Sciences Inst., Marina del Rey, CA (United States); Malik, Sudhir [Univ. of Nebraska, Lincoln, NE (United States); Stupak, III, John [Purdue Univ. Northwest, Hammond, IN (United States)

    2013-08-04

    Snowmass is a US long-term planning study for the high-energy community by the American Physical Society's Division of Particles and Fields. For its simulation studies, opportunistic resources are harnessed using the Open Science Grid infrastructure. Late binding grid technology, GlideinWMS, was used for distributed scheduling of the simulation jobs across many sites mainly in the US. The pilot infrastructure also uses the Parrot mechanism to dynamically access CvmFS in order to ensure a homogeneous environment across the nodes. This report presents the resource usage and the storage model used for simulating large statistics Standard Model backgrounds needed for Snowmass Energy Frontier studies.
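
    The Parrot/CVMFS access pattern mentioned above can be illustrated roughly as follows: the payload is launched through parrot_run so that a /cvmfs software tree appears even on worker nodes without a native CVMFS mount. The repository path is illustrative, and a working Parrot/CVMFS configuration is taken as given.

      #!/usr/bin/env python3
      """Rough illustration of running a payload inside the Parrot virtual
      filesystem; configuration details are assumed, not shown."""
      import subprocess

      def run_with_cvmfs(cmd):
          """Launch cmd (a list of strings) under parrot_run so /cvmfs is visible."""
          return subprocess.run(["parrot_run", *cmd], check=False).returncode

      if __name__ == "__main__":
          # List an experiment software area exported via CVMFS (path for illustration).
          run_with_cvmfs(["ls", "/cvmfs/cms.cern.ch"])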

  15. Snowmass Energy Frontier Simulations using the Open Science Grid (A Snowmass 2013 whitepaper)

    CERN Document Server

    Avetisyan, A; Narain, M; Padhi, S; Hirschauer, J; Levshina, T; McBride, P; Sehgal, C; Slyz, M; Rynge, M; Malik, S; Stupak, J

    2013-01-01

    Snowmass is a US long-term planning study for the high-energy community by the American Physical Society's Division of Particles and Fields. For its simulation studies, opportunistic resources are harnessed using the Open Science Grid infrastructure. Late binding grid technology, GlideinWMS, was used for distributed scheduling of the simulation jobs across many sites mainly in the US. The pilot infrastructure also uses the Parrot mechanism to dynamically access CvmFS in order to ensure a homogeneous environment across the nodes. This report presents the resource usage and the storage model used for simulating large statistics Standard Model backgrounds needed for Snowmass Energy Frontier studies.

  16. Open science grid: Building and sustaining general cyberinfrastructure using a collaborative approach

    OpenAIRE

    2007-01-01

    I describe in this paper the creation and operation of the Open Science Grid (OSG [1]), a distributed shared cyberinfrastructure driven by the milestones of a diverse group of research communities. The effort is fundamentally collaborative, with domain scientists, computer scientists and technology specialists and providers from more than 70 U.S. universities, national laboratories and organizations providing resources, tools and expertise. The evolving OSG facility provides computing and sto...

  17. The Open Science Grid – Support for Multi-Disciplinary Team Science – the Adolescent Years

    CERN Document Server

    CERN. Geneva

    2012-01-01

    As it enters adolescence the Open Science Grid (OSG) is bringing a maturing fabric of Distributed High Throughput Computing (DHTC) services that supports an expanding HEP community to an increasingly diverse spectrum of domain scientists. Working closely with researchers on campuses throughout the US and in collaboration with national cyberinfrastructure initiatives, we transform their computing environment through new concepts, advanced tools and deep experience. We discuss examples of these including: the pilot-job overlay concepts and technologies now in use throughout OSG and delivering 1.4 Million CPU hours/day; the role of campus infrastructures- built out from concepts of sharing across multiple local faculty clusters (made good use of already by many of the HEP Tier-2 sites in the US); the work towards the use of clouds and access to high throughput parallel (multi-core and GPU) compute resources; and the progress we are making towards meeting the data management and access needs of non-HEP communiti...

  18. CMS Usage of the Open Science Grid and the US Tier-2 Centers

    CERN Document Server

    Mohapatra, A

    2009-01-01

    The CMS experiment has been using the Open Science Grid, through its US Tier-2 computing centers, from its very beginning for production of Monte Carlo simulations. In this talk we will describe the evolution of the usage patterns indicating the best practices that have been identified. In addition to describing the production metrics and how they have been met, we will also present the problems encountered and mitigating solutions. Data handling and the user analysis patterns on the Tier-2 and OSG computing will be described.

  19. The event notification and alarm system for the Open Science Grid operations center

    Science.gov (United States)

    Hayashi, S.; Teige, S.; Quick, R.

    2012-12-01

    The Open Science Grid (OSG) Operations Team operates a distributed set of services and tools that enable the utilization of the OSG by several HEP projects. Without these services users of the OSG would not be able to run jobs, locate resources, obtain information about the status of systems or generally use the OSG. For this reason these services must be highly available. This paper describes the automated monitoring and notification systems used to diagnose and report problems. Described here are the means used by OSG Operations to monitor systems such as physical facilities, network operations, server health, service availability and software error events. Once detected, an error condition generates a message sent to, for example, Email, SMS, Twitter, an Instant Message Server, etc. The mechanism being developed to integrate these monitoring systems into a prioritized and configurable alarming system is emphasized.
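
    A minimal sketch of the prioritized, configurable alarming idea described above: an error event is routed to one or more notification channels depending on its severity. The channel names, addresses and severity map are assumptions for illustration, not the OSG Operations configuration.

      #!/usr/bin/env python3
      """Toy alarm dispatcher: route monitoring events to channels by severity."""
      import smtplib

      # Assumed routing policy: which channels fire for which severity.
      ROUTES = {"critical": ["email", "sms"], "warning": ["email"], "info": []}

      def send_email(subject, body):
          msg = f"Subject: {subject}\n\n{body}"
          # Assumes a local mail relay; a real deployment would use the operations mail gateway.
          with smtplib.SMTP("localhost") as smtp:
              smtp.sendmail("alarms@example.org", ["oncall@example.org"], msg)

      def send_sms(subject, body):
          # Placeholder for an SMS gateway (or Twitter / instant-message bridge).
          print(f"[SMS] {subject}: {body}")

      CHANNELS = {"email": send_email, "sms": send_sms}

      def alarm(severity, subject, body):
          """Dispatch one monitoring event to every channel configured for its severity."""
          for name in ROUTES.get(severity, []):
              CHANNELS[name](subject, body)

      if __name__ == "__main__":
          alarm("critical", "CE gatekeeper down", "site X stopped accepting jobs")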

  20. OASIS: a data and software distribution service for Open Science Grid

    Science.gov (United States)

    Bockelman, B.; Caballero Bejar, J.; De Stefano, J.; Hover, J.; Quick, R.; Teige, S.

    2014-06-01

    The Open Science Grid encourages the concept of software portability: a user's scientific application should be able to run at as many sites as possible. It is necessary to provide a mechanism for OSG Virtual Organizations to install software at sites. Since its initial release, the OSG Compute Element has provided an application software installation directory to Virtual Organizations, where they can create their own sub-directory, install software into that sub-directory, and have the directory shared on the worker nodes at that site. The current model has shortcomings with regard to permissions, policies, versioning, and the lack of a unified, collective procedure or toolset for deploying software across all sites. Therefore, a new mechanism for data and software distribution is desirable. The architecture for the OSG Application Software Installation Service (OASIS) is a server-client model: the software and data are installed only once in a single place, and are automatically distributed to all client sites simultaneously. Central file distribution offers other advantages, including server-side authentication and authorization, activity records, quota management, data validation and inspection, and well-defined versioning and deletion policies. The architecture, as well as a complete analysis of the current implementation, will be described in this paper.
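
    The install-once, distribute-everywhere idea behind the server-client model above can be sketched as a publication step on the central server: copy a release into a versioned directory and flip a "current" link, after which the distribution layer propagates the tree to all sites. Paths and layout are illustrative assumptions, not the actual OASIS implementation.

      #!/usr/bin/env python3
      """Hedged sketch of a central software publication step with versioning."""
      import shutil
      from pathlib import Path

      REPO_ROOT = Path("/srv/oasis/myvo")   # hypothetical central publication area

      def publish(release_dir: str, version: str) -> Path:
          """Copy a prepared release into the repository and mark it as current."""
          target = REPO_ROOT / "releases" / version
          shutil.copytree(release_dir, target)        # install exactly once, centrally
          current = REPO_ROOT / "current"
          if current.is_symlink() or current.exists():
              current.unlink()
          current.symlink_to(target)                  # well-defined versioning: one atomic switch
          return target

      if __name__ == "__main__":
          publish("./build/v1.2.0", "v1.2.0")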

  1. Open Science Grid (OSG) Ticket Synchronization: Keeping Your Home Field Advantage In A Distributed Environment

    Science.gov (United States)

    Gross, Kyle; Hayashi, Soichi; Teige, Scott; Quick, Robert

    2012-12-01

    Large distributed computing collaborations, such as the Worldwide LHC Computing Grid (WLCG), face many issues when it comes to providing a working grid environment for their users. One of these is exchanging tickets between various ticketing systems in use by grid collaborations. Ticket systems such as Footprints, RT, Remedy, and ServiceNow all have different schema that must be addressed in order to provide a reliable exchange of information between support entities and users in different grid environments. To combat this problem, OSG Operations has created a ticket synchronization interface called GOC-TX that relies on web services instead of error-prone email parsing methods of the past. Synchronizing tickets between different ticketing systems allows any user or support entity to work on a ticket in their home environment, thus providing a familiar and comfortable place to provide updates without having to learn another ticketing system. The interface is generic enough that, with only minor changes, it can be customized for nearly any ticketing system with a web-service interface. This allows us to be flexible and rapidly bring new ticket synchronization online. Synchronization can be triggered by different methods including mail, web services interface, and active messaging. GOC-TX currently interfaces with Global Grid User Support (GGUS) for WLCG, Remedy at Brookhaven National Lab (BNL), and Request Tracker (RT) at the Virtual Data Toolkit (VDT). Work is progressing on the Fermi National Accelerator Laboratory (FNAL) ServiceNow synchronization. This paper will explain the problems faced by OSG and how they led OSG to create and implement this ticket synchronization system along with the technical details that allow synchronization to be performed at a production level.
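
    The adapter idea described above (wrap each ticketing backend behind a common interface so the synchronizer never sees a native schema) can be sketched as follows. The class and field names are assumptions for illustration, not the GOC-TX code.

      #!/usr/bin/env python3
      """Illustrative sketch of a generic ticket-synchronization adapter layer."""

      class TicketAdapter:
          """Minimal common interface a backend (GGUS, RT, Remedy, ServiceNow...) must offer."""
          def fetch(self, ticket_id):            # return a dict in the common schema
              raise NotImplementedError
          def update(self, ticket_id, common):   # push a common-schema update to the backend
              raise NotImplementedError

      class InMemoryAdapter(TicketAdapter):
          """Toy backend used here in place of a real web-service client."""
          def __init__(self):
              self.tickets = {}
          def fetch(self, ticket_id):
              return self.tickets.get(ticket_id, {})
          def update(self, ticket_id, common):
              self.tickets.setdefault(ticket_id, {}).update(common)

      def synchronize(ticket_id, source: TicketAdapter, target: TicketAdapter):
          """Copy the latest state of a ticket from one system to the other."""
          target.update(ticket_id, source.fetch(ticket_id))

      if __name__ == "__main__":
          ggus, rt = InMemoryAdapter(), InMemoryAdapter()
          ggus.update("OSG-1234", {"status": "open", "comment": "CE not publishing"})
          synchronize("OSG-1234", ggus, rt)
          print(rt.fetch("OSG-1234"))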

  2. Grid for Earth Science Applications

    Science.gov (United States)

    Petitdidier, Monique; Schwichtenberg, Horst

    2013-04-01

    Civil society at large has addressed many strong requirements to the Earth Science community, related in particular to natural and industrial risks, climate change and new energies. The critical point is that, on the one hand, civil society and the public at large ask for certainties, i.e. precise values with small error ranges, for predictions at short, medium and long term in all domains; on the other hand, science can mostly answer only in terms of probability of occurrence. To improve the answers and/or decrease the uncertainties, (1) new observational networks have been deployed to obtain better geographical coverage, and more accurate measurements have been carried out at key locations and aboard satellites. Following the OECD recommendations on the openness of research and public sector data, more and more data are available to academic organisations and SMEs; (2) new algorithms and methodologies have been developed to cope with the huge data processing and assimilation into simulations, using new technologies and compute resources. Finally, our total knowledge about the complex Earth system is contained in models and measurements; how we put them together has to be managed cleverly. The technical challenge is to put together databases and computing resources to answer the Earth Science challenges. However, all these applications are computationally intensive. Different compute solutions are available, and the choice depends on the characteristics of the applications. One of them is the Grid, which is especially efficient for independent or embarrassingly parallel jobs such as statistical and parametric studies. Numerous applications in atmospheric chemistry, meteorology, seismology, hydrology, pollution, climate and biodiversity have been deployed successfully on the Grid. In order to fulfill the requirements of risk management, several prototype applications have been deployed using OGC (Open Geospatial Consortium) components with Grid middleware. The Grid has permitted, via a huge number of runs, to
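
    The embarrassingly parallel parametric studies mentioned above map naturally onto many independent grid jobs, one per parameter set. The sketch below generates such a sweep; the model executable, options and parameter ranges are hypothetical.

      #!/usr/bin/env python3
      """Sketch of turning a parametric study into independent grid jobs."""
      import itertools
      from pathlib import Path

      # Hypothetical parameter sweep for an atmospheric-chemistry run.
      emission_factors = [0.8, 1.0, 1.2]
      wind_fields = ["ecmwf", "gfs"]

      out = Path("jobs")
      out.mkdir(exist_ok=True)
      for i, (ef, wind) in enumerate(itertools.product(emission_factors, wind_fields)):
          # One self-contained job script per parameter combination; each can be
          # submitted to any grid site independently of the others.
          (out / f"job_{i:03d}.sh").write_text(
              f"#!/bin/sh\n./run_model --emission-factor {ef} --wind {wind}\n")
      print(f"generated {i + 1} independent jobs")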

  3. Sustaining and Extending the Open Science Grid: Science Innovation on a PetaScale Nationwide Facility (DE-FC02-06ER41436) SciDAC-2 Closeout Report

    Energy Technology Data Exchange (ETDEWEB)

    Livny, Miron [Univ. of Wisconsin, Madison, WI (United States); Shank, James [Boston Univ., MA (United States); Ernst, Michael [Brookhaven National Lab. (BNL), Upton, NY (United States); Blackburn, Kent [California Inst. of Technology (CalTech), Pasadena, CA (United States); Goasguen, Sebastien [Clemson Univ., SC (United States); Tuts, Michael [Columbia Univ., New York, NY (United States); Gibbons, Lawrence [Cornell Univ., Ithaca, NY (United States); Pordes, Ruth [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Sliz, Piotr [Harvard Medical School, Boston, MA (United States); Deelman, Ewa [Univ. of Southern California, Los Angeles, CA (United States). Information Sciences Inst.; Barnett, William [Indiana Univ., Bloomington, IN (United States); Olson, Doug [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); McGee, John [Univ. of North Carolina, Chapel Hill, NC (United States). Renaissance Computing Inst.; Cowles, Robert [SLAC National Accelerator Lab., Menlo Park, CA (United States); Wuerthwein, Frank [Univ. of California, San Diego, CA (United States); Gardner, Robert [Univ. of Chicago, IL (United States); Avery, Paul [Univ. of Florida, Gainesville, FL (United States); Wang, Shaowen [Univ. of Illinois, Champaign, IL (United States); Univ. of Iowa, Iowa City, IA (United States); Lincoln, David Swanson [Univ. of Nebraska, Lincoln, NE (United States)

    2015-02-11

    Under this SciDAC-2 grant the project’s goal was to stimulate new discoveries by providing scientists with effective and dependable access to an unprecedented national distributed computational facility: the Open Science Grid (OSG). We proposed to achieve this through the work of the Open Science Grid Consortium: a unique hands-on multi-disciplinary collaboration of scientists, software developers and providers of computing resources. Together the stakeholders in this consortium sustain and use a shared distributed computing environment that transforms simulation and experimental science in the US. The OSG consortium is an open collaboration that actively engages new research communities. We operate an open facility that brings together a broad spectrum of compute, storage, and networking resources and interfaces to other cyberinfrastructures, including the US XSEDE (previously TeraGrid), the European Enabling Grids for E-sciencE (EGEE), as well as campus and regional grids. We leverage middleware provided by computer science groups, facility IT support organizations, and computing programs of application communities for the benefit of consortium members and the US national CI.

  4. Scalable Open Source Smart Grid Simulator (SGSim)

    DEFF Research Database (Denmark)

    Ebeid, Emad Samuel Malki; Jacobsen, Rune Hylsberg; Quaglia, Davide

    2017-01-01

    This paper presents an open source smart grid simulator (SGSim). The simulator is based on open source SystemC Network Simulation Library (SCNSL) and aims to model scalable smart grid applications. SGSim has been tested under different smart grid scenarios that contain hundreds of thousands of households...

  5. Opening science to the world; opening the world to science

    CERN Multimedia

    Andrew Purcell

    2015-01-01

    ‘Engaging the research community towards an Open Science Commons’ was the main theme of the European Grid Infrastructure (EGI) annual conference that was held in Lisbon from 18 to 22 May. At the conference, the EGI-Engage project was launched and the European Open Science Cloud was discussed. Tiziana Ferrari, technical director of EGI.eu, speaks at the EGI Annual conference in Lisbon this year. The EGI-Engage project was launched during the opening session of the conference by Tiziana Ferrari, technical director of EGI.eu. This project, which has been funded through the EU’s Horizon 2020 Framework Programme for Research and Innovation, aims to accelerate progress towards the implementation of the Open Science Commons. It seeks to do so by expanding the capabilities of a European backbone of federated services for computing, storage, data, communication, knowledge and expertise, as well as related community-specific capabilities...

  6. Trends in life science grid: from computing grid to knowledge grid

    OpenAIRE

    Konagaya Akihiko

    2006-01-01

    Background: Grid computing has great potential to become a standard cyberinfrastructure for life sciences, which often require high-performance computing and large data handling that exceeds the computing capacity of a single institution. Results: This survey reviews the latest grid technologies from the viewpoints of computing grid, data grid and knowledge grid. Computing grid technologies have matured enough to solve high-throughput real-world life scientific problems. Data grid...

  7. Open Education and the Open Science Economy

    Science.gov (United States)

    Peters, Michael A.

    2009-01-01

    Openness as a complex code word for a variety of digital trends and movements has emerged as an alternative mode of "social production" based on the growing and overlapping complexities of open source, open access, open archiving, open publishing, and open science. This paper argues that the openness movement with its reinforcing structure of…

  8. Open Media Science

    DEFF Research Database (Denmark)

    Møller Moltke Martiny, Kristian; Pedersen, David Budtz; Hansted, Allan Alfred Birkegaard

    2016-01-01

    In this article, we present three challenges to the emerging Open Science (OS) movement: the challenge of communication, collaboration and cultivation of scientific research. We argue that to address these challenges OS needs to include other forms of data than what can be captured in a text...... and extend into a fully-fledged Open Media movement engaging with new media and non-traditional formats of science communication. We discuss two cases where experiments with open media have driven new collaborations between scientists and documentarists. We use the cases to illustrate different advantages...

  10. Changing from computing grid to knowledge grid in life-science grid.

    Science.gov (United States)

    Talukdar, Veera; Konar, Amit; Datta, Ayan; Choudhury, Anamika Roy

    2009-09-01

    Grid computing has a great potential to become a standard cyber infrastructure for life sciences that often require high-performance computing and large data handling, which exceeds the computing capacity of a single institution. Grid computing applies the resources of many computers in a network to a single problem at the same time. It is useful for scientific problems that require a great number of computer processing cycles or access to a large amount of data. As biologists, we are constantly discovering millions of genes and genome features, which are assembled in a library and distributed on computers around the world. This means that new, innovative methods must be developed that exploit the resources available for extensive calculations - for example grid computing. This survey reviews the latest grid technologies from the viewpoints of computing grid, data grid and knowledge grid. Computing grid technologies have matured enough to solve high-throughput real-world life scientific problems. Data grid technologies are strong candidates for realizing a "resourceome" for bioinformatics. Knowledge grids should be designed not only for sharing explicit knowledge on computers but also for community formulation for sharing tacit knowledge among a community. By extending the concept of grid from computing grid to knowledge grid, it is possible to make use of a grid as not only sharable computing resources, but also as the time and place in which people work together, create knowledge, and share knowledge and experiences in a community.

  11. Open hardware for open science

    CERN Multimedia

    CERN Bulletin

    2011-01-01

    Inspired by the open source software movement, the Open Hardware Repository was created to enable hardware developers to share the results of their R&D activities. The recently published CERN Open Hardware Licence offers the legal framework to support this knowledge and technology exchange.   Two years ago, a group of electronics designers led by Javier Serrano, a CERN engineer, working in experimental physics laboratories created the Open Hardware Repository (OHR). This project was initiated in order to facilitate the exchange of hardware designs across the community in line with the ideals of “open science”. The main objectives include avoiding duplication of effort by sharing results across different teams that might be working on the same need. “For hardware developers, the advantages of open hardware are numerous. For example, it is a great learning tool for technologies some developers would not otherwise master, and it avoids unnecessary work if someone ha...

  12. Secure Interoperable Open Smart Grid Demonstration Project

    Energy Technology Data Exchange (ETDEWEB)

    Magee, Thoman

    2014-12-31

    The Consolidated Edison, Inc., of New York (Con Edison) Secure Interoperable Open Smart Grid Demonstration Project (SGDP), sponsored by the United States (US) Department of Energy (DOE), demonstrated that the reliability, efficiency, and flexibility of the grid can be improved through a combination of enhanced monitoring and control capabilities using systems and resources that interoperate within a secure services framework. The project demonstrated the capability to shift, balance, and reduce load where and when needed in response to system contingencies or emergencies by leveraging controllable field assets. The range of field assets includes curtailable customer loads, distributed generation (DG), battery storage, electric vehicle (EV) charging stations, building management systems (BMS), home area networks (HANs), high-voltage monitoring, and advanced metering infrastructure (AMI). The SGDP enables the seamless integration and control of these field assets through a common, cyber-secure, interoperable control platform, which integrates a number of existing legacy control and data systems, as well as new smart grid (SG) systems and applications. By integrating advanced technologies for monitoring and control, the SGDP helps target and reduce peak load growth, improves the reliability and efficiency of Con Edison’s grid, and increases the ability to accommodate the growing use of distributed resources. Con Edison is dedicated to lowering costs, improving reliability and customer service, and reducing its impact on the environment for its customers. These objectives also align with the policy objectives of New York State as a whole. To help meet these objectives, Con Edison’s long-term vision for the distribution grid relies on the successful integration and control of a growing penetration of distributed resources, including demand response (DR) resources, battery storage units, and DG. For example, Con Edison is expecting significant long-term growth of DG

  13. Secure Interoperable Open Smart Grid Demonstration Project

    Energy Technology Data Exchange (ETDEWEB)

    Magee, Thoman [Consolidated Edison Company Of New York, Inc., NY (United States)

    2014-12-28

    The Consolidated Edison, Inc., of New York (Con Edison) Secure Interoperable Open Smart Grid Demonstration Project (SGDP), sponsored by the United States (US) Department of Energy (DOE), demonstrated that the reliability, efficiency, and flexibility of the grid can be improved through a combination of enhanced monitoring and control capabilities using systems and resources that interoperate within a secure services framework. The project demonstrated the capability to shift, balance, and reduce load where and when needed in response to system contingencies or emergencies by leveraging controllable field assets. The range of field assets includes curtailable customer loads, distributed generation (DG), battery storage, electric vehicle (EV) charging stations, building management systems (BMS), home area networks (HANs), high-voltage monitoring, and advanced metering infrastructure (AMI). The SGDP enables the seamless integration and control of these field assets through a common, cyber-secure, interoperable control platform, which integrates a number of existing legacy control and data systems, as well as new smart grid (SG) systems and applications. By integrating advanced technologies for monitoring and control, the SGDP helps target and reduce peak load growth, improves the reliability and efficiency of Con Edison’s grid, and increases the ability to accommodate the growing use of distributed resources. Con Edison is dedicated to lowering costs, improving reliability and customer service, and reducing its impact on the environment for its customers. These objectives also align with the policy objectives of New York State as a whole. To help meet these objectives, Con Edison’s long-term vision for the distribution grid relies on the successful integration and control of a growing penetration of distributed resources, including demand response (DR) resources, battery storage units, and DG. For example, Con Edison is expecting significant long-term growth of DG

  14. Transdisciplinary electric power grid science

    CERN Document Server

    Brummitt, Charles D; Dobson, Ian; Moore, Cristopher; D'Souza, Raissa M

    2013-01-01

    The 20th-century engineering feat that most improved the quality of human life, the electric power system, now faces discipline-spanning challenges that threaten that distinction. So multilayered and complex that they resemble ecosystems, power grids face risks from their interdependent cyber, physical, social and economic layers. Only with a holistic understanding of the dynamics of electricity infrastructure and human operators, automatic controls, electricity markets, weather, climate and policy can we fortify worldwide access to electricity.

  15. Transdisciplinary electric power grid science

    OpenAIRE

    Brummitt, Charles D.; Hines, Paul D. H.; Dobson, Ian; Moore, Cristopher; D'Souza, Raissa M.

    2013-01-01

    The 20th-century engineering feat that most improved the quality of human life, the electric power system, now faces discipline-spanning challenges that threaten that distinction. So multilayered and complex that they resemble ecosystems, power grids face risks from their interdependent cyber, physical, social and economic layers. Only with a holistic understanding of the dynamics of electricity infrastructure and human operators, automatic controls, electricity markets, weather, climate and ...

  16. Open life science research, open software and the open century

    Institute of Scientific and Technical Information of China (English)

    Youhua Chen

    2015-01-01

    In the age of knowledge explosion and mass scientific information, I highlighted the importance of conducting open science in life and medical research through the extensive use of open software and documents. The proposal to conduct open science aims to address the limited repeatability of research in the life sciences. I outlined the essential steps for conducting open life science and the necessary standards for creating, reusing and reproducing open materials. Different Creative Commons licenses were presented and compared in terms of their scope of use and restrictions. In conclusion, I argued that open materials should be widely adopted in doing life and medical research.

  17. Space-based Science Operations Grid Prototype

    Science.gov (United States)

    Bradford, Robert N.; Welch, Clara L.; Redman, Sandra

    2004-01-01

    Grid technology is an up-and-coming technology that enables widely disparate services to be offered to users in a way that is economical, easy to use and not otherwise widely available. Under the Grid concept, disparate organizations, generally defined as "virtual organizations", can share services, i.e. discipline-specific computer applications required to accomplish specific scientific and engineering organizational goals and objectives. Grids are emerging as the new technology of the future. Grid technology has been enabled by the evolution of increasingly high speed networking. Without the evolution of high speed networking Grid technology would not have emerged. NASA/Marshall Space Flight Center's (MSFC) Flight Projects Directorate, Ground Systems Department is developing a Space-based Science Operations Grid prototype to provide to scientists and engineers the tools necessary to operate space-based science payloads/experiments and for scientists to conduct public and educational outreach. In addition, Grid technology can provide new services not currently available to users. These services include mission voice and video, application sharing, telemetry management and display, payload and experiment commanding, data mining, high order data processing, discipline specific application sharing and data storage, all from a single grid portal. The Prototype will provide most of these services in a first step demonstration of integrated Grid and space-based science operations technologies. It will initially be based on the International Space Station science operational services located at the Payload Operations Integration Center at MSFC, but can be applied to many NASA projects including free flying satellites and future projects. The Prototype will use the Internet2 Abilene Research and Education Network that is currently a 10 Gb backbone network to reach the University of Alabama at Huntsville and several other, as yet unidentified, Space Station based

  18. Grids for Dummies: Featuring Earth Science Data Mining Application

    Science.gov (United States)

    Hinke, Thomas H.

    2002-01-01

    This viewgraph presentation discusses the concept and advantages of linking computers together into data grids, an emerging technology for managing information across institutions, and potential users of data grids. The logistics of access to a grid, including the use of the World Wide Web to access grids, and security concerns are also discussed. The potential usefulness of data grids to the earth science community is also discussed, as well as the Global Grid Forum, and other efforts to establish standards for data grids.

  19. OpenADR Open Source Toolkit: Developing Open Source Software for the Smart Grid

    Energy Technology Data Exchange (ETDEWEB)

    McParland, Charles

    2011-02-01

    Demand response (DR) is becoming an increasingly important part of power grid planning and operation. The advent of the Smart Grid, which mandates its use, further motivates selection and development of suitable software protocols to enable DR functionality. The OpenADR protocol has been developed and is being standardized to serve this goal. We believe that the development of a distributable, open source implementation of OpenADR will benefit this effort and motivate critical evaluation of its capabilities, by the wider community, for providing wide-scale DR services.

  20. Neutron Science TeraGrid Gateway

    Science.gov (United States)

    Lynch, Vickie; Chen, Meili; Cobb, John; Kohl, Jim; Miller, Steve; Speirs, David; Vazhkudai, Sudharshan

    2010-11-01

    The unique contributions of the Neutron Science TeraGrid Gateway (NSTG) are the connection of national user facility instrument data sources to the integrated cyberinfrastructure of the National Science Foundation TeraGrid and the development of a neutron science gateway that allows neutron scientists to use TeraGrid resources to analyze their data, including comparison of experiment with simulation. The NSTG is working in close collaboration with the Spallation Neutron Source (SNS) at Oak Ridge as their principal facility partner. The SNS is a next-generation neutron source. It has completed construction at a cost of $1.4 billion and is ramping up operations. The SNS will provide an order of magnitude greater flux than any previous facility in the world and will be available to all of the nation's scientists, independent of funding source, on a peer-reviewed merit basis. With this new capability, the neutron science community is facing orders of magnitude larger data sets and is at a critical point for data analysis and simulation. There is a recognized need for new ways to manage and analyze data to optimize both beam time and scientific output. The TeraGrid is providing new capabilities in the gateway for simulations using McStas and a fitting service on distributed TeraGrid resources to improve turnaround. NSTG staff are also exploring replicating experimental data in archival storage. As part of the SNS partnership, the NSTG provides access to gateway support, cyberinfrastructure outreach, community development, and user support for the neutron science community. This community includes not only SNS staff and users but extends to all the major worldwide neutron scattering centers.

  1. Neutron Science TeraGrid Gateway

    Energy Technology Data Exchange (ETDEWEB)

    Lynch, Vickie E [ORNL; Chen, Meili [ORNL; Cobb, John W [ORNL; Kohl, James Arthur [ORNL; Miller, Stephen D [ORNL; Speirs, David A [ORNL; Vazhkudai, Sudharshan S [ORNL

    2010-01-01

    The unique contributions of the Neutron Science TeraGrid Gateway (NSTG) are the connection of national user facility instrument data sources to the integrated cyberinfrastructure of the National Science Foundation TeraGrid and the development of a neutron science gateway that allows neutron scientists to use TeraGrid resources to analyze their data, including comparison of experiment with simulation. The NSTG is working in close collaboration with the Spallation Neutron Source (SNS) at Oak Ridge as their principal facility partner. The SNS is a next-generation neutron source. It has completed construction at a cost of $1.4 billion and is ramping up operations. The SNS will provide an order of magnitude greater flux than any previous facility in the world and will be available to all of the nation's scientists, independent of funding source, on a peer-reviewed merit basis. With this new capability, the neutron science community is facing orders of magnitude larger data sets and is at a critical point for data analysis and simulation. There is a recognized need for new ways to manage and analyze data to optimize both beam time and scientific output. The TeraGrid is providing new capabilities in the gateway for simulations using McStas and a fitting service on distributed TeraGrid resources to improve turnaround. NSTG staff are also exploring replicating experimental data in archival storage. As part of the SNS partnership, the NSTG provides access to gateway support, cyberinfrastructure outreach, community development, and user support for the neutron science community. This community includes not only SNS staff and users but extends to all the major worldwide neutron scattering centers.

  2. Neutron Science TeraGrid Gateway

    Energy Technology Data Exchange (ETDEWEB)

    Lynch, Vickie; Chen Meili; Cobb, John; Kohl, Jim; Miller, Steve; Speirs, David; Vazhkudai, Sudharshan, E-mail: lynchve@ornl.gov [Oak Ridge National Laboratory, Oak Ridge, TN (United States)

    2010-11-01

    The unique contributions of the Neutron Science TeraGrid Gateway (NSTG) are the connection of national user facility instrument data sources to the integrated cyberinfrastructure of the National Science Foundation TeraGrid and the development of a neutron science gateway that allows neutron scientists to use TeraGrid resources to analyze their data, including comparison of experiment with simulation. The NSTG is working in close collaboration with the Spallation Neutron Source (SNS) at Oak Ridge as their principal facility partner. The SNS is a next-generation neutron source. It has completed construction at a cost of $1.4 billion and is ramping up operations. The SNS will provide an order of magnitude greater flux than any previous facility in the world and will be available to all of the nation's scientists, independent of funding source, on a peer-reviewed merit basis. With this new capability, the neutron science community is facing orders of magnitude larger data sets and is at a critical point for data analysis and simulation. There is a recognized need for new ways to manage and analyze data to optimize both beam time and scientific output. The TeraGrid is providing new capabilities in the gateway for simulations using McStas and a fitting service on distributed TeraGrid resources to improve turnaround. NSTG staff are also exploring replicating experimental data in archival storage. As part of the SNS partnership, the NSTG provides access to gateway support, cyberinfrastructure outreach, community development, and user support for the neutron science community. This community includes not only SNS staff and users but extends to all the major worldwide neutron scattering centers.

  3. Open life science research, open software and the open century

    National Research Council Canada - National Science Library

    Youhua Chen

    2015-01-01

    In the age of knowledge explosion and mass scientific information, I highlight the importance of conducting open science in life and medical research through the extensive use of open software and documents...

  4. Grid Integration of Electric Vehicles in Open Electricity Markets

    DEFF Research Database (Denmark)

    Presenting the policy drivers, benefits and challenges for grid integration of electric vehicles (EVs) in the open electricity market environment, this book provides a comprehensive overview of existing electricity markets and demonstrates how EVs are integrated into these different markets...

  5. Openness, Web 2.0 Technology, and Open Science

    Science.gov (United States)

    Peters, Michael A.

    2010-01-01

    Open science is a term that is being used in the literature to designate a form of science based on open source models or that utilizes principles of open access, open archiving and open publishing to promote scientific communication. Open science increasingly also refers to open governance and more democratized engagement and control of science…

  6. The Grid is open, so please come in…

    CERN Multimedia

    Caroline Duc

    2012-01-01

    During the week of 17 to 21 September 2012, the European Grid Infrastructure Technical Forum was held in Prague. At this event, organised by EGI (European Grid Infrastructure), grid computing experts set about tackling the challenge of opening their doors to a still wider community. This provided an excellent opportunity to look back at similar initiatives by EGI in the past.   EGI's aim is to coordinate the computing resources of the European Grid Infrastructure and to encourage exchanges between the collaboration and users. Initially dedicated mainly to high-energy particle physics, the European Grid Infrastructure is now welcoming new disciplines and communities. The EGI Technical Forum is organised once a year and is a key date in the community's calendar. The 2012 edition, organised in Prague, was an opportunity to review the advances made and to look constructively into a future where the use of computing grids becomes more widespread. Since 2010, EGI has supported the ...

  7. Grid3: An Application Grid Laboratory for Science

    CERN Document Server

    CERN. Geneva

    2004-01-01

    level services required by the participating experiments. The deployed infrastructure has been operating since November 2003 with 27 sites, a peak of 2800 processors, work loads from 10 different applications exceeding 1300 simultaneous jobs, and data transfers among sites of greater than 2 TB/day. The Grid3 infrastructure was deployed from grid level services provided by groups and applications within the collaboration. The services were organized into four distinct "grid level services" including: Grid3 Packaging, Monitoring and Information systems, User Authentication and the iGOC Grid Operatio...

  8. The International Symposium on Grids and Clouds and the Open Grid Forum

    Science.gov (United States)

    addressed while OGF exposed the state of current developments and issues to be resolved if commonalities are to be exploited. Another first for the 2011 Proceedings is an open-access online publishing scheme that will ensure the Proceedings appear more quickly and reach more people, providing a long-term online archive of the event. The symposium attracted more than 212 participants from 29 countries spanning Asia, Europe and the Americas. Coming so soon after the earthquake and tsunami in Japan, the participation of our Japanese colleagues was particularly appreciated. Keynotes by invited speakers highlighted the impact of distributed computing infrastructures in the social sciences and humanities, high energy physics, and the earth and life sciences. Plenary sessions entitled Grid Activities in Asia Pacific surveyed the state of grid deployment across 11 Asian countries. Through the parallel sessions, the impact of distributed computing infrastructures in a range of research disciplines was highlighted. Operational procedures, middleware and security aspects were addressed in dedicated sessions. The symposium was covered online in real time by the GridCast team from the GridTalk project, with a running blog including summaries of specific sessions, video interviews with keynote speakers and other personalities, and photos. As in all regions of the world, grid and cloud computing has to prove it is adding value to researchers if it is to be accepted by them, and demonstrate its impact on society as a whole if it is to be supported by national governments, funding agencies and the general public. ISGC has helped foster the emergence of a strong regional interest in the earth and life sciences, notably for natural disaster mitigation and bioinformatics studies. Prof. Simon C. Lin organised an intense social programme with a gastronomic tour of Taipei, culminating with a banquet for all the symposium's participants at the hotel Palais de Chine. I would

  9. LHC Databases on the Grid: Achievements and Open Issues

    CERN Document Server

    Vaniachine, A V

    2010-01-01

    To extract physics results from the recorded data, the LHC experiments are using Grid computing infrastructure. The event data processing on the Grid requires scalable access to non-event data (detector conditions, calibrations, etc.) stored in relational databases. The database-resident data are critical for the event data reconstruction processing steps and often required for physics analysis. This paper reviews LHC experience with database technologies for Grid computing. The list of topics includes: database integration with Grid computing models of the LHC experiments; choice of database technologies; examples of database interfaces; distributed database applications (data complexity, update frequency, data volumes and access patterns); and scalability of database access in the Grid computing environment of the LHC experiments. The review describes areas in which substantial progress was made and remaining open issues.
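
    To make the idea of database-resident "non-event" data concrete, the sketch below stores calibration values keyed by an interval of validity (IOV) in SQLite and looks up the record valid for a given run number. The table layout, tag names and values are illustrative assumptions, not the schema of any LHC experiment.

        # Illustrative interval-of-validity (IOV) lookup for conditions data.
        # Schema and values are invented for the example.
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("""CREATE TABLE conditions (
                            tag TEXT, first_run INTEGER, last_run INTEGER,
                            payload REAL)""")
        conn.executemany("INSERT INTO conditions VALUES (?, ?, ?, ?)", [
            ("ecal_gain", 1000, 1999, 1.02),
            ("ecal_gain", 2000, 2999, 1.05),
        ])

        def conditions_for_run(tag, run):
            """Return the payload whose validity interval contains the run number."""
            row = conn.execute(
                "SELECT payload FROM conditions "
                "WHERE tag = ? AND first_run <= ? AND last_run >= ?",
                (tag, run, run)).fetchone()
            return row[0] if row else None

        print(conditions_for_run("ecal_gain", 2500))  # -> 1.05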

  10. European grid services for global earth science

    Science.gov (United States)

    Brewer, S.; Sipos, G.

    2012-04-01

    This presentation will provide an overview of the distributed computing services that the European Grid Infrastructure (EGI) offers to the Earth Sciences community and also explain the processes whereby Earth Science users can engage with the infrastructure. One of the main overarching goals for EGI over the coming year is to diversify its user-base. EGI therefore - through the National Grid Initiatives (NGIs) that provide the bulk of resources that make up the infrastructure - offers a number of routes whereby users, either individually or as communities, can make use of its services. At one level there are two approaches to working with EGI: either users can make use of existing resources and contribute to their evolution and configuration; or alternatively they can work with EGI, and hence the NGIs, to incorporate their own resources into the infrastructure to take advantage of EGI's monitoring, networking and managing services. Adopting this approach does not imply a loss of ownership of the resources. Both of these approaches are entirely applicable to the Earth Sciences community. The former because researchers within this field have been involved with EGI (and previously EGEE) as a Heavy User Community and the latter because they have very specific needs, such as incorporating HPC services into their workflows, and these will require multi-skilled interventions to fully provide such services. In addition to the technical support services that EGI has been offering for the last year or so - the applications database, the training marketplace and the Virtual Organisation services - there now exists a dynamic short-term project framework that can be utilised to establish and operate services for Earth Science users. During this talk we will present a summary of various on-going projects that will be of interest to Earth Science users with the intention that suggestions for future projects will emerge from the subsequent discussions: • The Federated Cloud Task

  11. OPEN DATA FOR DISCOVERY SCIENCE.

    Science.gov (United States)

    Payne, Philip R O; Huang, Kun; Shah, Nigam H; Tenenbaum, Jessica

    2016-01-01

    The modern healthcare and life sciences ecosystem is moving towards an increasingly open and data-centric approach to discovery science. This evolving paradigm is predicated on a complex set of information needs related to our collective ability to share, discover, reuse, integrate, and analyze open biological, clinical, and population level data resources of varying composition, granularity, and syntactic or semantic consistency. Such an evolution is further impacted by a concomitant growth in the size of data sets that can and should be employed for both hypothesis discovery and testing. When such open data can be accessed and employed for discovery purposes, a broad spectrum of high impact end-points is made possible. These span the spectrum from identification of de novo biomarker complexes that can inform precision medicine, to the repositioning or repurposing of extant agents for new and cost-effective therapies, to the assessment of population level influences on disease and wellness. Of note, these types of uses of open data can be either primary, wherein open data is the substantive basis for inquiry, or secondary, wherein open data is used to augment or enrich project-specific or proprietary data that is not open in and of itself. This workshop is concerned with the key challenges, opportunities, and methodological best practices whereby open data can be used to drive the advancement of discovery science in all of the aforementioned capacities.

  12. Earth Science applications on Grid -advantages and limitations

    Science.gov (United States)

    Petitdidier, M.; Schwichtenberg, H.

    2012-04-01

    Civil society at large has addressed many strong requirements to the Earth Science community, related in particular to natural and industrial risks, climate change, new energies… Our total knowledge about the complex Earth system is contained in models and measurements; how we put them together has to be managed cleverly… The technical challenge is to put together databases and computing resources to answer the ES challenges. The main critical point is that, on the one hand, civil society and the public ask for certainties, i.e. precise values with a small error range, concerning predictions at short, medium and long term in all domains; on the other hand, Science can mainly answer only in terms of probability of occurrence. To improve the answers and/or decrease the uncertainties, (1) new observational networks have been deployed in order to have a better geographical coverage, and more accurate measurements have been carried out in key locations and aboard satellites, and (2) new algorithms and methodologies have been developed using new technologies and compute resources. Numerous applications in atmospheric chemistry, meteorology, seismology, hydrology, pollution, climate and biodiversity were deployed successfully on the Grid. In order to fulfill the requirements of risk management, several prototype applications have been deployed using OGC (Open Geospatial Consortium) components with Grid middleware. The Grid has made it possible to decrease uncertainties by estimating the probability of occurrence from a larger number of runs. Some limitations are related to the combination of databases outside the grid infrastructure with grid compute resources, and to real-time applications that need resource reservation in order to ensure results at a given time. As a matter of fact, ES scientists, who use different compute resources according to the phase of their application, are used to working in large projects and sharing their results. They need a service-oriented architecture and a platform of
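
    The point that a larger number of runs decreases uncertainty can be illustrated with a toy Monte Carlo estimate of an exceedance probability: the spread of the estimate shrinks roughly as 1/sqrt(N), which is why grid-scale ensembles tighten the answer. The threshold and distribution below are invented purely for illustration and have no connection to the applications cited in the record.

        # Toy illustration: estimating P(X > threshold) with increasingly large
        # ensembles; the spread of the estimate shrinks roughly as 1/sqrt(N).
        import random
        import statistics

        THRESHOLD = 2.0  # arbitrary threshold for the illustration

        def estimate_probability(n_runs, seed):
            rng = random.Random(seed)
            hits = sum(1 for _ in range(n_runs) if rng.gauss(0, 1) > THRESHOLD)
            return hits / n_runs

        for n in (100, 10_000, 1_000_000):
            estimates = [estimate_probability(n, seed) for seed in range(10)]
            print(f"N={n:>9}: mean={statistics.mean(estimates):.4f} "
                  f"spread={statistics.pstdev(estimates):.4f}")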

  13. AstroGrid-D: Grid Technology for Astronomical Science

    CERN Document Server

    Enke, Harry; Adorf, Hans-Martin; Beck-Ratzka, Alexander; Breitling, Frank; Bruesemeister, Thomas; Carlson, Arthur; Ensslin, Torsten; Hoegqvist, Mikael; Nickelt, Iliya; Radke, Thomas; Reinefeld, Alexander; Reiser, Angelika; Scholl, Tobias; Spurzem, Rainer; Steinacker, Juergen; Voges, Wolfgang; Wambsganss, Joachim; White, Steve

    2010-01-01

    We present status and results of AstroGrid-D, a joint effort of astrophysicists and computer scientists to employ grid technology for scientific applications. AstroGrid-D provides access to a network of distributed machines with a set of commands as well as software interfaces. It allows simple use of computer and storage facilities and makes it possible to schedule and monitor compute tasks and data management. It is based on the Globus Toolkit middleware (GT4). Chapter 1 describes the context which led to the demand for advanced software solutions in Astrophysics, and we state the goals of the project. We then present characteristic astrophysical applications that have been implemented on AstroGrid-D in chapter 2. We describe simulations of different complexity, compute-intensive calculations running on multiple sites, and advanced applications for specific scientific purposes, such as a connection to robotic telescopes. We can show from these examples how grid execution improves e.g. the scientific workflow. Chapter 3 explai...

  14. Towards "open applied" Earth sciences

    Science.gov (United States)

    Ziegler, C. R.; Schildhauer, M.

    2014-12-01

    Concepts of open science -- in the context of cyber/digital technology and culture -- could greatly benefit applied and secondary Earth science efforts. However, international organizations (e.g., environmental agencies, conservation groups and sustainable development organizations) that are focused on applied science have been slow to incorporate open practices across the spectrum of scientific activities, from data to decisions. Myriad benefits include transparency, reproducibility, efficiency (timeliness and cost savings), stakeholder engagement, direct linkages between research and environmental outcomes, reduction in bias and corruption, improved simulation of Earth systems and improved availability of science in general. We map out where and how open science can play a role, providing next steps, with specific emphasis on applied science efforts and processes such as environmental assessment, synthesis and systematic reviews, meta-analyses, decision support and emerging cyber technologies. Disclaimer: The views expressed in this paper are those of the authors and do not necessarily reflect the views or policies of the organizations for which they work and/or represent.

  15. Approaches to Open Data for Science in Spain

    Directory of Open Access Journals (Sweden)

    E Wulff-Barreiro

    2011-10-01

    As observational data has attained new legal status, allowing its integration into open Internet systems, and experimental data continues to be assembled in common and free platforms, state-of-the-art, easy-to-access data repositories have been designed in Spain. These repositories have removed many obstacles to the re-utilization of GIS and other data. European legislation has also made advances in opening biodiversity data, including a European space in the Latin-American grid infrastructure. Open access biomedical repositories attract commercial attention while astronomical, meteorological, and oncological institutions promote data quality and access. This paper describes recent approaches to open access data for science in Spain.

  16. Open Bibliography for Science, Technology, and Medicine

    Directory of Open Access Journals (Sweden)

    Jones Richard

    2011-10-01

    The concept of Open Bibliography in science, technology and medicine (STM) is introduced as a combination of Open Source tools, Open specifications and Open bibliographic data. An Openly searchable and navigable network of bibliographic information and associated knowledge representations, a Bibliographic Knowledge Network, across all branches of Science, Technology and Medicine, has been designed and initiated. For this large scale endeavour, the engagement and cooperation of the multiple stakeholders in STM publishing - authors, librarians, publishers and administrators - is sought.

  17. Open bibliography for science, technology, and medicine.

    Science.gov (United States)

    Jones, Richard; Macgillivray, Mark; Murray-Rust, Peter; Pitman, Jim; Sefton, Peter; O'Steen, Ben; Waites, William

    2011-10-14

    The concept of Open Bibliography in science, technology and medicine (STM) is introduced as a combination of Open Source tools, Open specifications and Open bibliographic data. An Openly searchable and navigable network of bibliographic information and associated knowledge representations, a Bibliographic Knowledge Network, across all branches of Science, Technology and Medicine, has been designed and initiated. For this large scale endeavour, the engagement and cooperation of the multiple stakeholders in STM publishing - authors, librarians, publishers and administrators - is sought.
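
    A concrete way to picture "Open bibliographic data" is a plain, openly licensed record that any tool can parse and redistribute. The minimal JSON record below is a hypothetical example loosely in the spirit of the BibJSON-style conventions associated with this line of work; the field names are illustrative choices, not a specification.

        # A minimal, openly shareable bibliographic record serialized as JSON.
        # Field names follow common BibJSON-style conventions but are illustrative only.
        import json

        record = {
            "type": "article",
            "title": "Open bibliography for science, technology, and medicine",
            "author": [{"name": "Jones, Richard"}, {"name": "MacGillivray, Mark"}],
            "year": "2011",
            "license": [{"type": "CC0"}],   # an explicit open-data license makes reuse unambiguous
        }

        print(json.dumps(record, indent=2))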

  18. Unlocking the potential of smart grid technologies with behavioral science

    Directory of Open Access Journals (Sweden)

    Nicole Sintov

    2015-04-01

    Smart grid systems aim to provide a more stable and adaptable electricity infrastructure, and to maximize energy efficiency. Grid-linked technologies vary widely in form and function, but generally share common potentials: to reduce energy consumption via efficiency and/or curtailment, to shift use to off-peak times of day, and to enable distributed storage and generation options. Although end users are key players in these systems, they tend to be overlooked. Behavioral science is therefore key to engaging end-users and maximizing the impact of smart grid technologies. In this paper, we highlight several ways in which behavioral science can be applied to better understand and engage customers in smart grid systems.

  19. Unlocking the potential of smart grid technologies with behavioral science.

    Science.gov (United States)

    Sintov, Nicole D; Schultz, P Wesley

    2015-01-01

    Smart grid systems aim to provide a more stable and adaptable electricity infrastructure, and to maximize energy efficiency. Grid-linked technologies vary widely in form and function, but generally share common potentials: to reduce energy consumption via efficiency and/or curtailment, to shift use to off-peak times of day, and to enable distributed storage and generation options. Although end users are central players in these systems, they are sometimes not central considerations in technology or program design, and in some cases, their motivations for participating in such systems are not fully appreciated. Behavioral science can be instrumental in engaging end-users and maximizing the impact of smart grid technologies. In this paper, we present emerging technologies made possible by a smart grid infrastructure, and for each we highlight ways in which behavioral science can be applied to enhance their impact on energy savings.

  20. The LIGO Open Science Center

    CERN Document Server

    Vallisneri, Michele; Williams, Roy; Weinstein, Alan; Stephens, Branson

    2014-01-01

    The LIGO Open Science Center (LOSC) fulfills LIGO's commitment to release, archive, and serve LIGO data in a broadly accessible way to the scientific community and to the public, and to provide the information and tools necessary to understand and use the data. In August 2014, the LOSC published the full dataset from Initial LIGO's "S5" run at design sensitivity, the first such large-scale release and a valuable testbed to explore the use of LIGO data by non-LIGO researchers and by the public, and to help teach gravitational-wave data analysis to students across the world. In addition to serving the S5 data, the LOSC web portal (losc.ligo.org) now offers documentation, data-location and data-quality queries, tutorials and example code, and more. We review the mission and plans of the LOSC, focusing on the S5 data release.
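
    As a hedged illustration of working with a LOSC bulk-data release, the snippet below opens one of the HDF5 strain files with h5py. The file name is a placeholder, and the strain/Strain dataset path and Xspacing attribute reflect the layout documented for the S5/S6 releases; treat both as assumptions to check against the tutorials on the losc.ligo.org portal.

        # Sketch: read a LIGO Open Science Center strain file (HDF5 format).
        # File name is a placeholder; dataset/attribute names are assumed from the
        # documented S5/S6 release layout and should be verified against losc.ligo.org.
        import h5py
        import numpy as np

        FILENAME = "H-H1_LOSC_4_V1-815411200-4096.hdf5"  # hypothetical example file

        with h5py.File(FILENAME, "r") as f:
            strain = f["strain/Strain"][...]           # calibrated strain time series
            dt = f["strain/Strain"].attrs["Xspacing"]  # sample spacing in seconds

        print(f"{strain.size} samples at {1.0/dt:.0f} Hz, "
              f"RMS = {np.sqrt(np.mean(strain**2)):.3e}")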

  1. Open Genetic Code: on open source in the life sciences

    NARCIS (Netherlands)

    Deibel, E.

    2014-01-01

    The introduction of open source in the life sciences is increasingly being suggested as an alternative to patenting. This is an alternative, however, that takes its shape at the intersection of the life sciences and informatics. Numerous examples can be identified wherein open source in the life

  2. Grid Integration of Electric Vehicles in Open Electricity Markets

    DEFF Research Database (Denmark)

    Presenting the policy drivers, benefits and challenges for grid integration of electric vehicles (EVs) in the open electricity market environment, this book provides a comprehensive overview of existing electricity markets and demonstrates how EVs are integrated into these different markets...... and power systems. Unlike other texts, this book analyses EV integration in parallel with electricity market design, showing the interaction between EVs and differing electricity markets. Future regulating power market and distribution system operator (DSO) market design is covered, with up-to-date case...... companies with the knowledge they need when facing the challenges introduced by large scale EV deployment, and demonstrates how transmission system operators (TSOs) can develop the existing system service market in order to fully utilize the potential of EV flexibility. With thorough coverage...

  3. EMERGE - ESnet/MREN Regional Science Grid Experimental NGI Testbed

    Energy Technology Data Exchange (ETDEWEB)

    Mambretti, Joe; DeFanti, Tom; Brown, Maxine

    2001-07-31

    This document is the final report on the EMERGE Science Grid testbed research project from the perspective of the International Center for Advanced Internet Research (iCAIR) at Northwestern University, which was a subcontractor to this UIC project. This report is a compilation of information gathered from a variety of materials related to this project produced by multiple EMERGE participants, especially those at Electronic Visualization Lab (EVL) at the University of Illinois at Chicago (UIC), Argonne National Lab and iCAIR. The EMERGE Science Grid project was managed by Tom DeFanti, PI from EVL at UIC.

  4. Open Genetic Code: on open source in the life sciences.

    Science.gov (United States)

    Deibel, Eric

    2014-01-01

    The introduction of open source in the life sciences is increasingly being suggested as an alternative to patenting. This is an alternative, however, that takes its shape at the intersection of the life sciences and informatics. Numerous examples can be identified wherein open source in the life sciences refers to access, sharing and collaboration as informatic practices. This includes open source as an experimental model and as a more sophisticated approach of genetic engineering. The first section discusses the greater flexibility with regard to patenting and its relationship to the introduction of open source in the life sciences. The main argument is that the ownership of knowledge in the life sciences should be reconsidered in the context of the centrality of DNA in informatic formats. This is illustrated by discussing a range of examples of open source models. The second part focuses on open source in synthetic biology as exemplary for the re-materialization of information into food, energy, medicine and so forth. The paper ends by raising the question of whether another kind of alternative might be possible: one that looks at open source as a model for an alternative to the commodification of life that is understood as an attempt to comprehensively remove the restrictions from the usage of DNA in any of its formats.

  5. Grid enablement of OpenGeospatial Web Services: the G-OWS Working Group

    Science.gov (United States)

    Mazzetti, Paolo

    2010-05-01

    integration on existing solutions. More specifically, the Open Geospatial Consortium (OGC) Web Services (OWS) specifications play a fundamental role in geospatial information sharing (e.g. in INSPIRE Implementing Rules, GEOSS architecture, GMES Services, etc.). On the Grid side, the gLite middleware, developed in the European EGEE (Enabling Grids for E-sciencE) Projects, is widely deployed in Europe and beyond, proving its high scalability, and it is one of the middleware stacks chosen for the future European Grid Infrastructure (EGI) initiative. Therefore the convergence between OWS and gLite technologies would be desirable for seamless access to Grid capabilities through OWS-compliant systems. However, to achieve this harmonization there are some obstacles to overcome. Firstly, a semantic mismatch must be addressed: gLite handles low-level (i.e. close to the machine) concepts like "file", "data", "instruments", "job", etc., while geo-information services handle higher-level (closer to the human) concepts like "coverage", "observation", "measurement", "model", etc. Secondly, an architectural mismatch must be addressed: OWS implements a Web Service-Oriented Architecture which is stateless, synchronous and with no embedded security (which is delegated to other specifications), while gLite implements the Grid paradigm in an architecture which is stateful, asynchronous (though not fully event-based) and with strong embedded security (based on the VO paradigm). In recent years many initiatives and projects have worked out possible approaches for implementing Grid-enabled OWSs. Just to mention some: (i) in 2007 the OGC signed a Memorandum of Understanding with the Open Grid Forum, "a community of users, developers, and vendors leading the global standardization effort for grid computing"; (ii) the OGC identified "WPS Profiles - Conflation; and Grid processing" as one of the tasks in the Geo Processing Workflow theme of the OWS Phase 6 (OWS-6); (iii) several national, European and
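
    To ground the discussion of OWS-style interfaces, the snippet below issues a standard OGC key-value-pair GetCapabilities request against a Web Processing Service endpoint. The endpoint URL is a placeholder, and the request follows the generic OGC pattern rather than any G-OWS-specific interface.

        # Sketch: query an OGC Web Processing Service (WPS) for its capabilities
        # document using the standard key-value-pair encoding. Endpoint is a placeholder.
        from urllib.parse import urlencode
        from urllib.request import urlopen

        ENDPOINT = "https://example.org/wps"          # hypothetical WPS endpoint
        params = {"service": "WPS", "request": "GetCapabilities", "version": "1.0.0"}

        url = f"{ENDPOINT}?{urlencode(params)}"
        print("requesting:", url)
        with urlopen(url) as response:                # returns an XML capabilities document
            capabilities_xml = response.read().decode("utf-8")
        print(capabilities_xml[:200])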

  6. BOOK REVIEW: OPENING SCIENCE, THE EVOLVING GUIDE ...

    Science.gov (United States)

    The way we get our funding, collaborate, do our research, and get the word out has evolved over hundreds of years but we can imagine a more open science world, largely facilitated by the internet. The movement towards this more open way of doing and presenting science is coming, and it is not taking hundreds of years. If you are interested in these trends, and would like to find out more about where this is all headed and what it means to you, consider downloading Opening Science, edited by Sönke Bartling and Sascha Friesike, subtitled The Evolving Guide on How the Internet is Changing Research, Collaboration, and Scholarly Publishing. In 26 chapters by various authors from a range of disciplines the book explores the developing world of open science, starting from the first scientific revolution and bringing us to the next scientific revolution, sometimes referred to as “Science 2.0”. Some of the articles deal with the impact of the changing landscape of how science is done, looking at the impact of open science on Academia, or journal publishing, or medical research. Many of the articles look at the uses, pitfalls, and impact of specific tools, like microblogging (think Twitter), social networking, and reference management. There is lots of discussion and definition of terms you might use or misuse like “altmetrics” and “impact factor”. Science will probably never be completely open, and Twitter will probably never replace the journal article,

  8. Grid computing and e-science: a view from inside

    Directory of Open Access Journals (Sweden)

    Stefano Cozzini

    2008-06-01

    My intention is to analyze how, where and if grid computing technology is truly enabling a new way of doing science (so-called ‘e-science’). I will base my views on the experiences accumulated thus far in a number of scientific communities, which we have provided with the opportunity of using grid computing. I shall first define some basic terms and concepts and then discuss a number of specific cases in which the use of grid computing has actually made possible a new method for doing science. I will then present a case in which this did not result in a change in research methods. I will try to identify the reasons for these failures and analyze the future evolution of grid computing. I will conclude by introducing and commenting on the concept of ‘cloud computing’, the approach offered and provided by major industrial actors (Google/IBM and Amazon being among the most important) and what impact this technology might have on the world of research.

  9. Achieving open access to conservation science.

    Science.gov (United States)

    Fuller, Richard A; Lee, Jasmine R; Watson, James E M

    2014-12-01

    Conservation science is a crisis discipline in which the results of scientific enquiry must be made available quickly to those implementing management. We assessed the extent to which scientific research published since the year 2000 in 20 conservation science journals is publicly available. Of the 19,207 papers published, 1,667 (8.68%) are freely downloadable from an official repository. Moreover, only 938 papers (4.88%) meet the standard definition of open access in which material can be freely reused providing attribution to the authors is given. This compares poorly with a comparable set of 20 evolutionary biology journals, where 31.93% of papers are freely downloadable and 7.49% are open access. Seventeen of the 20 conservation journals offer an open access option, but fewer than 5% of the papers are available through open access. The cost of accessing the full body of conservation science runs into tens of thousands of dollars per year for institutional subscribers, and many conservation practitioners cannot access pay-per-view science through their workplace. However, important initiatives such as Research4Life are making science available to organizations in developing countries. We urge authors of conservation science to pay for open access on a per-article basis or to choose publication in open access journals, taking care to ensure the license allows reuse for any purpose providing attribution is given. Currently, it would cost $51 million to make all conservation science published since 2000 freely available by paying the open access fees currently levied to authors. Publishers of conservation journals might consider more cost effective models for open access and conservation-oriented organizations running journals could consider a broader range of options for open access to nonmembers such as sponsorship of open access via membership fees.
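
    The figures quoted in this record can be checked with a few lines of arithmetic; the per-article cost at the end is a rough derived number (the quoted total fee divided by the paper count), not a value given in the abstract.

        # Quick check of the percentages quoted in the abstract, plus a rough
        # implied per-article open-access fee (a derived figure, not from the source).
        total_papers = 19_207
        freely_downloadable = 1_667
        open_access = 938
        total_fee_usd = 51_000_000

        print(f"freely downloadable: {freely_downloadable / total_papers:.2%}")  # ~8.68%
        print(f"open access:         {open_access / total_papers:.2%}")          # ~4.88%
        print(f"implied average fee: ${total_fee_usd / total_papers:,.0f} per paper")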

  10. Open access: changing global science publishing.

    Science.gov (United States)

    Gasparyan, Armen Yuri; Ayvazyan, Lilit; Kitas, George D

    2013-08-01

    The article reflects on open access as a strategy of changing the quality of science communication globally. Successful examples of open-access journals are presented to highlight implications of archiving in open digital repositories for the quality and citability of research output. Advantages and downsides of gold, green, and hybrid models of open access operating in diverse scientific environments are described. It is assumed that open access is a global trend which influences the workflow in scholarly journals, changing their quality, credibility, and indexability.

  11. Doing science in the open

    Science.gov (United States)

    Nielsen, Michael

    2009-05-01

    In your high-school science classes you almost certainly learned Hooke's law, relating a spring's length to how hard you pull on it. What your high-school science teacher probably did not tell you is that when Robert Hooke discovered his law in 1676, he published it as an anagram, "ceiiinossssttuv", which he revealed two years later as the Latin "ut tensio, sic vis", meaning "as the extension, so the force". This ensured that if someone else made the same discovery, then Hooke could reveal the anagram and claim priority, thus buying time in which he alone could build upon the discovery.

  12. ScienceSoft: Open software for open science

    CERN Document Server

    Di Meglio, Alberto

    2012-01-01

    Most of the software developed today by research institutes, universities, research projects, etc. is typically stored in local source and binary repositories and available for the duration of a project lifetime only. Finding software based on given functional characteristics is almost impossible, and binary packages are mostly available from local university or project repositories rather than open source community repositories like Fedora/EPEL or Debian. Furthermore, general information about who develops, contributes to and, most importantly, uses a given software program is very difficult to find out, and yet the widespread availability of such information would give more visibility and credibility to the software products. The creation of links or relationships not only among pieces of software, but equally among the people interacting with the software across and beyond specific projects and communities would foster a more active community and create the conditions for sharing ideas and skills, a ...

  13. Open Science: a first step towards Science Communication

    Science.gov (United States)

    Grigorov, Ivo; Tuddenham, Peter

    2015-04-01

    As Earth Science communicators gear up to adopt the new tools and captivating approaches to engage citizen scientists, budding entrepreneurs, policy makers and the public in general, researchers have the responsibility, and opportunity, to fully adopt Open Science principles and capitalize on their full societal impact and engagement. Open Science is about removing all barriers to basic research, whatever its format, so that it can be freely used, re-used and re-hashed, thus fueling discourse and accelerating the generation of innovative ideas. The concept is central to the EU's Responsible Research and Innovation philosophy, and removing barriers to basic research measurably contributes to engaging citizen scientists in the research process, sets the scene for co-creation of solutions to societal challenges, and raises the general science literacy level of the public. Despite this potential, only 50% of today's basic research is freely available. Open Science can be the first passive step of communicating marine research outside academia. Full and unrestricted access to our knowledge, including data, software code and scientific publications, is not just an ethical obligation; it also gives solid credibility to a more sophisticated communication strategy for engaging society. The presentation will demonstrate how Open Science perfectly complements a coherent communication strategy for placing Marine Research in societal context, and how it underpins an effective integration of Ocean & Earth Literacy principles in standard education, as well as mobilizing citizen marine scientists, thus making marine science Open Science.

  14. Editorial: Science Popularization through Open Access

    Directory of Open Access Journals (Sweden)

    Alireza Noruzi

    2008-03-01

    Science plays a crucial role in modern society, and the popularization of science in its electronic form is closely related to the rise and development of the World Wide Web. Since the 1990s - with the introduction of the Web as part of the Internet - science popularization has become more and more involved in the web-based society. The Web has therefore become an important technical support for science popularization. The Web, on the one hand, has increased the accessibility, visibility and popularity of science and scientific research. On the other hand, the increased accessibility and visibility has also increased the citations and the research or educational impact received by a popular journal or paper. Science popularization has now made an important step forward, because the Web today serves as an effective means of improving the public understanding of science. The Web has made it possible to popularize science via popular search engines. The Web creates a link between specialists and the public; in short, between science and common sense, just by a hyperlink. Science popularization is an attempt to reduce the distance between science specialists and the public. Science popularization is the interpretation of scientific information (science) intended for a general audience, rather than for other experts or students. Science popularization via the Web is a programme that takes science to web users with the objective of making them aware of the efforts, achievements and advances of science. Such programmes could include e-books, e-conferences, e-newspapers, online journalism, online workshops, seminars and meetings, electronic forums, open-access journals, audio-visual material, etc.

  15. Collaborative Web between open and closed science

    OpenAIRE

    2008-01-01

    “Web 2.0” is the mantra enthusiastically repeated in the past few years on anything concerning the production of culture, dialogue and online communication. Even science is changing, along with the processes involving the communication, collaboration and cooperation created through the web, yet rooted in some of its historical features of openness. For this issue, JCOM has asked some experts on the most recent changes in science to analyse the potential and the contradictions lying in online ...

  16. Using Computing and Data Grids for Large-Scale Science and Engineering

    Science.gov (United States)

    Johnston, William E.

    2001-01-01

    We use the term "Grid" to refer to a software system that provides uniform and location independent access to geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. These emerging data and computing Grids promise to provide a highly capable and scalable environment for addressing large-scale science problems. We describe the requirements for science Grids, the resulting services and architecture of NASA's Information Power Grid (IPG) and DOE's Science Grid, and some of the scaling issues that have come up in their implementation.

  17. Open Access: (Social) Sciences as Public Good

    Directory of Open Access Journals (Sweden)

    Katja Mruck

    2004-05-01

    The need to provide open access to articles published in peer-reviewed scholarly journals is becoming apparent to researchers as well as the non-scientific public as a result of the "Budapest Open Access Initiative," the "Berlin Declaration on Open Access to Knowledge in the Sciences and Humanities" and other initiatives. The core question that concerns open access is the following: since scientific information is usually financed by public funding, and is therefore a public good, shouldn't access be free of cost to all interested parties? Currently the open access movement is encountering the movement against the "Digital Divide," and therefore it is not surprising that the demand for open access has extended to a political level, as reflected in the "WSIS Declaration of Principles" and the "WSIS Plan of Action." This article begins by providing a brief summary of the historical background of the open access movement and its major aims (Section 2). It then lists examples that explain possible links between the open access movement and the initiatives against the "Digital Divide" (Section 3). Section 4 considers some important barriers responsible for the fact that open access publishing is still not part of everyday scientific publishing practices. This has various consequences. Selected consequences concerning the recent debate on redistribution processes between "information poor" and "information rich" are summarized in Section 5. URN: urn:nbn:de:0114-fqs0402141

  18. Influence of Grid Resolution in Modeling of Air Pollution from Open Burning

    Directory of Open Access Journals (Sweden)

    Duanpen Sirithian

    2016-07-01

    Influences of different computational grid resolutions on modeled ambient benzene concentrations from open burning were assessed in this study. The CALPUFF (California Puff Mesoscale Dispersion Model) was applied to simulate the maximum ground-level concentration over a modeling domain of 100 × 100 km². Meteorological data for the year 2014 were simulated with the Weather Research and Forecasting (WRF) model. Four different grid resolutions were tested: 0.75 km, 1 km, 2 km and 3 km. Predicted values of the maximum 24-h average concentrations obtained from the finest grid resolution (0.75 km) were set as reference values. In total, 1089 receptors were used as reference locations for comparison of the results from the different computational grid resolutions. Comparative results revealed that the coarser the grid resolution, the higher the over-prediction of the results. Nevertheless, it was found that coarsening the grid resolution from the finest resolution (0.75 km) to coarser resolutions (1 km, 2 km and 3 km) reduced computational time by approximately 66%, 97% and >99%, respectively, as compared with the reference grid resolution. Results revealed that a grid resolution of 1 km is the most appropriate with regard to both accuracy of the predicted data and acceptable computational time for model simulation of the open burning source.
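
    A small calculation helps relate the tested resolutions to model size: over the 100 × 100 km² domain the number of grid cells grows with the inverse square of the spacing, and 1089 happens to equal 33², consistent with a coarse reference array of receptors, although the abstract does not state their exact layout. The sketch below just tabulates these counts; it is illustrative arithmetic, not the CALPUFF or WRF configuration.

        # Illustrative arithmetic: grid-cell counts for a 100 x 100 km domain at the
        # resolutions tested in the study. Not a CALPUFF/WRF configuration.
        import math

        DOMAIN_KM = 100.0
        for spacing_km in (0.75, 1.0, 2.0, 3.0):
            cells_per_side = math.ceil(DOMAIN_KM / spacing_km)
            print(f"{spacing_km:>4} km spacing -> {cells_per_side} x {cells_per_side} "
                  f"= {cells_per_side**2:,} cells")

        print("reference receptors:", 33 * 33)  # 1089, matching the count in the abstract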

  19. Open-circuit fault diagnosis for a grid-connected NPC inverter with unity Power Factor

    DEFF Research Database (Denmark)

    Choi, Uimin; Blaabjerg, Frede; Lee, June-Seok;

    2015-01-01

    This paper presents an open-circuit fault detection method for a grid-connected Neutral-Point Clamped (NPC) inverter. The open-circuit fault mainly occurs due to bonding wire failures like lift-off and cracks in the power module, where thermal stress is a major factor among the stresses that can ...

  20. Smart grids clouds, communications, open source, and automation

    CERN Document Server

    Bakken, David

    2014-01-01

    The utilization of sensors, communications, and computer technologies to create greater efficiency in the generation, transmission, distribution, and consumption of electricity will enable better management of the electric power system. As the use of smart grid technologies grows, utilities will be able to automate meter reading and billing and consumers will be more aware of their energy usage and the associated costs. The results will require utilities and their suppliers to develop new business models, strategies, and processes. With an emphasis on reducing costs and improving return on inve

  1. The Earth System Grid Federation : an Open Infrastructure for Access to Distributed Geospatial Data

    Science.gov (United States)

    Cinquini, Luca; Crichton, Daniel; Mattmann, Chris; Harney, John; Shipman, Galen; Wang, Feiyi; Ananthakrishnan, Rachana; Miller, Neill; Denvil, Sebastian; Morgan, Mark; Pobre, Zed; Bell, Gavin M.; Drach, Bob; Williams, Dean; Kershaw, Philip; Pascoe, Stephen; Gonzalez, Estanislao; Fiore, Sandro; Schweitzer, Roland

    2012-01-01

    The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).
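
    A practical entry point into a federation like this is its search API. The sketch below queries an ESGF-style RESTful search endpoint for dataset records; the node URL, facet names and Solr-style JSON response layout are assumptions made for illustration, so check the federation's own search documentation before relying on them.

        # Sketch: query an ESGF-style search service for dataset records.
        # Endpoint and facet names are assumed examples, not a guaranteed API contract.
        import json
        from urllib.parse import urlencode
        from urllib.request import urlopen

        SEARCH_ENDPOINT = "https://esgf-node.llnl.gov/esg-search/search"  # example index node
        params = {
            "project": "CMIP5",        # hypothetical facet values for illustration
            "variable": "tas",
            "limit": 5,
            "format": "application/solr+json",
        }

        with urlopen(f"{SEARCH_ENDPOINT}?{urlencode(params)}") as response:
            results = json.load(response)

        for doc in results["response"]["docs"]:   # Solr-style result layout (assumed)
            print(doc.get("id"))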

  2. The Earth System Grid Federation: An Open Infrastructure for Access to Distributed Geo-Spatial Data

    Energy Technology Data Exchange (ETDEWEB)

    Cinquini, Luca [Jet Propulsion Laboratory, Pasadena, CA; Crichton, Daniel [Jet Propulsion Laboratory, Pasadena, CA; Miller, Neill [Argonne National Laboratory (ANL); Mattmann, Chris [Jet Propulsion Laboratory, Pasadena, CA; Harney, John F [ORNL; Shipman, Galen M [ORNL; Wang, Feiyi [ORNL; Bell, Gavin [Lawrence Livermore National Laboratory (LLNL); Drach, Bob [Lawrence Livermore National Laboratory (LLNL); Ananthakrishnan, Rachana [Argonne National Laboratory (ANL); Pascoe, Stephen [STFC Rutherford Appleton Laboratory, NCAS/BADC; Kershaw, Philip [STFC Rutherford Appleton Laboratory, NCAS/BADC; Gonzalez, Estanislao [German Climate Computing Center; Fiore, Sandro [Euro-Mediterranean Center on Climate Change; Schweitzer, Roland [Pacific Marine Environmental Laboratory, National Oceanic and Atmospheric Administration; Danvil, Sebastian [Institut Pierre Simon Laplace (IPSL), Des Sciences de L' Environnement; Morgan, Mark [Institut Pierre Simon Laplace (IPSL), Des Sciences de L' Environnement

    2012-01-01

    The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).

  3. The Earth System Grid Federation: An Open Infrastructure for Access to Distributed Geospatial Data

    Energy Technology Data Exchange (ETDEWEB)

    Ananthakrishnan, Rachana [Argonne National Laboratory (ANL); Bell, Gavin [Lawrence Livermore National Laboratory (LLNL); Cinquini, Luca [Jet Propulsion Laboratory, Pasadena, CA; Crichton, Daniel [Jet Propulsion Laboratory, Pasadena, CA; Danvil, Sebastian [Institut Pierre Simon Laplace (IPSL), Des Sciences de L' Environnement; Drach, Bob [Lawrence Livermore National Laboratory (LLNL); Fiore, Sandro [Euro-Mediterranean Center on Climate Change; Gonzalez, Estanislao [German Climate Computing Center; Harney, John F [ORNL; Mattmann, Chris [Jet Propulsion Laboratory, Pasadena, CA; Kershaw, Philip [STFC Rutherford Appleton Laboratory, NCAS/BADC; Miller, Neill [Argonne National Laboratory (ANL); Morgan, Mark [Institut Pierre Simon Laplace (IPSL), Des Sciences de L' Environnement; Pascoe, Stephen [STFC Rutherford Appleton Laboratory, NCAS/BADC; Schweitzer, Roland [Pacific Marine Environmental Laboratory, National Oceanic and Atmospheric Administration; Shipman, Galen M [ORNL; Wang, Feiyi [ORNL

    2013-01-01

    The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).

  4. The Earth System Grid Federation : an Open Infrastructure for Access to Distributed Geospatial Data

    Science.gov (United States)

    Cinquini, Luca; Crichton, Daniel; Mattmann, Chris; Harney, John; Shipman, Galen; Wang, Feiyi; Ananthakrishnan, Rachana; Miller, Neill; Denvil, Sebastian; Morgan, Mark

    2012-01-01

    The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).

  5. Rethinking Open Science: The Role of Communication

    OpenAIRE

    Kulczycki, Emanuel

    2016-01-01

    The first version of this text was presented in the “Philosophy of Communication” section at ECREA’s 5th European Communication Conference, “Communication for Empowerment,” in Lisbon in November 2014. I would like to thank the audience for the lively post-presentation discussion. The aim of this study is to present discourses on Open Science. My reconstruction emphasizes the role of communication in science. I use two models of communication for the analysis: the transmission model and...

  6. An open science cloud for scientific research

    Science.gov (United States)

    Jones, Bob

    2016-04-01

    The Helix Nebula initiative was presented at EGU 2013 (http://meetingorganizer.copernicus.org/EGU2013/EGU2013-1510-2.pdf) and has continued to expand with more research organisations, providers and services. The hybrid cloud model deployed by Helix Nebula has grown to become a viable approach for provisioning ICT services for research communities from both public and commercial service providers (http://dx.doi.org/10.5281/zenodo.16001). The relevance of this approach for all those communities facing societal challenges is explained in a recent EIROforum publication (http://dx.doi.org/10.5281/zenodo.34264). This presentation will describe how this model brings together a range of stakeholders to implement a common platform for data intensive services that builds upon existing publicly funded e-infrastructures and commercial cloud services to promote open science. It explores the essential characteristics of a European Open Science Cloud if it is to address the big data needs of the latest generation of Research Infrastructures. The high-level architecture and key services, as well as the role of standards, are described. A governance and financial model, together with the roles of the stakeholders (including commercial service providers and downstream business sectors), that will ensure a European Open Science Cloud can innovate, grow and be sustained beyond the current project cycles is also described.

  7. Research collaboration and the expanding science grid: Measuring globalization processes worldwide

    CERN Document Server

    Tijssen, Robert J W; van Eck, Nees Jan

    2012-01-01

    This paper applies a new model and analytical tool to measure and study contemporary globalization processes in collaborative science - a world in which scientists, scholars, technicians and engineers interact within a 'grid' of interconnected research sites and collaboration networks. The building blocks of our metrics are the cities where scientific research is conducted, as mentioned in author addresses on research publications. The unit of analysis is the geographical distance between those cities. In our macro-level trend analysis, covering the years 2000-2010, we observe that research collaboration distances have been increasing, while the share of collaborative contacts with foreign cities has leveled off. Collaboration distances and growth rates differ significantly between countries and between fields of science. The application of a distance metric to compare and track these processes opens avenues for further studies, both at the meso-level and at the micro-level, into how research collaboration p...
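
    Since the unit of analysis is the geographical distance between collaborating cities, a worked example clarifies the metric. The haversine great-circle formula below is one standard way to compute such distances; the abstract does not specify the authors' exact implementation, and the city coordinates are rounded illustrative values.

        # Great-circle (haversine) distance between two collaborating cities,
        # as one standard way to realize the distance metric described above.
        import math

        def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
            phi1, phi2 = math.radians(lat1), math.radians(lat2)
            dphi = math.radians(lat2 - lat1)
            dlmb = math.radians(lon2 - lon1)
            a = (math.sin(dphi / 2) ** 2
                 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
            return 2 * radius_km * math.asin(math.sqrt(a))

        # Rounded coordinates, for illustration only.
        leiden = (52.16, 4.49)
        boston = (42.36, -71.06)
        print(f"Leiden-Boston collaboration distance: {haversine_km(*leiden, *boston):,.0f} km")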

  8. Collaborative Science Using Web Services and the SciFlo Grid Dataflow Engine

    Science.gov (United States)

    Wilson, B. D.; Manipon, G.; Xing, Z.; Yunck, T.

    2006-12-01

    The General Earth Science Investigation Suite (GENESIS) project is a NASA-sponsored partnership between the Jet Propulsion Laboratory, academia, and NASA data centers to develop a new suite of Web Services tools to facilitate multi-sensor investigations in Earth System Science. The goal of GENESIS is to enable large-scale, multi-instrument atmospheric science using combined datasets from the AIRS, MODIS, MISR, and GPS sensors. Investigations include cross-comparison of spaceborne climate sensors, cloud spectral analysis, study of upper troposphere-stratosphere water transport, study of the aerosol indirect cloud effect, and global climate model validation. The challenges are to bring together very large datasets, reformat and understand the individual instrument retrievals, co-register or re-grid the retrieved physical parameters, perform computationally-intensive data fusion and data mining operations, and accumulate complex statistics over months to years of data. To meet these challenges, we have developed a Grid computing and dataflow framework, named SciFlo, in which we are deploying a set of versatile and reusable operators for data access, subsetting, registration, mining, fusion, compression, and advanced statistical analysis. SciFlo leverages remote Web Services, called via Simple Object Access Protocol (SOAP) or REST (one-line) URLs, and the Grid Computing standards (WS-* & Globus Alliance toolkits), and enables scientists to do multi-instrument Earth Science by assembling reusable Web Services and native executables into a distributed computing flow (tree of operators). The SciFlo client & server engines optimize the execution of such distributed data flows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. In particular, SciFlo exploits the wealth of datasets accessible by OpenGIS Consortium (OGC) Web Mapping Servers & Web Coverage Servers (WMS/WCS), and by Open Data
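
    The "tree of operators" idea can be pictured as ordinary function composition. The toy dataflow below chains three stand-in operators (fetch, regrid, statistics); it is only a schematic of the concept, not SciFlo's actual operator API or its Web Service bindings.

        # Toy dataflow: compose three stand-in operators the way a dataflow engine
        # would chain remote services. Names and behavior are illustrative only.
        from statistics import mean

        def fetch_granule(granule_id):
            """Stand-in for a data-access operator (e.g. a remote WCS call)."""
            return [float(granule_id + i) for i in range(10)]

        def regrid(values, factor=2):
            """Stand-in for a co-registration/re-gridding operator."""
            return values[::factor]

        def summarize(values):
            """Stand-in for a statistics/fusion operator."""
            return {"n": len(values), "mean": mean(values)}

        def run_flow(granule_id):
            # The 'flow' is just composition: fetch -> regrid -> summarize.
            return summarize(regrid(fetch_granule(granule_id)))

        print(run_flow(100))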

  9. Open Science as a Knowledge Transfer strategy

    Science.gov (United States)

    Grigorov, Ivo; Dalmeier-Thiessen, Suenje

    2015-04-01

    Beyond providing basic understanding of how our Blue Planet functions, flows and breathes, the collection of Earth & Marine Research disciplines is of major service to most of today's Societal Challenges: from Food Security and Sustainable Resource Management, to Renewable Energies, Climate Mitigation & Ecosystem Services and Hazards. Natural Resources are a key commodity in the long-term strategy of the EU Innovation Union(1), and better understanding of the natural processes governing them, as well as science-based management, is seen as a key area for stimulating future economic growth. Such potential places responsibility on research project managers to devise innovative methods to ensure effective transfer of new research to public and private sector users, and society at large. Open Science is about removing all barriers to the full sphere of basic research knowledge and outputs, not just the publishable part of research but also the data, the software code, and failed experiments. The concept is central to the EU's Responsible Research and Innovation philosophy(2), and removing barriers to basic research measurably contributes to the EU's Blue Growth Agenda(3). Despite the potential of the internet age to deliver on that promise, only 50% of today's basic research is freely available(4). The talk will demonstrate how and why Open Science can be a first, passive but effective strategy for any research project to transfer knowledge to society by allowing access to and discoverability of the full sphere of new knowledge, not just the published outputs. Apart from contributing to economic growth, Open Science can also optimize collaboration within academia, assist with better engagement of citizen scientists in the research process and the co-creation of solutions to societal challenges, and provide a solid ground for more sophisticated communication strategies and Ocean/Earth Literacy initiatives targeting policy makers and the public at large. (1)EC Digital Agenda

  10. Data Grid tools: enabling science on big distributed data

    Energy Technology Data Exchange (ETDEWEB)

    Allcock, Bill [Mathematics and Computer Science, Argonne National Laboratory, Argonne, IL 60439 (United States); Chervenak, Ann [Information Sciences Institute, University of Southern California, Marina del Rey, CA 90291 (United States); Foster, Ian [Mathematics and Computer Science, Argonne National Laboratory, Argonne, IL 60439 (United States); Department of Computer Science, University of Chicago, Chicago, IL 60615 (United States); Kesselman, Carl [Information Sciences Institute, University of Southern California, Marina del Rey, CA 90291 (United States); Livny, Miron [Department of Computer Science, University of Wisconsin, Madison, WI 53705 (United States)

    2005-01-01

    A particularly demanding and important challenge that we face as we attempt to construct the distributed computing machinery required to support SciDAC goals is the efficient, high-performance, reliable, secure, and policy-aware management of large-scale data movement. This problem is fundamental to diverse application domains including experimental physics (high energy physics, nuclear physics, light sources), simulation science (climate, computational chemistry, fusion, astrophysics), and large-scale collaboration. In each case, highly distributed user communities require high-speed access to valuable data, whether for visualization or analysis. The quantities of data involved (terabytes to petabytes), the scale of the demand (hundreds or thousands of users, data-intensive analyses, real-time constraints), and the complexity of the infrastructure that must be managed (networks, tertiary storage systems, network caches, computers, visualization systems) make the problem extremely challenging. Data management tools developed under the auspices of the SciDAC Data Grid Middleware project have become the de facto standard for data management in projects worldwide. Day in and day out, these tools provide the 'plumbing' that allows scientists to do more science on an unprecedented scale in production environments.

  11. Persistent Identifiers, Discoverability and Open Science (Communication)

    Science.gov (United States)

    Murphy, Fiona; Lehnert, Kerstin; Hanson, Brooks

    2016-04-01

    Early in 2016, the American Geophysical Union announced it was incorporating ORCIDs into its submission workflows. This was accompanied by a strong statement supporting the use of other persistent identifiers - such as IGSNs, and the CrossRef open registry 'funding data'. This was partly in response to funders' desire to track and manage their outputs. However the more compelling argument, and the reason why the AGU has also signed up to the Center for Open Science's Transparency and Openness Promotion (TOP) Guidelines (http://cos.io/top), is that ultimately science and scientists will be the richer for these initiatives due to increased opportunities for interoperability, reproducibility and accreditation. The AGU has appealed to the wider community to engage with these initiatives, recognising that - unlike the introduction of Digital Object Identifiers (DOIs) for articles by CrossRef - full, enriched use of persistent identifiers throughout the scientific process requires buy-in from a range of scholarly communications stakeholders. At the same time, across the general research landscape, initiatives such as Project CRediT (contributor roles taxonomy), Publons (reviewer acknowledgements) and the forthcoming CrossRef DOI Event Tracker are contributing to our understanding and accreditation of contributions and impact. More specifically for earth science and scientists, the cross-functional Coalition for Publishing Data in the Earth and Space Sciences (COPDESS) was formed in October 2014 and is working to 'provide an organizational framework for Earth and space science publishers and data facilities to jointly implement and promote common policies and procedures for the publication and citation of data across Earth Science journals'. Clearly, the judicious integration of standards, registries and persistent identifiers such as ORCIDs and International Geo Sample Numbers (IGSNs) to the research and research output processes is key to the success of this venture.
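
    The record turns on persistent identifiers such as ORCIDs and IGSNs; one concrete, publicly documented detail is that an ORCID iD ends in an ISO 7064 MOD 11-2 check digit. The sketch below validates that check digit; it is an illustration of identifier structure, not anything prescribed by AGU or COPDESS.

        def orcid_check_digit(base_digits: str) -> str:
            """ISO 7064 MOD 11-2 check digit used by ORCID iDs (publicly documented)."""
            total = 0
            for ch in base_digits:
                total = (total + int(ch)) * 2
            result = (12 - total % 11) % 11
            return "X" if result == 10 else str(result)

        def is_valid_orcid(orcid: str) -> bool:
            digits = orcid.replace("-", "")
            return orcid_check_digit(digits[:-1]) == digits[-1]

        print(is_valid_orcid("0000-0002-1825-0097"))  # True (well-known example iD)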

  12. Open data science technical and cultural aspects

    CERN Document Server

    CERN. Geneva

    2005-01-01

    Research in STM fields routinely generates and requires large amounts of data in electronic form. The growth of scientific research using infrastructures such as the Grid, UK's eScience programme and cyber infrastructure requires the re-use, repurposing and redissemination of this information. Fields like bioinformatics, astronomy, physics, and earth/environmental sciences routinely use such data as primary research input. Much of this is now carried out by machines which harvest data from multiple sources in dynamic and iterative ways, validate, filter, compute and republish it. The current publication process and legal infrastructure are now a serious hindrance to this. Most STM data are never published and the re-usability of those that are is often unclear as authors and publishers give no explicit permission. However almost all authors intend that published data (non-copyrightable “facts”) are for the re-use of and redissemination to the STM community and the world in general. Many publishers agree wit...

  13. Secure Grid Services for Cooperative Work in Medicine and Life Science

    Science.gov (United States)

    Weisbecker, Anette; Falkner, Jürgen

    MediGRID provides a grid infrastructure to solve challenging problems in medical and life sciences by enhancing productivity and by enabling location-independent, interdisciplinary collaboration. The use of grid technology has enabled the development of new applications and services for research in medicine and the life sciences. In order to enlarge the range of services and to reach a broader range of users, sustainable business models are needed. In Services@MediGRID, methods for monitoring, accounting, and billing that fulfil the high security demands of medicine and the life sciences will be developed. The differing requirements of academic and industrial grid customers are also considered in order to establish sustainable business models for grid computing.

  14. OpenZika: An IBM World Community Grid Project to Accelerate Zika Virus Drug Discovery.

    OpenAIRE

    Ekins, Sean; Perryman, Alexander L.; Horta Andrade, Carolina

    2016-01-01

    The Zika virus outbreak in the Americas has caused global concern. To help accelerate this fight against Zika, we launched the OpenZika project. OpenZika is an IBM World Community Grid Project that uses distributed computing on millions of computers and Android devices to run docking experiments, in order to dock tens of millions of drug-like compounds against crystal structures and homology models of Zika proteins (and other related flavivirus targets). This will enable the identification of...

  15. A Real-Time Open Access Platform Towards Proof of Concept for Smart Grid Applications

    DEFF Research Database (Denmark)

    Kemal, Mohammed Seifu; Petersen, Lennart; Iov, Florin

    2017-01-01

    This paper presents the development of a real-time open access platform towards proof of concept for smart grid applications, deployed at the Smart Energy System Laboratory of Aalborg University. Discussed in the paper are the architecture and set-up of the platform, elaborating its three main layers...

  16. Method for detecting an open-switch fault in a grid-connected NPC inverter system

    DEFF Research Database (Denmark)

    Choi, Ui-Min; Jeong, Hae-Gwang; Lee, Kyo-Beum

    2012-01-01

    This paper proposes a fault-detection method for an open-switch fault in the switches of grid-connected neutral-point-clamped inverter systems. The proposed method can not only detect the fault condition but also identify the location of the faulty switch. In the proposed method, which is designed...

  17. Winning Horizon2020 with Open Science

    DEFF Research Database (Denmark)

    Grigorov, Ivo; Elbæk, Mikael Karstensen; Rettberg, Najla

    2016-01-01

    Open Science (OS) offers researchers tools and workflows for transparency, reproducibility, dissemination and transfer of new knowledge. Ultimately, this can also have an impact on research evaluation exercises, e.g. the Research Excellence Framework (REF), set to demand greater “societal impact......” in future, rather than just research output. OS can also be an effective tool for research managers to transfer knowledge to society, and optimize the use and re-use by unforeseen collaborators. For funders, OS offers a better return on investment (ROI) for public funding, and underpins the EU Digital...

  18. An open science peer review oath

    DEFF Research Database (Denmark)

    Aleksic, Jelena; Adrian Alexa, Adrian Alexa; Attwood, Teresa K.

    2015-01-01

    : specifically, we introduce a peer-review oath and accompanying manifesto. These have been designed to offer guidelines to enable reviewers (with the minimum friction or bias) to follow and apply open science principles, and support the ideas of transparency, reproducibility and ultimately greater societal...... impact. Introducing the oath and manifesto at the stage of peer review will help to check that the research being published includes everything that other researchers would need to successfully repeat the work. Peer review is the lynchpin of the publishing system: encouraging the community to consciously...

  19. Developing international open science collaborations: Funder reflections on the Open Science Prize.

    Science.gov (United States)

    Kittrie, Elizabeth; Atienza, Audie A; Kiley, Robert; Carr, David; MacFarlane, Aki; Pai, Vinay; Couch, Jennifer; Bajkowski, Jared; Bonner, Joseph F; Mietchen, Daniel; Bourne, Philip E

    2017-08-01

    The Open Science Prize was established with the following objectives: first, to encourage the crowdsourcing of open data to make breakthroughs that are of biomedical significance; second, to illustrate that funders can indeed work together when scientific interests are aligned; and finally, to encourage international collaboration between investigators with the intent of achieving important innovations that would not be possible otherwise. The process for running the competition and the successes and challenges that arose are presented.

  20. Achieving Open Access to Conservation Science

    Science.gov (United States)

    Fuller, Richard A; Lee, Jasmine R; Watson, James E M

    2014-01-01

    Conservation science is a crisis discipline in which the results of scientific enquiry must be made available quickly to those implementing management. We assessed the extent to which scientific research published since the year 2000 in 20 conservation science journals is publicly available. Of the 19,207 papers published, 1,667 (8.68%) are freely downloadable from an official repository. Moreover, only 938 papers (4.88%) meet the standard definition of open access in which material can be freely reused providing attribution to the authors is given. This compares poorly with a comparable set of 20 evolutionary biology journals, where 31.93% of papers are freely downloadable and 7.49% are open access. Seventeen of the 20 conservation journals offer an open access option, but fewer than 5% of the papers are available through open access. The cost of accessing the full body of conservation science runs into tens of thousands of dollars per year for institutional subscribers, and many conservation practitioners cannot access pay-per-view science through their workplace. However, important initiatives such as Research4Life are making science available to organizations in developing countries. We urge authors of conservation science to pay for open access on a per-article basis or to choose publication in open access journals, taking care to ensure the license allows reuse for any purpose providing attribution is given. Currently, it would cost $51 million to make all conservation science published since 2000 freely available by paying the open access fees currently levied to authors. Publishers of conservation journals might consider more cost-effective models for open access, and conservation-oriented organizations running journals could consider a broader range of options for open access to nonmembers, such as sponsorship of open access via membership fees. Achieving Open Access to Conservation Science. Abstract: Conservation science is a

  1. A simple grid implementation with Berkeley Open Infrastructure for Network Computing using BLAST as a model.

    Science.gov (United States)

    Pinthong, Watthanai; Muangruen, Panya; Suriyaphol, Prapat; Mairiang, Dumrong

    2016-01-01

    The development of high-throughput technologies, such as next-generation sequencing, allows thousands of experiments to be performed simultaneously while reducing resource requirements. Consequently, a massive amount of experimental data is now rapidly generated. Nevertheless, the data are not readily usable or meaningful until they are further analysed and interpreted. Due to the size of the data, a high performance computer (HPC) is required for the analysis and interpretation. However, an HPC is expensive and difficult to access. Other means, such as cloud computing services and grid computing systems, were developed to allow researchers to acquire the power of an HPC without the need to purchase and maintain one. In this study, we implemented grid computing in a computer training center environment using the Berkeley Open Infrastructure for Network Computing (BOINC) as a job distributor and data manager, combining all desktop computers to virtualize an HPC. Fifty desktop computers were used to set up a grid system during the off-hours. In order to test the performance of the grid system, we adapted the Basic Local Alignment Search Tool (BLAST) to the BOINC system. Sequencing results from the Illumina platform were aligned to the human genome database by BLAST on the grid system. The result and processing time were compared to those from a single desktop computer and an HPC. The estimated durations of BLAST analysis for 4 million sequence reads on a desktop PC, the HPC and the grid system were 568, 24 and 5 days, respectively. Thus, the grid implementation of BLAST with BOINC is an efficient alternative to an HPC for sequence alignment. The grid implementation with BOINC also helped tap unused computing resources during off-hours and could easily be adapted to other available bioinformatics software.
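
    The grid approach described here works by cutting the alignment task into independent work units that a distributor such as BOINC can farm out to desktop PCs. The sketch below shows only the chunking step, under the assumption that the query reads are in FASTA format; the chunk size, file names and the example blastn command are illustrative and not taken from the paper.

        from pathlib import Path

        def split_fasta(path, reads_per_chunk=100_000, out_dir="work_units"):
            """Split a FASTA file into fixed-size chunks suitable for distribution
            to worker nodes (e.g. as work units); the chunk size is arbitrary."""
            Path(out_dir).mkdir(exist_ok=True)
            chunk, count, index = [], 0, 0
            with open(path) as fh:
                for line in fh:
                    if line.startswith(">") and count == reads_per_chunk:
                        Path(out_dir, f"chunk_{index:05d}.fasta").write_text("".join(chunk))
                        chunk, count, index = [], 0, index + 1
                    if line.startswith(">"):
                        count += 1
                    chunk.append(line)
            if chunk:
                Path(out_dir, f"chunk_{index:05d}.fasta").write_text("".join(chunk))

        # Each chunk can then be aligned independently on a worker, e.g.:
        #   blastn -query chunk_00000.fasta -db human_genome -out chunk_00000.out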

  2. Cuban Science and the Open Access Alternative

    CERN Document Server

    Arencibia Jorge, Ricardo; Torricella-Morales, Raúl G

    2004-01-01

    Science in Cuba has experienced extraordinary development since the triumph of the Cuban Revolution, in spite of the blockade to which Cuba has been subjected by the United States Government, and thanks to the support and cooperation of the countries that were part of the former Socialist Bloc. However, after the collapse of the Socialist Bloc, the Cuban economy suffered through a restructuring process that included the reorganization of the traditional systems for spreading scientific information. At that moment, it was necessary to use alternative means to effectively publicise, to the international scientific community, the information generated by Cuban scientists and scholars. This paper briefly reviews this new era, the institutions that led the process of change, and the future projections based on knowledge of the digital environment and the creation of electronic and open access information sources.

  3. Trinity Phase 2 Open Science: CTH

    Energy Technology Data Exchange (ETDEWEB)

    Ruggirello, Kevin Patrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vogler, Tracy [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-08-01

    CTH is an Eulerian hydrocode developed by Sandia National Laboratories (SNL) to solve a wide range of shock wave propagation and material deformation problems. Adaptive mesh refinement is also used to improve efficiency for problems with a wide range of spatial scales. The code has a history of running on a variety of computing platforms ranging from desktops to massively parallel distributed-data systems. For the Trinity Phase 2 Open Science campaign, CTH was used to study mesoscale simulations of the hypervelocity penetration of granular SiC powders. The simulations were compared to experimental data. A scaling study of CTH up to 8192 KNL nodes was also performed, and several improvements were made to the code to improve the scalability.
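
    A scaling study such as the one mentioned above is typically summarized as speedup and parallel efficiency against a baseline node count. The sketch below shows that bookkeeping with purely illustrative timings; none of the numbers are CTH results from the campaign.

        def strong_scaling(times_by_nodes):
            """Summarize a strong-scaling study: speedup and efficiency relative to
            the smallest node count. Timings are illustrative placeholders."""
            base_nodes = min(times_by_nodes)
            base_time = times_by_nodes[base_nodes]
            for nodes in sorted(times_by_nodes):
                speedup = base_time / times_by_nodes[nodes]
                efficiency = speedup / (nodes / base_nodes)
                print(f"{nodes:5d} nodes: speedup {speedup:6.1f}, efficiency {efficiency:5.1%}")

        strong_scaling({512: 100.0, 1024: 52.0, 2048: 27.5, 4096: 15.0, 8192: 8.6})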

  4. The Smart Grid using in the Kuzbass open-pit coalmine

    Directory of Open Access Journals (Sweden)

    Semykina Irina

    2017-01-01

    Full Text Available Smart Grid systems are being applied all over the world. They significantly increase the efficiency of power supply networks by keeping their load balanced. The key design features of each Smart Grid system depend on the individual characteristics of the particular power supply network, which are determined by the structure of its electrical energy consumers. For Kuzbass open-pit coalmines it is necessary to take into account the non-stationarity of the power supply scheme, the strict requirements on power supply reliability and the high capacity of consumers. This article focuses on the “Kedrovskiy” open-pit coalmine and describes its power supply scheme and the structure of its consumers. Downtimes and electrical equipment failures are analyzed, and a connection between the number of emergency downtimes and the number of working excavators is identified. The load of the “Kedrovskiy” power distribution network is calculated for the strip mining operation. The results show that the power distribution network under consideration does not function effectively and that implementing a Smart Grid is a good decision. The article presents the structure and design calculations of the proposed Smart Grid. Finally, it is shown that the Smart Grid system decreases the downtime of electrical equipment and increases power supply reliability.

  5. The Low Down on e-Science and Grids for Biology

    Directory of Open Access Journals (Sweden)

    Carole Goble

    2006-04-01

    Full Text Available The Grid is touted as a next generation Internet/Web, designed primarily to support e-Science. I hope to shed some light on what the Grid is, its purpose, and its potential impact on scientific practice in biology. The key message is that biologists are already primarily working in a manner that the Grid is intended to support. However, to ensure that the Grid’s good intentions are appropriate and fulfilled in practice, biologists must become engaged in the process of its development.

  6. The Low Down on e-Science and Grids for Biology

    OpenAIRE

    Carole Goble

    2001-01-01

    The Grid is touted as a next generation Internet/Web, designed primarily to support e-Science. I hope to shed some light on what the Grid is, its purpose, and its potential impact on scientific practice in biology. The key message is that biologists are already primarily working in a manner that the Grid is intended to support. However, to ensure that the Grid’s good intentions are appropriate and fulfilled in practice, biologists must become engaged in the process of its development....

  7. Grid Integration Science, NREL Power Systems Engineering Center

    Energy Technology Data Exchange (ETDEWEB)

    Kroposki, Benjamin [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-04-25

    This report highlights journal articles published in 2016 by researchers in the Power Systems Engineering Center. NREL's Power Systems Engineering Center published 47 journal and magazine articles in the past year, highlighting recent research in grid modernization.

  8. Topological grid structure - A data structure for earth science modeling

    Science.gov (United States)

    Goldberg, M.; Hallada, W. A.; Marcell, R. F.; Lindboe, W.

    1984-01-01

    The automated analysis of land surface features is increasingly important to earth scientists. User-friendly algorithms for studying these features can be integrated into geographic information systems through the use of topological grid structure, which maintains the simplicity and transportability of standard grid structure while providing the essential capability to treat groups of contiguous, identically-classified pixels (corresponding to lakes, forests, fields, etc.) as distinct spatial entities.
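
    The capability described above, treating groups of contiguous, identically-classified pixels as distinct spatial entities, amounts to labelling connected components in a classified grid. The sketch below is a minimal flood-fill labelling of such a grid; the class codes are illustrative and this is not the data structure proposed in the paper.

        from collections import deque

        def label_regions(grid):
            """Group contiguous, identically-classified cells (4-connectivity) into
            distinct regions, e.g. individual lakes, forests or fields."""
            rows, cols = len(grid), len(grid[0])
            labels = [[0] * cols for _ in range(rows)]
            next_label = 0
            for r in range(rows):
                for c in range(cols):
                    if labels[r][c]:
                        continue
                    next_label += 1
                    labels[r][c] = next_label
                    queue = deque([(r, c)])
                    while queue:
                        y, x = queue.popleft()
                        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                            ny, nx = y + dy, x + dx
                            if (0 <= ny < rows and 0 <= nx < cols
                                    and not labels[ny][nx]
                                    and grid[ny][nx] == grid[r][c]):
                                labels[ny][nx] = next_label
                                queue.append((ny, nx))
            return labels

        # Illustrative classified grid: 1 = water, 2 = forest
        print(label_regions([[1, 1, 2],
                             [2, 1, 2],
                             [2, 2, 1]]))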

  9. Development of a Parallel Overset Grid Framework for Moving Body Simulations in OpenFOAM

    Directory of Open Access Journals (Sweden)

    Dominic Chandar

    2015-12-01

    Full Text Available OpenFOAM is an industry-standard Open-Source fluid dynamics code that is used to solve the Navier-Stokes equations for a variety of flow situations. It is currently being used extensively by researchers to study a plethora of physical problems ranging from fundamental fluid dynamics to complex multiphase flows. When it comes to modeling the flow surrounding moving bodies that involve large displacements such as that of ocean risers, sinking of a ship, or the free-flight of an insect, it is cumbersome to utilize a single computational grid and move the body of interest. In this work, we discuss a high-fidelity approach based on overset or overlapping grids which overcomes the necessity of using a single computational grid. The overset library is parallelized using the Message Passing Interface (MPI and Pthreads and is linked dynamically to OpenFOAM. Computational results are presented to demonstrate the potential of this method for simulating problems with large displacements.
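
    The core idea of an overset (overlapping) grid method is that fringe points of the moving body's grid receive their solution values by interpolation from donor cells of the background grid. The sketch below illustrates that donor-search-and-interpolate step in one dimension only; it is a conceptual illustration, not the parallel library described in the paper.

        def interpolate_from_donor(x_fringe, donor_x, donor_values):
            """For each fringe point, locate the donor interval in the background
            grid and linearly interpolate its value. 1-D illustration only; a real
            overset library works in 3-D with hole cutting and parallel searches."""
            results = []
            for x in x_fringe:
                for i in range(len(donor_x) - 1):
                    if donor_x[i] <= x <= donor_x[i + 1]:
                        w = (x - donor_x[i]) / (donor_x[i + 1] - donor_x[i])
                        results.append((1 - w) * donor_values[i] + w * donor_values[i + 1])
                        break
            return results

        # Background grid field values, and fringe points of an overlapping body grid
        print(interpolate_from_donor([0.25, 1.6],
                                     donor_x=[0.0, 1.0, 2.0],
                                     donor_values=[0.0, 1.0, 4.0]))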

  10. Open Automated Demand Response Technologies for Dynamic Pricing and Smart Grid

    Energy Technology Data Exchange (ETDEWEB)

    Ghatikar, Girish; Mathieu, Johanna L.; Piette, Mary Ann; Kiliccote, Sila

    2010-06-02

    We present an Open Automated Demand Response Communications Specifications (OpenADR) data model capable of communicating real-time prices to electricity customers. We also show how the same data model could be used for other types of dynamic pricing tariffs (including peak pricing tariffs, which are common throughout the United States). Customers participating in automated demand response programs with building control systems can respond to dynamic prices by using the actual prices as inputs to their control systems. Alternatively, prices can be mapped into "building operation modes," which can act as inputs to control systems. We present several different strategies customers could use to map prices to operation modes. Our results show that OpenADR can be used to communicate dynamic pricing within the Smart Grid and that OpenADR allows for interoperability with existing and future systems, technologies, and electricity markets.
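
    The mapping of prices to "building operation modes" mentioned above can be as simple as a thresholded lookup. The sketch below is one hypothetical mapping; the thresholds and mode names are invented for illustration and are not part of the OpenADR specification.

        def price_to_operation_mode(price_per_kwh, thresholds=(0.10, 0.20, 0.35)):
            """Map a real-time electricity price to a building operation mode.
            Thresholds and mode names are illustrative placeholders."""
            normal, moderate, high = thresholds
            if price_per_kwh < normal:
                return "NORMAL"
            if price_per_kwh < moderate:
                return "MODERATE_SHED"   # e.g. raise cooling setpoint slightly
            if price_per_kwh < high:
                return "HIGH_SHED"       # e.g. dim lighting, cycle fans
            return "CRITICAL_SHED"       # e.g. pre-defined critical-peak strategy

        for price in (0.08, 0.18, 0.55):
            print(price, "->", price_to_operation_mode(price))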

  11. Necobelac supporting open access, a path to open science

    OpenAIRE

    Agudelo-Calderon, Carlos Alberto

    2012-01-01

    On the occasion of the International Open Access week, a global event now in its 5th year, promoting Open Access (OA) as a new norm in scholarship and research, we, the NECOBELAC (Network of collaboration between Europe and Latin American Caribbean (LAC) countries to spread know-how in scientific writing and provide the best tools to exploit open access information for the safeguard of public health) project partners, wish to stimulate the scientific community to take action and make their wo...

  12. Regulations concerning open access to transmission grid in U.S.. Analysis of order No.888; Beikoku ni okeru open access no hokisei. Order No.888 no kento

    Energy Technology Data Exchange (ETDEWEB)

    Maruyama, M.

    1999-02-01

    Recently, countries aiming at deregulation of the electricity supply industry tend to choose 'open access models' that allow final customers the freedom to select suppliers. For example, in 1996, the U.S. Federal Energy Regulatory Commission (FERC) issued Order No. 888, which requires electric utilities to open their transmission grid to third parties. However, there are a lot of issues to be addressed before we adopt such models in our country. In this paper, we discuss open access to the transmission grid from the viewpoint of utility regulation, taking account of the debates over regulations on grid access, especially concerning Order No. 888 and previous laws and regulations in the United States. The results are: 1. At the end of the 1980s, laws and regulations were established to require electric utilities to open their transmission grid to third parties. However, the propriety of a wheeling order was decided on a case-by-case basis before the enactment of Order No. 888. 2. Under the provisions of Order No. 888 issued in 1996, electric utilities are required to open their transmission grid at any time on request. Nevertheless, that obligation is limited because (1) the electric utilities have preferential access to the grid, and (2) eligible customers can access the grid only if there is available transmission capacity. Hence, one of the critical issues to be addressed is how to calculate the available transmission capacity. (author)

  13. Open Science Project in White Dwarf Research

    CERN Document Server

    Vornanen, Tommi

    2012-01-01

    I will propose a new way of advancing white dwarf research. Open science is a method of doing research that lets everyone who has something to say about the subject take part in the problem solving process. Already now, the amount of information we gather from observations, theory and modelling is too vast for any one individual to comprehend and turn into knowledge. And the amount of information just keeps growing in the future. A platform that promotes sharing of thoughts and ideas allows us to pool our collective knowledge of white dwarfs and get a clear picture of our research field. It will also make it possible for researchers in fields closely related to ours (AGB stars, planetary nebulae etc.) to join the scientific discourse. In the first stage this project would allow us to summarize what we know and what we don't, and what we should search for next. Later, it could grow into a large collaboration that would have the impact to, for example, suggest instrument requirements for future telescopes to sa...

  14. Open Science Project in White Dwarf Research

    Science.gov (United States)

    Vornanen, T.

    2013-01-01

    I will propose a new way of advancing white dwarf research. Open science is a method of doing research that lets everyone who has something to say about the subject take part in the problem solving process. Already now, the amount of information we gather from observations, theory and modeling is too vast for any one individual to comprehend and turn into knowledge. And the amount of information just keeps growing in the future. A platform that promotes sharing of thoughts and ideas allows us to pool our collective knowledge of white dwarfs and get a clear picture of our research field. It will also make it possible for researchers in fields closely related to ours (AGB stars, planetary nebulae etc.) to join the scientific discourse. In the first stage this project would allow us to summarize what we know and what we don't, and what we should search for next. Later, it could grow into a large collaboration that would have the impact to, for example, suggest instrument requirements for future telescopes to satisfy the needs of the white dwarf community, or propose large surveys. A simple implementation would be a wiki page for collecting knowledge combined with a forum for more extensive discussions. These would be simple and cheap to maintain. A large community effort on the whole would be needed for the project to succeed, but individual workload should stay at a low level.

  15. Managing competing elastic Grid and Cloud scientific computing applications using OpenNebula

    Science.gov (United States)

    Bagnasco, S.; Berzano, D.; Lusso, S.; Masera, M.; Vallero, S.

    2015-12-01

    Elastic cloud computing applications, i.e. applications that automatically scale according to computing needs, work on the ideal assumption of infinite resources. While large public cloud infrastructures may be a reasonable approximation of this condition, scientific computing centres like WLCG Grid sites usually work in a saturated regime, in which applications compete for scarce resources through queues, priorities and scheduling policies, and keeping a fraction of the computing cores idle to allow for headroom is usually not an option. In our particular environment one of the applications (a WLCG Tier-2 Grid site) is much larger than all the others and cannot autoscale easily. Nevertheless, other smaller applications can benefit of automatic elasticity; the implementation of this property in our infrastructure, based on the OpenNebula cloud stack, will be described and the very first operational experiences with a small number of strategies for timely allocation and release of resources will be discussed.
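
    An elastic application of the kind described above needs a policy that turns observed demand into a number of virtual machines to launch or release. The sketch below is a deliberately naive policy for illustration; it is not one of the allocation strategies evaluated in the paper and ignores quotas, priorities and VM start-up latency.

        import math

        def scaling_decision(queued_jobs, running_vms, jobs_per_vm=4,
                             min_vms=0, max_vms=20):
            """Return how many worker VMs to add (positive) or release (negative)
            for an elastic application. A simple policy sketch with made-up limits."""
            wanted = math.ceil(queued_jobs / jobs_per_vm)
            target = max(min_vms, min(max_vms, wanted))
            return target - running_vms

        print(scaling_decision(queued_jobs=37, running_vms=5))   # +5 -> launch 5 more VMs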

  16. OpenZika: An IBM World Community Grid Project to Accelerate Zika Virus Drug Discovery

    Science.gov (United States)

    Perryman, Alexander L.; Horta Andrade, Carolina

    2016-01-01

    The Zika virus outbreak in the Americas has caused global concern. To help accelerate this fight against Zika, we launched the OpenZika project. OpenZika is an IBM World Community Grid Project that uses distributed computing on millions of computers and Android devices to run docking experiments, in order to dock tens of millions of drug-like compounds against crystal structures and homology models of Zika proteins (and other related flavivirus targets). This will enable the identification of new candidates that can then be tested in vitro, to advance the discovery and development of new antiviral drugs against the Zika virus. The docking data is being made openly accessible so that all members of the global research community can use it to further advance drug discovery studies against Zika and other related flaviviruses. PMID:27764115

  17. How open science helps researchers succeed.

    Science.gov (United States)

    McKiernan, Erin C; Bourne, Philip E; Brown, C Titus; Buck, Stuart; Kenall, Amye; Lin, Jennifer; McDougall, Damon; Nosek, Brian A; Ram, Karthik; Soderberg, Courtney K; Spies, Jeffrey R; Thaney, Kaitlin; Updegrove, Andrew; Woo, Kara H; Yarkoni, Tal

    2016-07-07

    Open access, open data, open source and other open scholarship practices are growing in popularity and necessity. However, widespread adoption of these practices has not yet been achieved. One reason is that researchers are uncertain about how sharing their work will affect their careers. We review literature demonstrating that open research is associated with increases in citations, media attention, potential collaborators, job opportunities and funding opportunities. These findings are evidence that open research practices bring significant benefits to researchers relative to more traditional closed practices.

  18. Joining Networks in the World of Open Science

    Directory of Open Access Journals (Sweden)

    Riitta Liisa Maijala

    2016-12-01

    Full Text Available Whereas the first digital revolution of science by digitisation changed the scientific practices of data collection, analysis and reporting of results, the second digital revolution, i.e. open science, will also challenge the current roles of researchers, research organisations, libraries and publishers. From the early days of development, research libraries have joined different networks and been among the most active stakeholders working towards open science. Cohesive networks are needed for coordinated actions and support, whereas bridging networks can provide new approaches and novel information. The Finnish Open Science and Research Initiative is presented in this paper as an example of joining networks, motivating individuals and organisations to deliver high-quality services, infrastructures and competence building to promote a transition towards open science. This paper also presents milestones such as the publication of the academic publishing costs of Finnish research organisations and the maturity level of open science operating cultures in HEIs. Based on the experience of the Finnish open science initiative, joining different networks at the national level on an open mode of operation can significantly speed up the transition towards the era of open science.

  19. How Do Scientists Define Openness? Exploring the Relationship Between Open Science Policies and Research Practice.

    Science.gov (United States)

    Levin, Nadine; Leonelli, Sabina; Weckowska, Dagmara; Castle, David; Dupré, John

    2016-06-01

    This article documents how biomedical researchers in the United Kingdom understand and enact the idea of "openness." This is of particular interest to researchers and science policy worldwide in view of the recent adoption of pioneering policies on Open Science and Open Access by the U.K. government-policies whose impact on and implications for research practice are in need of urgent evaluation, so as to decide on their eventual implementation elsewhere. This study is based on 22 in-depth interviews with U.K. researchers in systems biology, synthetic biology, and bioinformatics, which were conducted between September 2013 and February 2014. Through an analysis of the interview transcripts, we identify seven core themes that characterize researchers' understanding of openness in science and nine factors that shape the practice of openness in research. Our findings highlight the implications that Open Science policies can have for research processes and outcomes and provide recommendations for enhancing their content, effectiveness, and implementation.

  20. Challenges facing production grids

    Energy Technology Data Exchange (ETDEWEB)

    Pordes, Ruth; /Fermilab

    2007-06-01

    Today's global communities of users expect quality of service from distributed Grid systems equivalent to that of their local data centers. This must be coupled to ubiquitous access to the ensemble of processing and storage resources across multiple Grid infrastructures. We are still facing significant challenges in meeting these expectations, especially in the underlying security, a sustainable and successful economic model, and smoothing the boundaries between administrative and technical domains. Using the Open Science Grid as an example, I examine the status and challenges of Grids operating in production today.

  1. Stepping up Open Science Training for European Research

    Directory of Open Access Journals (Sweden)

    Birgit Schmidt

    2016-06-01

    Full Text Available Open science refers to all things open in research and scholarly communication: from publications and research data to code, models and methods as well as quality evaluation based on open peer review. However, getting started with implementing open science might not be as straightforward for all stakeholders. For example, what do research funders expect in terms of open access to publications and/or research data? Where and how to publish research data? How to ensure that research results are reproducible? These are all legitimate questions and, in particular, early career researchers may benefit from additional guidance and training. In this paper we review the activities of the European-funded FOSTER project which organized and supported a wide range of targeted trainings for open science, based on face-to-face events and on a growing suite of e-learning courses. This article reviews the approach and experiences gained from the first two years of the project.

  2. Computational investigations and grid refinement study of 3D transient flow in a cylindrical tank using OpenFOAM

    Science.gov (United States)

    Mohd Sakri, F.; Mat Ali, M. S.; Sheikh Salim, S. A. Z.

    2016-10-01

    The physics of a liquid draining inside a tank is readily studied using numerical simulation. However, numerical simulation becomes expensive when the draining involves multi-phase flow. Since an accurate numerical simulation can only be obtained if a proper method of error estimation is applied, this paper provides a systematic assessment of the error due to grid convergence using OpenFOAM. OpenFOAM is an open-source CFD toolbox that is well known among researchers and institutions because it is free and ready to use. In this study, three grid resolutions are used: coarse, medium and fine. The Grid Convergence Index (GCI) is applied to estimate the error due to grid sensitivity. A monotonic convergence condition is obtained, showing that the grid convergence error is progressively reduced. The fine grid has a GCI value below 1%. The value obtained from Richardson extrapolation lies within the range indicated by the GCI.
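
    The Grid Convergence Index procedure referred to above takes solutions from three systematically refined grids, estimates the observed order of accuracy, and reports an error band on the fine-grid solution together with a Richardson-extrapolated value. A minimal sketch of that calculation follows; the solution values are illustrative, not the tank-draining results of the paper.

        from math import log

        def gci_three_grids(f_fine, f_medium, f_coarse, r=2.0, fs=1.25):
            """Grid Convergence Index for three solutions on grids refined by a
            constant ratio r, with safety factor fs (Roache-style procedure)."""
            # Observed order of accuracy from the three solutions
            p = log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / log(r)
            e21 = abs((f_fine - f_medium) / f_fine)       # relative error, fine pair
            gci_fine = fs * e21 / (r ** p - 1)            # error band on fine grid
            f_exact = f_fine + (f_fine - f_medium) / (r ** p - 1)  # Richardson extrapolation
            return p, gci_fine, f_exact

        # Illustrative solution values on fine, medium and coarse grids
        p, gci, f_rich = gci_three_grids(0.9713, 0.9704, 0.9675)
        print(f"order p = {p:.2f}, GCI_fine = {gci:.3%}, extrapolated value = {f_rich:.4f}")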

  3. OpenSesame : An open-source, graphical experiment builder for the social sciences

    NARCIS (Netherlands)

    Mathot, Sebastiaan; Schreij, Daniel; Theeuwes, Jan

    2012-01-01

    In the present article, we introduce OpenSesame, a graphical experiment builder for the social sciences. OpenSesame is free, open-source, and cross-platform. It features a comprehensive and intuitive graphical user interface and supports Python scripting for complex tasks. Additional functionality,

  4. The Historical Origins and Economic Logic of 'Open Science'

    CERN Document Server

    CERN. Geneva

    2008-01-01

    Modern "big science" projects, such as the LHC experiments in physics that are being prepared to run at CERN, embody the distinctive ethos of cooperation and mechanisms of coordination among distributed groups of researchers that are characteristic of 'open science'. Much has been written about the institutions of open science, their supporting social norms, and their effectiveness in generating additions to the stock of reliable knowledge. But from where have these institutions and their supporting ethos come? How robust can we assume them to be in the face of the recent trends for universities and research institutes in some domains of science to seek to appropriate the benefits of new discoveries and inventions by asserting intellectual property claims? A search for the historical origins of the institutions of open science throws some new light on these issues, and the answers may offer some lessons for contemporary science and technology policy-making.

  5. The EPOS Vision for the Open Science Cloud

    Science.gov (United States)

    Jeffery, Keith; Harrison, Matt; Cocco, Massimo

    2016-04-01

    Cloud computing offers dynamic elastic scalability for data processing on demand. For much research activity, demand for computing is uneven over time and so CLOUD computing offers both cost-effectiveness and capacity advantages. However, as reported repeatedly by the EC Cloud Expert Group, there are barriers to the uptake of Cloud Computing: (1) security and privacy; (2) interoperability (avoidance of lock-in); (3) lack of appropriate systems development environments for application programmers to characterise their applications to allow CLOUD middleware to optimize their deployment and execution. From CERN, the Helix-Nebula group has proposed the architecture for the European Open Science Cloud. They are discussing with other e-Infrastructure groups such as EGI (GRIDs), EUDAT (data curation), AARC (network authentication and authorisation) and also with the EIROFORUM group of 'international treaty' RIs (Research Infrastructures) and the ESFRI (European Strategic Forum for Research Infrastructures) RIs including EPOS. Many of these RIs are either e-RIs (electronic-RIs) or have an e-RI interface for access and use. The EPOS architecture is centred on a portal: ICS (Integrated Core Services). The architectural design already allows for access to e-RIs (which may include any or all of data, software, users and resources such as computers or instruments). Those within any one domain (subject area) of EPOS are considered within the TCS (Thematic Core Services). Those outside, or available across multiple domains of EPOS, are ICS-d (Integrated Core Services-Distributed) since the intention is that they will be used by any or all of the TCS via the ICS. Another such service type is CES (Computational Earth Science); effectively an ICS-d specializing in high performance computation, analytics, simulation or visualization offered by a TCS for others to use. Already discussions are underway between EPOS and EGI, EUDAT, AARC and Helix-Nebula for those offerings to be

  6. The Computing and Data Grid Approach: Infrastructure for Distributed Science Applications

    Science.gov (United States)

    Johnston, William E.

    2002-01-01

    With the advent of Grids - infrastructure for using and managing widely distributed computing and data resources in the science environment - there is now an opportunity to provide a standard, large-scale, computing, data, instrument, and collaboration environment for science that spans many different projects and provides the required infrastructure and services in a relatively uniform and supportable way. Grid technology has evolved over the past several years to provide the services and infrastructure needed for building 'virtual' systems and organizations. We argue that Grid technology provides an excellent basis for the creation of the integrated environments that can combine the resources needed to support the large-scale science projects located at multiple laboratories and universities. We present some science case studies that indicate that a paradigm shift in the process of science will come about as a result of Grids providing transparent and secure access to advanced and integrated information and technologies infrastructure: powerful computing systems, large-scale data archives, scientific instruments, and collaboration tools. These changes will be in the form of services that can be integrated with the user's work environment, and that enable uniform and highly capable access to these computers, data, and instruments, regardless of the location or exact nature of these resources. These services will integrate transient-use resources like computing systems, scientific instruments, and data caches (e.g., as they are needed to perform a simulation or analyze data from a single experiment); persistent-use resources, such as databases, data catalogues, and archives; and collaborators, whose involvement will continue for the lifetime of a project or longer. While we largely address large-scale science in this paper, Grids, particularly when combined with Web Services, will address a broad spectrum of science scenarios, both large and small scale.

  7. Evolution of stellar collision products in open clusters. II. A grid of low-mass collisions

    CERN Document Server

    Glebbeek, E

    2008-01-01

    In a companion paper we studied the detailed evolution of stellar collision products that occurred in an $N$-body simulation of the old open cluster M67 and compared our detailed models to simple prescriptions. In this paper we extend this work by studying the evolution of the collision products in open clusters as a function of mass and age of the progenitor stars. We calculated a grid of head-on collisions covering the section of parameter space relevant for collisions in open clusters. We create detailed models of the merger remnants using an entropy-sorting algorithm and follow their subsequent evolution during the initial contraction phase, through the main sequence and up to the giant branch with our detailed stellar evolution code. We compare the location of our models in a colour-magnitude diagram to the observed blue straggler population of the old open clusters M67 and NGC 188 and find that they cover the observed blue straggler region of both clusters. For M67, collisions need to have taken place r...

  8. Developing institutional repositories network: Taking IR Grid at Chinese Academy of Sciences as an example

    Institute of Scientific and Technical Information of China (English)

    Zhongming ZHU; Dongrong ZHANG; Lin LI; Jianxia MA; Xiwen LIU

    2011-01-01

    This paper introduces the current practice of building a network of institutional repositories (IRs) at the Chinese Academy of Sciences (CAS), named the CAS IR Grid. The National Science Library (NSL) of CAS plays a leading role in the construction, promotion and implementation of the CAS IR Grid. It aims to encourage each institute of CAS to build an IR of its own and finally form the IR network of CAS institutes. NSL's experience in coordinating and supporting the institutes' building of their respective IRs, and in promoting IR services through collaborative and progressive development strategies, is presented. Achievements made during the development of the CAS IR Grid are described and challenges for its future development are discussed. The authors aim to provide best practices for developing a network of institutional repositories in research institute settings, which can serve as a practical reference for other institutions engaged in similar tasks.

  9. Research and Deployment a Hospital Open Software Platform for e-Health on the Grid System at VAST/IAMI

    Science.gov (United States)

    van Tuyet, Dao; Tuan, Ngo Anh; van Lang, Tran

    Grid computing has been a topic of increasing interest in recent years. It attracts the attention of many scientists from many fields, and as a result many Grid systems have been built to serve users' demands. Tools for developing Grid systems, such as Globus, gLite and Unicore, are still being developed continuously. In particular, gLite, the Grid middleware, has been developed by the European scientific community in recent years. The constant growth of Grid technology has opened the way for new opportunities in terms of information and data exchange in a secure and collaborative context. These new opportunities can be exploited to offer physicians new telemedicine services in order to improve their collaborative capacities. Our platform gives physicians an easy-to-use telemedicine environment to manage and share patient information (such as electronic medical records and DICOM-formatted images) between remote locations. This paper presents the Grid infrastructure based on gLite; some main components of gLite; the challenge scenario in which new applications can be developed to improve collaborative work between scientists; and the process of deploying the Hospital Open software Platform for E-health (HOPE) on the Grid.

  10. Open Science: Dimensions to a new scientific practice

    Directory of Open Access Journals (Sweden)

    Adriana Carla Silva de Oliveira

    2016-08-01

    Full Text Available Introduction: The practices of e-science and the use and reuse of scientific data constitute a new mode of scientific work that prompts reflection on new regulatory, legal, institutional and technological frameworks for open science. Objective: This study addresses the following research question: which dimensions provide sustainability for the formulation of a policy geared to open science and its practices in the Brazilian context? The aim of this study is to discuss the dimensions that transversely support the formulation of a policy for open science and its scientific practices. Methodology: Theoretically, the study is guided by the fourth scientific paradigm grounded in e-Science. The methodology is supported by Bufrem's studies (2013), which propose an alternative, multidimensional model for the analysis and discussion of scientific research. Technically, literature review and documentary survey were the methods used, applied to the Data Lifecycle scientific model, laws and international agreements. For the purpose of this study, five dimensions were proposed, namely: epistemological, political, ethical-legal-cultural, morphological, and technological. Results: This study understands that these dimensions substantiate an information policy, or the development of minimum guidelines, for the open science agenda in Brazil. Conclusions: The dimensions move away from a reductionist perspective on research data and lead the study towards a multi-dimensional and multi-relational vision of open science.

  11. Open-science projects get kickstarted at CERN

    CERN Multimedia

    Achintya Rao

    2015-01-01

    CERN is one of the host sites for the Mozilla Science Lab Global Sprint to be held on 4 and 5 June, which will see participants around the world work on projects to further open science and educational tools.   IdeaSquare will be hosting the event at CERN. The Mozilla Science Lab Global Sprint was first held in 2014 to bring together open-science practitioners and enthusiasts to collaborate on projects designed to advance science on the open web. The sprint is a loosely federated event, and CERN is participating in the 2015 edition, hosting sprinters in the hacker-friendly IdeaSquare. Five projects have been formally proposed and CERN users and staff are invited to participate in a variety of ways. A special training session will also be held to introduce the CERN community to existing open-science and collaborative tools, including ones that have been deployed at CERN. 1. GitHub Science Badges: Sprinters will work on developing a badge-style visual representation of how open a software pro...

  12. An Open Framework for Low-Latency Communications across the Smart Grid Network

    Science.gov (United States)

    Sturm, John Andrew

    2011-01-01

    The recent White House (2011) policy paper for the Smart Grid that was released on June 13, 2011, "A Policy Framework for the 21st Century Grid: Enabling Our Secure Energy Future," defines four major problems to be solved and the one that is addressed in this dissertation is Securing the Grid. Securing the Grid is referred to as one of…

  14. The Open Access Availability of Library and Information Science Literature

    Science.gov (United States)

    Way, Doug

    2010-01-01

    To examine the open access availability of Library and Information Science (LIS) research, a study was conducted using Google Scholar to search for articles from 20 top LIS journals. The study examined whether Google Scholar was able to find any links to full text, if open access versions of the articles were available and where these articles…

  16. The Impact of Open Textbooks on Secondary Science Learning Outcomes

    Science.gov (United States)

    Robinson, T. Jared; Fischer, Lane; Wiley, David; Hilton, John, III

    2014-01-01

    Given the increasing costs associated with commercial textbooks and decreasing financial support of public schools, it is important to better understand the impacts of open educational resources on student outcomes. The purpose of this quantitative study is to analyze whether the adoption of open science textbooks significantly affects science…

  17. Teaching Particle Physics in the Open University's Science Foundation Course.

    Science.gov (United States)

    Farmelo, Graham

    1992-01-01

    Discusses four topics presented in the science foundation course of the Open University that exemplify current developments in particle physics, in particular, and that describe important issues about the nature of science, in general. Topics include the omega minus particle, the diversity of quarks, the heavy lepton, and the discovery of the W…

  18. Leveraging Open Standards and Technologies to Enhance Community Access to Earth Science Lidar Data

    Science.gov (United States)

    Crosby, C. J.; Nandigam, V.; Krishnan, S.; Cowart, C.; Baru, C.; Arrowsmith, R.

    2011-12-01

    Lidar (Light Detection and Ranging) data, collected from space, airborne and terrestrial platforms, have emerged as an invaluable tool for a variety of Earth science applications ranging from ice sheet monitoring to modeling of earth surface processes. However, lidar data present a unique suite of challenges from the perspective of building cyberinfrastructure systems that enable the scientific community to access these valuable research datasets. Lidar data are typically characterized by millions to billions of individual measurements of x,y,z position plus attributes; these "raw" data are also often accompanied by derived raster products and are frequently terabytes in size. As a relatively new and rapidly evolving data collection technology, relevant open data standards and software projects are immature compared to those for other remote sensing platforms. The NSF-funded OpenTopography Facility project has developed an online lidar data access and processing system that co-locates data with on-demand processing tools to enable users to access both raw point cloud data as well as custom derived products and visualizations. OpenTopography is built on a Service Oriented Architecture (SOA) in which applications and data resources are deployed as standards-compliant (XML and SOAP) Web services with the open source Opal Toolkit. To develop the underlying applications for data access, filtering and conversion, and various processing tasks, OpenTopography has heavily leveraged existing open source software efforts for both lidar and raster data. Operating on the de facto LAS binary point cloud format (maintained by ASPRS), the open source libLAS and LASlib libraries provide OpenTopography's data ingestion, query and translation capabilities. Similarly, raster data manipulation is performed through a suite of services built on the Geospatial Data Abstraction Library (GDAL). OpenTopography has also developed its own algorithm for high-performance gridding of lidar point cloud data
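
    Gridding lidar point clouds, as mentioned above, essentially means binning irregularly spaced (x, y, z) returns into raster cells and aggregating the elevations in each cell. The sketch below shows that local-binning idea at its simplest; it is not the OpenTopography production algorithm.

        from collections import defaultdict

        def grid_lidar_points(points, cell_size, agg=min):
            """Bin lidar returns (x, y, z) into raster cells and aggregate the
            elevations per cell (min, max, mean, ...). Minimal local binning only."""
            cells = defaultdict(list)
            for x, y, z in points:
                cells[(int(x // cell_size), int(y // cell_size))].append(z)
            return {cell: agg(zs) for cell, zs in cells.items()}

        # Illustrative points; real tiles contain millions to billions of returns
        pts = [(0.2, 0.3, 101.5), (0.8, 0.1, 99.8), (1.7, 0.4, 102.3)]
        print(grid_lidar_points(pts, cell_size=1.0))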

  19. Amazon S3 for Science Grids: a Viable Solution?

    OpenAIRE

    Palankar, Mayur; Iamnitchi, Adriana; Ripeanu, Matei; Garfinkel, Simson

    2008-01-01

    International Workshop on Data-Aware Distributed Computing (DADC'08), June 23-27, 2008, Boston, MA. Refereed conference paper. Amazon.com has introduced the Simple Storage Service (S3), a commodity-priced storage utility. S3 aims to provide storage as a low-cost, highly available service, with a simple 'pay-as-you-go' charging model. This article makes three contributions. First, we evaluate S3's ability to provide storage support to large-scale science projects from a cost, a...
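
    To make the 'pay-as-you-go' model concrete, here is a hedged, illustrative Python sketch (not taken from the paper) that estimates a monthly bill for a science archive from assumed storage, egress and request prices, and shows a single upload with boto3, the AWS SDK for Python. The price constants, bucket and file names are placeholders, not current S3 rates.

        import boto3

        def monthly_cost_usd(stored_tb, egress_tb, requests_million,
                             price_storage_per_gb=0.023,    # assumed $/GB-month
                             price_egress_per_gb=0.09,      # assumed $/GB transferred out
                             price_per_million_requests=0.40):
            """Simple pay-as-you-go estimate: storage + egress + request charges."""
            return (stored_tb * 1024 * price_storage_per_gb
                    + egress_tb * 1024 * price_egress_per_gb
                    + requests_million * price_per_million_requests)

        print(f"~${monthly_cost_usd(stored_tb=50, egress_tb=10, requests_million=5):,.2f} per month")

        s3 = boto3.client("s3")
        s3.upload_file("results/run42.h5", "my-science-bucket", "runs/run42.h5")  # placeholder names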

  20. Implementation of Grid Tier 2 and Tier 3 facilities on a Distributed OpenStack Cloud

    Science.gov (United States)

    Limosani, Antonio; Boland, Lucien; Coddington, Paul; Crosby, Sean; Huang, Joanna; Sevior, Martin; Wilson, Ross; Zhang, Shunde

    2014-06-01

    The Australian Government is making a AUD 100 million investment in Compute and Storage for the academic community. The Compute facilities are provided in the form of 30,000 CPU cores located at 8 nodes around Australia in a distributed virtualized Infrastructure as a Service facility based on OpenStack. The storage will eventually consist of over 100 petabytes located at 6 nodes. All will be linked via a 100 Gb/s network. This proceeding describes the development of a fully connected WLCG Tier-2 grid site as well as a general purpose Tier-3 computing cluster based on this architecture. The facility employs an extension to Torque to enable dynamic allocations of virtual machine instances. A base Scientific Linux virtual machine (VM) image is deployed in the OpenStack cloud and automatically configured as required using Puppet. Custom scripts are used to launch multiple VMs, integrate them into the dynamic Torque cluster and to mount remote file systems. We report on our experience in developing this nation-wide ATLAS and Belle II Tier 2 and Tier 3 computing infrastructure using the national Research Cloud and storage facilities.
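
    The proceedings describe custom scripts that launch VMs and join them to a dynamic Torque cluster; those scripts are not shown in the record. As a rough, hedged sketch of the launch step only, the following Python uses the openstacksdk client; the cloud profile, image, flavor and network names are placeholders, and the Puppet configuration and Torque integration described above are omitted.

        import openstack

        conn = openstack.connect(cloud="research-cloud")         # placeholder clouds.yaml entry

        image = conn.compute.find_image("sl6-worker-base")        # placeholder base VM image
        flavor = conn.compute.find_flavor("m1.xlarge")            # placeholder instance size
        network = conn.network.find_network("tier2-net")          # placeholder project network

        workers = []
        for i in range(4):                                         # launch a few batch workers
            server = conn.compute.create_server(
                name=f"torque-worker-{i:02d}",
                image_id=image.id,
                flavor_id=flavor.id,
                networks=[{"uuid": network.id}])
            workers.append(conn.compute.wait_for_server(server))   # block until the VM is ACTIVE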

  1. Montage: a grid portal and software toolkit for science-grade astronomical image mosaicking

    CERN Document Server

    Jacob, Joseph C; Berriman, G Bruce; Good, John; Laity, Anastasia C; Deelman, Ewa; Kesselman, Carl; Singh, Gurmeet; Su, Mei-Hui; Prince, Thomas A; Williams, Roy

    2010-01-01

    Montage is a portable software toolkit for constructing custom, science-grade mosaics by composing multiple astronomical images. The mosaics constructed by Montage preserve the astrometry (position) and photometry (intensity) of the sources in the input images. The mosaic to be constructed is specified by the user in terms of a set of parameters, including dataset and wavelength to be used, location and size on the sky, coordinate system and projection, and spatial sampling rate. Many astronomical datasets are massive, and are stored in distributed archives that are, in most cases, remote with respect to the available computational resources. Montage can be run on both single- and multi-processor computers, including clusters and grids. Standard grid tools are used to run Montage in the case where the data or computers used to construct a mosaic are located remotely on the Internet. This paper describes the architecture, algorithms, and usage of Montage as both a software toolkit and as a grid portal. Timing ...

  2. ITK: Enabling Reproducible Research and Open Science

    Directory of Open Access Journals (Sweden)

    Matthew Michael McCormick

    2014-02-01

    Full Text Available Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, is an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46.
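
    As a minimal, hedged illustration of the kind of scriptable, testable pipeline the toolkit supports (assuming the itk Python package is installed; the file names are placeholders and this is not an example from the article), a single reproducible filtering step might look like:

        import itk

        image = itk.imread("ct_slice.nrrd")                     # placeholder input volume
        smoothed = itk.median_image_filter(image, radius=2)     # simple, deterministic denoising step
        itk.imwrite(smoothed, "ct_slice_median.nrrd")           # output a unit test could compare against a baseline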

  3. Contemporary Scientists Discuss the Need for Openness and Open-Mindedness in Science and Society

    Science.gov (United States)

    Mulhall, Pamela J.; Smith, Dorothy V.; Hart, Christina E.; Gunstone, Richard F.

    2016-09-01

    We report on findings from a qualitative study of Australian scientists whose work brings them into contact with the public. This research sought to understand how a school science curriculum could better represent the work of scientists today. We discuss the views expressed by our participant scientists about the importance of openness and open-mindedness in their work, including their engagement with the public. They described openness as an important characteristic of science. Our participants also see open-mindedness on the part of both scientists and members of the public as important for productive relationships. They see the development of such relationships as an essential facet of their work. The views expressed by these scientists provide a provocative insight into the ways in which contemporary scientists see their work and relationships with their communities. Their perspectives have important implications for approaches to teaching science in schools.

  4. Open Science & Open Data Global Sprint 2016 | 2–3 June 2016

    CERN Multimedia

    Achintya Rao

    2016-01-01

    Join us as we learn to collaboratively build projects transforming science on the web! Thursday 2 June 2016, 8.00 a.m. – Friday 3 June 2016, 8.00 p.m. CERN (3179-R-E06) This two-day sprint event brings together researchers, coders, librarians and the public from around the globe to hack on open science and open data projects in their communities. This year, we have four tracks you can contribute to: tools, citizen science, curriculum and open data. CERN is hosting three projects: Everware, Open Cosmics and CrowdAI. You can also participate in any of the other mozsprint projects for 2016. For more information, please visit: https://indico.cern.ch/event/535760/

  5. Contemporary Scientists Discuss the Need for Openness and Open-Mindedness in Science and Society

    Science.gov (United States)

    Mulhall, Pamela J.; Smith, Dorothy V.; Hart, Christina E.; Gunstone, Richard F.

    2017-10-01

    We report on findings from a qualitative study of Australian scientists whose work brings them into contact with the public. This research sought to understand how a school science curriculum could better represent the work of scientists today. We discuss the views expressed by our participant scientists about the importance of openness and open-mindedness in their work, including their engagement with the public. They described openness as an important characteristic of science. Our participants also see open-mindedness on the part of both scientists and members of the public as important for productive relationships. They see the development of such relationships as an essential facet of their work. The views expressed by these scientists provide a provocative insight into the ways in which contemporary scientists see their work and relationships with their communities. Their perspectives have important implications for approaches to teaching science in schools.

  6. Scientific Data Science and the Case for Open Access

    CERN Document Server

    Sarma, Gopal P

    2016-01-01

    "Open access" has become a central theme of journal reform in academic publishing. In this article, I examine the consequences of an important technological loophole in which publishers can claim to be adhering to the principles of open access by releasing articles in proprietary or "locked" formats that cannot be processed by automated tools, whereby even simple copy and pasting of text is disabled. These restrictions will prevent the development of an important infrastructural element of a modern research enterprise, namely, scientific data science, or the use of data analytic techniques to conduct meta-analyses and investigations into the scientific corpus. I give a brief history of the open access movement, discuss novel journalistic practices, and an overview of data-driven investigation of the scientific corpus. I argue that particularly in an era where the veracity of many research studies has been called into question, scientific data science should be one of the key motivations for open access publis...

  7. Open Science in Practice: Researcher Perspectives and Participation

    Directory of Open Access Journals (Sweden)

    Angus Whyte

    2011-03-01

    Full Text Available We report on an exploratory study consisting of brief case studies in selected disciplines, examining what motivates researchers to work (or want to work) in an open manner with regard to their data, results and protocols, and whether advantages are delivered by working in this way. We review the policy background to open science, and literature on the benefits attributed to open data, considering how these relate to curation and to questions of who participates in science. The case studies investigate the perceived benefits to researchers, research institutions and funding bodies of utilising open scientific methods, the disincentives and barriers, and the degree to which there is evidence to support these perceptions. Six case study groups were selected in astronomy, bioinformatics, chemistry, epidemiology, language technology and neuroimaging. The studies identify relevant examples and issues through qualitative analysis of interview transcripts. We provide a typology of degrees of open working across the research lifecycle, and conclude that better support for open working, through guidelines to assist research groups in identifying the value and costs of working more openly, and further research to assess the risks, incentives and shifts in responsibility entailed by opening up the research process are needed.

  8. Integrating Free and Open Source Solutions into Geospatial Science Education

    Directory of Open Access Journals (Sweden)

    Vaclav Petras

    2015-06-01

    Full Text Available While free and open source software becomes increasingly important in geospatial research and industry, open science perspectives are generally less reflected in universities’ educational programs. We present an example of how free and open source software can be incorporated into geospatial education to promote open and reproducible science. Since 2008 graduate students at North Carolina State University have the opportunity to take a course on geospatial modeling and analysis that is taught with both proprietary and free and open source software. In this course, students perform geospatial tasks simultaneously in the proprietary package ArcGIS and the free and open source package GRASS GIS. By ensuring that students learn to distinguish between geospatial concepts and software specifics, students become more flexible and stronger spatial thinkers when choosing solutions for their independent work in the future. We also discuss ways to continually update and improve our publicly available teaching materials for reuse by teachers, self-learners and other members of the GIS community. Only when free and open source software is fully integrated into geospatial education will we be able to encourage a culture of openness and, thus, enable greater reproducibility in research and development applications.

  9. Architectural Aspects of Grid Computing and its Global Prospects for E-Science Community

    Science.gov (United States)

    Ahmad, Mushtaq

    2008-05-01

    The paper reviews the imminent architectural aspects of Grid Computing for the e-Science community, for scientific research and business/commercial collaboration beyond physical boundaries. Grid Computing provides all the needed facilities: hardware, software, communication interfaces, high-speed internet, safe authentication and a secure environment for collaboration on research projects around the globe. It provides a very fast compute engine for those scientific and engineering research projects and business/commercial applications which are heavily compute intensive and/or require humongous amounts of data. It also makes possible the use of very advanced methodologies, simulation models, expert systems and the treasure of knowledge available around the globe under the umbrella of knowledge sharing. Thus it helps realize the dream of a global village for the benefit of the e-Science community across the globe.

  10. Mapping the hinterland: Data issues in open science.

    Science.gov (United States)

    Grand, Ann; Wilkinson, Clare; Bultitude, Karen; Winfield, Alan F T

    2016-01-01

    Open science is a practice in which the scientific process is shared completely and in real time. It offers the potential to support information flow, collaboration and dialogue among professional and non-professional participants. Using semi-structured interviews and case studies, this research investigated the relationship between open science and public engagement. This article concentrates on three particular areas of concern that emerged: first, how to effectively contextualise and narrate information to render it accessible, as opposed to simply available; second, concerns about data quantity and quality; and third, concerns about the skills required for effective contextualisation, mapping and interpretation of information.

  11. An Open and Holistic Approach for Geo and Space Sciences

    Science.gov (United States)

    Ritschel, Bernd; Seelus, Christoph; Neher, Günther; Toshihiko, Iyemori; Yatagai, Akiyo; Koyama, Yukinobu; Murayama, Yasuhiro; King, Todd; Hughes, Steve; Fung, Shing; Galkin, Ivan; Hapgood, Mike; Belehaki, Anna

    2016-04-01

    Geo and space sciences have thus far been very successful, even though an open, cross-domain and holistic approach often did not play an essential role. But this situation is changing rapidly. The research focus is shifting to more complex, non-linear, multi-domain phenomena, such as climate change or the space environment. This kind of phenomena can only be understood step by step using the holistic idea. So, what is necessary for a successful cross-domain and holistic approach in geo and space sciences? Research and science in general become more and more dependent on a rich fundus of multi-domain data sources, related context information and the use of highly advanced technologies in data processing. Buzzword phrases such as Big Data and Deep Learning reflect this development. Big Data also addresses the real exponential growth of data and information produced by measurements or simulations. Deep Learning technology may help to detect new patterns and relationships in data describing highly sophisticated natural phenomena. And further on, we should not forget that science and the humanities are only two sides of the same medal in the continuing human process of knowledge discovery. The concept of Open Data, or in particular the open access to scientific data, addresses the free and open availability of (at least publicly funded and generated) data. The open availability of data covers the free use, reuse and redistribution of data, principles which were already established with the formation of the World Data Centers more than 50 years ago. So, we should not forget that the foundation for open data is a sustainable management of data, a responsibility that runs from the individual scientist up to the big science institutions and organizations. Other challenges are discovering and collecting the appropriate data, and preferably all of them or at least the majority of the right data. Therefore a network of individual or, even better, institutional catalog-based and at least

  12. 78 FR 45992 - National Science and Technology Council; Notice of Meeting: Open Meeting of the National Science...

    Science.gov (United States)

    2013-07-30

    ... TECHNOLOGY POLICY. National Science and Technology Council; Notice of Meeting: Open Meeting of the Nanoscale Science, Engineering, and Technology (NSET) Subcommittee of the Committee on Technology, National Science and Technology Council...

  13. GENESIS SciFlo: Enabling Multi-Instrument Atmospheric Science Using Grid Workflows

    Science.gov (United States)

    Wilson, B. D.; Tang, B.; Manipon, G.; Yunck, T.; Fetzer, E.; Braverman, A.; Dobinson, E.

    2004-12-01

    The General Earth Science Investigation Suite (GENESIS) project is a NASA-sponsored partnership between the Jet Propulsion Laboratory, academia, and NASA data centers to develop a new suite of web services tools to facilitate multi-sensor investigations in Earth System Science. The goal of GENESIS is to enable large-scale, multi-instrument atmospheric science using combined datasets from the AIRS, MODIS, MISR, and GPS sensors. Investigations will include cross-comparison of spaceborne climate sensors, cloud spectral analysis, study of upper troposphere-stratosphere water transport, study of the aerosol indirect cloud effect, and global climate model validation. The challenges are to bring together very large datasets, reformat and understand the individual instrument retrievals, co-register or re-grid the retrieved physical parameters, perform computationally-intensive data fusion and data mining operations, and accumulate complex statistics over months to years of data. To meet these challenges, we are developing a Grid computing and dataflow framework, named SciFlo, in which we are deploying a set of versatile and reusable operators for data access, subsetting, registration, mining, fusion, compression, and advanced statistical analysis. SciFlo is a system for Scientific Knowledge Creation on the Grid using a Semantically-Enabled Dataflow Execution Environment. SciFlo leverages Simple Object Access Protocol (SOAP) Web Services and the Grid Computing standards (Globus Alliance toolkits), and enables scientists to do multi-instrument Earth Science by assembling reusable web services and executable operators into a distributed computing flow (operator tree). The SciFlo client & server engines optimize the execution of such distributed data flows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. The scientist injects a distributed computation into the Grid by simply filling out
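
    The abstract describes SOAP-based operators assembled into an operator tree, but the actual SciFlo service interfaces are not given in the record. Purely as a hypothetical sketch of calling two such operators from Python with the zeep SOAP client, where the WSDL URL, operation names and parameters below are invented for illustration:

        from zeep import Client

        client = Client("http://example.org/sciflo/services?wsdl")    # hypothetical WSDL location
        granules = client.service.FindGranules(                       # hypothetical data-access operator
            dataset="AIRS_L2", start="2004-01-01", end="2004-01-07",
            bbox="-125,30,-110,45")
        subset = client.service.SubsetVariables(                      # hypothetical subsetting operator
            granules, "Temperature,H2O")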

  14. Open Data Infrastructures And The Future Of Science

    Science.gov (United States)

    Boulton, G. S.

    2016-12-01

    Open publication of the evidence (the data) supporting a scientific claim has been the bedrock on which the scientific advances of the modern era of science have been built. It is also of immense importance in confronting three challenges unleashed by the digital revolution. The first is the threat the digital data storm poses to the principle of "scientific self-correction", in which false concepts are weeded out because of a demonstrable failure in logic or in the replication of observations or experiments. Large and complex data volumes are difficult to make openly available in ways that make rigorous scrutiny possible. Secondly, linking and integrating data from different sources about the same phenomena have created profound new opportunities for understanding the Earth. If data are neither accessible nor useable, such opportunities cannot be seized. Thirdly, open access publication, open data and ubiquitous modern communications enhance the prospects for an era of "Open Science" in which science emerges from behind its laboratory doors to engage in co-production of knowledge with other stakeholders in addressing major contemporary challenges to human society, in particular the need for long term thinking about planetary sustainability. If the benefits of an open data regime are to be realised, only a small part of the challenge lies in providing "hard" infrastructure. The major challenges lie in the "soft" infrastructure of relationships between the components of national science systems, of analytic and software tools, of national and international standards and the normative principles adopted by scientists themselves. The principles that underlie these relationships, the responsibilities of key actors and the rules of the game needed to maximise national performance and facilitate international collaboration are set out in an International Accord on Open Data.

  15. Open Data and Open Science for better Research in the Geo and Space Domain

    Science.gov (United States)

    Ritschel, B.; Seelus, C.; Neher, G.; Iyemori, T.; Koyama, Y.; Yatagai, A. I.; Murayama, Y.; King, T. A.; Hughes, S.; Fung, S. F.; Galkin, I. A.; Hapgood, M. A.; Belehaki, A.

    2015-12-01

    Main open data principles had been worked out in the run-up and finally adopted in the Open Data Charter at the G8 summit in Lough Erne, Northern Ireland in June 2013. Important principles are also valid for science data, such as Open Data by Default, Quality and Quantity, Useable by All, Releasing Data for Improved Governance, Releasing Data for Innovation. There is also an explicit relationship to such areas of high value as earth observation, education and geospatial data. The European Union implementation plan of the Open Data Charter identifies among other things objectives such as making data available in an open format, enabling semantic interoperability, ensuring quality, documentation and where appropriate reconciliation across different data sources, implementing software solutions allowing easy management, publication or visualization of datasets and simplifying clearance of intellectual property rights. Open Science is not just a list of principles that have already been known for a long time but stands for a lot of initiatives and projects around a better handling of scientific data and openly shared scientific knowledge. It is also about transparency in methodology and collection of data, availability and reuse of scientific data, public accessibility to scientific communication and the use of social media to facilitate scientific collaboration. Some projects concentrate on open sharing of free and open source software and even hardware in the form of processing capabilities. In addition, questions about the mashup of data and publications and an open peer review process are addressed. Following the principles of open data and open science, the newest results of the collaboration efforts in mashing up the data servers related to the Japanese IUGONET, the European Union ESPAS and the GFZ ISDC semantic Web projects will be presented here. The semantic Web based approach for the mashup focuses on the design and implementation of a common but still distributed data

  16. An Open Science Approach to Gis-Based Paleoenvironment Data

    Science.gov (United States)

    Willmes, C.; Becker, D.; Verheul, J.; Yener, Y.; Zickel, M.; Bolten, A.; Bubenzer, O.; Bareth, G.

    2016-06-01

    Paleoenvironmental studies and according information (data) are abundantly published and available in the scientific record. However, GIS-based paleoenvironmental information and datasets are comparably rare. Here, we present an Open Science approach for creating GIS-based data and maps of paleoenvironments, and Open Access publishing them in a web based Spatial Data Infrastructure (SDI), for access by the archaeology and paleoenvironment communities. We introduce an approach to gather and create GIS datasets from published non-GIS based facts and information (data), such as analogous maps, textual information or figures in scientific publications. These collected and created geo-datasets and maps are then published, including a Digital Object Identifier (DOI) to facilitate scholarly reuse and citation of the data, in a web based Open Access Research Data Management Infrastructure. The geo-datasets are additionally published in an Open Geospatial Consortium (OGC) standards compliant SDI, and available for GIS integration via OGC Open Web Services (OWS).

  17. iElectrodes: A Comprehensive Open-Source Toolbox for Depth and Subdural Grid Electrode Localization

    Science.gov (United States)

    Blenkmann, Alejandro O.; Phillips, Holly N.; Princich, Juan P.; Rowe, James B.; Bekinschtein, Tristan A.; Muravchik, Carlos H.; Kochen, Silvia

    2017-01-01

    The localization of intracranial electrodes is a fundamental step in the analysis of invasive electroencephalography (EEG) recordings in research and clinical practice. The conclusions reached from the analysis of these recordings rely on the accuracy of electrode localization in relationship to brain anatomy. However, currently available techniques for localizing electrodes from magnetic resonance (MR) and/or computerized tomography (CT) images are time consuming and/or limited to particular electrode types or shapes. Here we present iElectrodes, an open-source toolbox that provides robust and accurate semi-automatic localization of both subdural grids and depth electrodes. Using pre- and post-implantation images, the method takes 2–3 min to localize the coordinates in each electrode array and automatically number the electrodes. The proposed pre-processing pipeline allows one to work in a normalized space and to automatically obtain anatomical labels of the localized electrodes without neuroimaging experts. We validated the method with data from 22 patients implanted with a total of 1,242 electrodes. We show that localization distances were within 0.56 mm of those achieved by experienced manual evaluators. iElectrodes provided additional advantages in terms of robustness (even with severe perioperative cerebral distortions), speed (less than half the operator time compared to expert manual localization), simplicity, utility across multiple electrode types (surface and depth electrodes) and all brain regions. PMID:28303098

  18. Tunable Reaction Potentials in Open Framework Nanoparticle Battery Electrodes for Grid-Scale Energy Storage

    KAUST Repository

    Wessells, Colin D.

    2012-02-28

    The electrical energy grid has a growing need for energy storage to address short-term transients, frequency regulation, and load leveling. Though electrochemical energy storage devices such as batteries offer an attractive solution, current commercial battery technology cannot provide adequate power, cycle life, and energy efficiency at a sufficiently low cost. Copper hexacyanoferrate and nickel hexacyanoferrate, two open framework materials with the Prussian Blue structure, were recently shown to offer ultralong cycle life and high-rate performance when operated as battery electrodes in safe, inexpensive aqueous sodium ion and potassium ion electrolytes. In this report, we demonstrate that the reaction potential of copper-nickel alloy hexacyanoferrate nanoparticles may be tuned by controlling the ratio of copper to nickel in these materials. X-ray diffraction, TEM energy dispersive X-ray spectroscopy, and galvanostatic electrochemical cycling of copper-nickel hexacyanoferrate reveal that copper and nickel form a fully miscible solution at particular sites in the framework without perturbing the structure. This allows copper-nickel hexacyanoferrate to reversibly intercalate sodium and potassium ions for over 2000 cycles with capacity retentions of 100% and 91%, respectively. The ability to precisely tune the reaction potential of copper-nickel hexacyanoferrate without sacrificing cycle life will allow the development of full cells that utilize the entire electrochemical stability window of aqueous sodium and potassium ion electrolytes. © 2012 American Chemical Society.

  19. Tunable reaction potentials in open framework nanoparticle battery electrodes for grid-scale energy storage.

    Science.gov (United States)

    Wessells, Colin D; McDowell, Matthew T; Peddada, Sandeep V; Pasta, Mauro; Huggins, Robert A; Cui, Yi

    2012-02-28

    The electrical energy grid has a growing need for energy storage to address short-term transients, frequency regulation, and load leveling. Though electrochemical energy storage devices such as batteries offer an attractive solution, current commercial battery technology cannot provide adequate power, cycle life, and energy efficiency at a sufficiently low cost. Copper hexacyanoferrate and nickel hexacyanoferrate, two open framework materials with the Prussian Blue structure, were recently shown to offer ultralong cycle life and high-rate performance when operated as battery electrodes in safe, inexpensive aqueous sodium ion and potassium ion electrolytes. In this report, we demonstrate that the reaction potential of copper-nickel alloy hexacyanoferrate nanoparticles may be tuned by controlling the ratio of copper to nickel in these materials. X-ray diffraction, TEM energy dispersive X-ray spectroscopy, and galvanostatic electrochemical cycling of copper-nickel hexacyanoferrate reveal that copper and nickel form a fully miscible solution at particular sites in the framework without perturbing the structure. This allows copper-nickel hexacyanoferrate to reversibly intercalate sodium and potassium ions for over 2000 cycles with capacity retentions of 100% and 91%, respectively. The ability to precisely tune the reaction potential of copper-nickel hexacyanoferrate without sacrificing cycle life will allow the development of full cells that utilize the entire electrochemical stability window of aqueous sodium and potassium ion electrolytes.

  20. Open science versus commercialization: a modern research conflict?

    Science.gov (United States)

    Caulfield, Timothy; Harmon, Shawn He; Joly, Yann

    2012-02-27

    Efforts to improve research outcomes have resulted in genomic researchers being confronted with complex and seemingly contradictory instructions about how to perform their tasks. Over the past decade, there has been increasing pressure on university researchers to commercialize their work. Concurrently, they are encouraged to collaborate, share data and disseminate new knowledge quickly (that is, to adopt an open science model) in order to foster scientific progress, meet humanitarian goals, and to maximize the impact of their research. We present selected guidelines from three countries (Canada, United States, and United Kingdom) situated at the forefront of genomics to illustrate this potential policy conflict. Examining the innovation ecosystem and the messages conveyed by the different policies surveyed, we further investigate the inconsistencies between open science and commercialization policies. Commercialization and open science are not necessarily irreconcilable and could instead be envisioned as complementary elements of a more holistic innovation framework. Given the exploratory nature of our study, we wish to point out the need to gather additional evidence on the coexistence of open science and commercialization policies and on its impact, both positive and negative, on genomics academic research.

  1. Evolution of Nursing Science: Is Open Access the Answer?

    Science.gov (United States)

    Clarke, Pamela N; Garcia, Jenny

    2015-10-01

    The open access movement where journal content is made freely available over the Internet is purported to increase scientific exchange, yet has pros and cons. There are issues related to quality that need to be examined in relation to evolution of nursing science. © The Author(s) 2015.

  2. GOSH! A roadmap for open-source science hardware

    CERN Multimedia

    Stefania Pandolfi

    2016-01-01

    The goal of the Gathering for Open Science Hardware (GOSH! 2016), held from 2 to 5 March 2016 at IdeaSquare, was to lay the foundations of the open-source hardware for science movement.   The participants in the GOSH! 2016 meeting gathered in IdeaSquare. (Image: GOSH Community) “Despite advances in technology, many scientific innovations are held back because of a lack of affordable and customisable hardware,” says François Grey, a professor at the University of Geneva and coordinator of Citizen Cyberlab – a partnership between CERN, the UN Institute for Training and Research and the University of Geneva – which co-organised the GOSH! 2016 workshop. “This scarcity of accessible science hardware is particularly obstructive for citizen science groups and humanitarian organisations that don’t have the same economic means as a well-funded institution.” Instead, open sourcing science hardware co...

  3. TCIA: An information resource to enable open science.

    Science.gov (United States)

    Prior, Fred W; Clark, Ken; Commean, Paul; Freymann, John; Jaffe, Carl; Kirby, Justin; Moore, Stephen; Smith, Kirk; Tarbox, Lawrence; Vendt, Bruce; Marquez, Guillermo

    2013-01-01

    Reusable, publicly available data is a pillar of open science. The Cancer Imaging Archive (TCIA) is an open image archive service supporting cancer research. TCIA collects, de-identifies, curates and manages rich collections of oncology image data. Image data sets have been contributed by 28 institutions and additional image collections are underway. Since June of 2011, more than 2,000 users have registered to search and access data from this freely available resource. TCIA encourages and supports cancer-related open science communities by hosting and managing the image archive, providing project wiki space and searchable metadata repositories. The success of TCIA is measured by the number of active research projects it enables (>40) and the number of scientific publications and presentations that are produced using data from TCIA collections (39).

  4. ESMPy and OpenClimateGIS: Python Interfaces for High Performance Grid Remapping and Geospatial Dataset Manipulation

    Science.gov (United States)

    O'Kuinghttons, Ryan; Koziol, Benjamin; Oehmke, Robert; DeLuca, Cecelia; Theurich, Gerhard; Li, Peggy; Jacob, Joseph

    2016-04-01

    The Earth System Modeling Framework (ESMF) Python interface (ESMPy) supports analysis and visualization in Earth system modeling codes by providing access to a variety of tools for data manipulation. ESMPy started as a Python interface to the ESMF grid remapping package, which provides mature and robust high-performance and scalable grid remapping between 2D and 3D logically rectangular and unstructured grids and sets of unconnected data. ESMPy now also interfaces with OpenClimateGIS (OCGIS), a package that performs subsetting, reformatting, and computational operations on climate datasets. ESMPy exposes a subset of ESMF grid remapping utilities. This includes bilinear, finite element patch recovery, first-order conservative, and nearest neighbor grid remapping methods. There are also options to ignore unmapped destination points, mask points on source and destination grids, and provide grid structure in the polar regions. Grid remapping on the sphere takes place in 3D Cartesian space, so the pole problem is not an issue as it can be with other grid remapping software. Remapping can be done between any combination of 2D and 3D logically rectangular and unstructured grids with overlapping domains. Grid pairs where one side of the regridding is represented by an appropriate set of unconnected data points, as is commonly found with observational data streams, are also supported. There is a developing interoperability layer between ESMPy and OpenClimateGIS (OCGIS). OCGIS is a pure Python, open source package designed for geospatial manipulation, subsetting, and computation on climate datasets stored in local NetCDF files or accessible remotely via the OPeNDAP protocol. Interfacing with OCGIS has brought GIS-like functionality to ESMPy (i.e. subsetting, coordinate transformations) as well as additional file output formats (i.e. CSV, ESRI Shapefile). ESMPy is distinguished by its strong emphasis on open source, community governance, and distributed development. The user
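
    A minimal serial sketch of the bilinear remapping path described above, assuming the ESMF Python bindings are installed (imported as ESMF in releases of that era; newer releases use esmpy) and using small synthetic global grids rather than real model output:

        import numpy as np
        import ESMF  # newer releases: import esmpy as ESMF

        def make_grid(nlon, nlat):
            """Uniform global lat/lon grid with cell-center coordinates."""
            grid = ESMF.Grid(np.array([nlon, nlat]), staggerloc=ESMF.StaggerLoc.CENTER,
                             coord_sys=ESMF.CoordSys.SPH_DEG)
            lon, lat = grid.get_coords(0), grid.get_coords(1)
            lon[...] = np.linspace(-180.0, 180.0, nlon, endpoint=False).reshape(nlon, 1) + 180.0 / nlon
            lat[...] = np.linspace(-90.0, 90.0, nlat + 1)[:-1].reshape(1, nlat) + 90.0 / nlat
            return grid

        src_grid, dst_grid = make_grid(144, 96), make_grid(360, 180)
        src_field, dst_field = ESMF.Field(src_grid, name="src"), ESMF.Field(dst_grid, name="dst")

        lon_s, lat_s = src_grid.get_coords(0), src_grid.get_coords(1)
        src_field.data[...] = np.cos(np.radians(lat_s)) * np.sin(np.radians(lon_s))  # synthetic test field

        regrid = ESMF.Regrid(src_field, dst_field,
                             regrid_method=ESMF.RegridMethod.BILINEAR,
                             unmapped_action=ESMF.UnmappedAction.IGNORE)
        dst_field = regrid(src_field, dst_field)  # dst_field.data now holds the remapped values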

  5. EverVIEW: A visualization platform for hydrologic and Earth science gridded data

    Science.gov (United States)

    Romañach, Stephanie S.; McKelvy, Mark; Suir, Kevin; Conzelmann, Craig

    2015-03-01

    The EverVIEW Data Viewer is a cross-platform desktop application that combines and builds upon multiple open source libraries to help users to explore spatially-explicit gridded data stored in Network Common Data Form (NetCDF). Datasets are displayed across multiple side-by-side geographic or tabular displays, showing colorized overlays on an Earth globe or grid cell values, respectively. Time-series datasets can be animated to see how water surface elevation changes through time or how habitat suitability for a particular species might change over time under a given scenario. Initially targeted toward Florida's Everglades restoration planning, EverVIEW has been flexible enough to address the varied needs of large-scale planning beyond Florida, and is currently being used in biological planning efforts nationally and internationally.
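
    As a hedged illustration of reading the same kind of NetCDF time-series grid outside the viewer (the file and variable names are placeholders; EverVIEW itself is a desktop application, not this script):

        from netCDF4 import Dataset

        with Dataset("everglades_stage.nc") as ds:                   # placeholder NetCDF file
            stage = ds.variables["water_surface_elevation"]          # placeholder variable, dims (time, y, x)
            n_steps = stage.shape[0]
            for t in range(n_steps):                                  # step through the animation frames
                frame = stage[t, :, :]
                print(f"step {t}: min={frame.min():.2f}  max={frame.max():.2f}")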

  6. The Montage architecture for grid-enabled science processing of large, distributed datasets

    Science.gov (United States)

    Jacob, Joseph C.; Katz, Daniel S .; Prince, Thomas; Berriman, Bruce G.; Good, John C.; Laity, Anastasia C.; Deelman, Ewa; Singh, Gurmeet; Su, Mei-Hui

    2004-01-01

    Montage is an Earth Science Technology Office (ESTO) Computational Technologies (CT) Round III Grand Challenge investigation to deploy a portable, compute-intensive, custom astronomical image mosaicking service for the National Virtual Observatory (NVO). Although Montage is developing a compute- and data-intensive service for the astronomy community, we are also helping to address a problem that spans both Earth and Space science, namely how to efficiently access and process multi-terabyte, distributed datasets. In both communities, the datasets are massive, and are stored in distributed archives that are, in most cases, remote from the available Computational resources. Therefore, state of the art computational grid technologies are a key element of the Montage portal architecture. This paper describes the aspects of the Montage design that are applicable to both the Earth and Space science communities.

  7. Open data in science and research : raw data now

    OpenAIRE

    Giles, Jeremy

    2012-01-01

    Open Data in science and research is an emerging priority for governments and public bodies across the western world. In 2009 Tim Berners-Lee (inventor of the World Wide Web) called for “raw data now” – for governments, scientists and institutions to make their data openly available on the web. That message is rippling around the world and government and public bodies from EU to NERC are responding. Neelie Kroes, Vice-President of the European Commission, recently said: "Taxpayers should not ...

  8. Collection overview: ten years of wonderful open access science.

    Science.gov (United States)

    Roberts, Roland G; Alfred, Jane

    2013-10-01

    To mark our tenth Anniversary at PLOS Biology, we are launching a special, celebratory Tenth Anniversary PLOS Biology Collection which showcases 10 specially selected PLOS Biology research articles drawn from a decade of publishing excellent science. It also features newly commissioned articles, including thought-provoking pieces on the Open Access movement (past and present), on article-level metrics, and on the history of the Public Library of Science. Each research article highlighted in the collection is also accompanied by a PLOS Biologue blog post to extend the impact of these remarkable studies to the widest possible audience.

  9. Collection overview: ten years of wonderful open access science.

    Directory of Open Access Journals (Sweden)

    Roland G Roberts

    2013-10-01

    Full Text Available To mark our tenth Anniversary at PLOS Biology, we are launching a special, celebratory Tenth Anniversary PLOS Biology Collection which showcases 10 specially selected PLOS Biology research articles drawn from a decade of publishing excellent science. It also features newly commissioned articles, including thought-provoking pieces on the Open Access movement (past and present), on article-level metrics, and on the history of the Public Library of Science. Each research article highlighted in the collection is also accompanied by a PLOS Biologue blog post to extend the impact of these remarkable studies to the widest possible audience.

  10. Sealife: a semantic grid browser for the life sciences applied to the study of infectious diseases.

    Science.gov (United States)

    Schroeder, Michael; Burger, Albert; Kostkova, Patty; Stevens, Robert; Habermann, Bianca; Dieng-Kuntz, Rose

    2006-01-01

    The objective of Sealife is the conception and realisation of a semantic Grid browser for the life sciences, which will link the existing Web to the currently emerging eScience infrastructure. The SeaLife Browser will allow users to automatically link a host of Web servers and Web/Grid services to the Web content he/she is visiting. This will be accomplished using eScience's growing number of Web/Grid Services and its XML-based standards and ontologies. The browser will identify terms in the pages being browsed through the background knowledge held in ontologies. Through the use of Semantic Hyperlinks, which link identified ontology terms to servers and services, the SeaLife Browser will offer a new dimension of context-based information integration. In this paper, we give an overview over the different components of the browser and their interplay. This SeaLife Browser will be demonstrated within three application scenarios in evidence-based medicine, literature & patent mining, and molecular biology, all relating to the study of infectious diseases. The three applications vertically integrate the molecule/cell, the tissue/organ and the patient/population level by covering the analysis of high-throughput screening data for endocytosis (the molecular entry pathway into the cell), the expression of proteins in the spatial context of tissue and organs, and a high-level library on infectious diseases designed for clinicians and their patients. For more information see http://www.biote.ctu-dresden.de/sealife.

  11. Is Open Science the Future of Drug Development?

    Science.gov (United States)

    Shaw, Daniel L.

    2017-01-01

    Traditional drug development models are widely perceived as opaque and inefficient, with the cost of research and development continuing to rise even as production of new drugs stays constant. Searching for strategies to improve the drug discovery process, the biomedical research field has begun to embrace open strategies. The resulting changes are starting to reshape the industry. Open science—an umbrella term for diverse strategies that seek external input and public engagement—has become an essential tool with researchers, who are increasingly turning to collaboration, crowdsourcing, data sharing, and open sourcing to tackle some of the most pressing problems in medicine. Notable examples of such open drug development include initiatives formed around malaria and tropical disease. Open practices have found their way into the drug discovery process, from target identification and compound screening to clinical trials. This perspective argues that while open science poses some risks—which include the management of collaboration and the protection of proprietary data—these strategies are, in many cases, the more efficient and ethical way to conduct biomedical research. PMID:28356902

  12. Open science, e-science and the new technologies: Challenges and old problems in qualitative research in the social sciences

    Directory of Open Access Journals (Sweden)

    Ercilia García-Álvarez

    2012-12-01

    Full Text Available Purpose: As well as introducing the articles in the special issue titled "Qualitative Research in the Social Sciences", this article reviews the challenges, problems and main advances made by the qualitative paradigm in the context of the new European science policy based on open science and e-Science, and analyses alternative technologies freely available in the 2.0 environment and their application to fieldwork and data analysis. Design/methodology: Theoretical review. Practical implications: The article identifies open access technologies with applications in qualitative research such as applications for smartphones and tablets, web platforms and specific qualitative data analysis software, all developed in both the e-Science context and the 2.0 environment. Social implications: The article discusses the possible role to be played by qualitative research in the open science and e-Science context and considers the impact of this new context on the size and structure of research groups, the development of truly collaborative research, the emergence of new ethical problems and quality assessment in review processes in an open environment. Originality/value: The article describes the characteristics that define the new scientific environment and the challenges posed for qualitative research, reviews the latest open access technologies available to researchers in terms of their main features and proposes specific applications suitable for fieldwork and data analysis.

  13. Integrating Solar Power onto the Electric Grid - Bridging the Gap between Atmospheric Science, Engineering and Economics

    Science.gov (United States)

    Ghonima, M. S.; Yang, H.; Zhong, X.; Ozge, B.; Sahu, D. K.; Kim, C. K.; Babacan, O.; Hanna, R.; Kurtz, B.; Mejia, F. A.; Nguyen, A.; Urquhart, B.; Chow, C. W.; Mathiesen, P.; Bosch, J.; Wang, G.

    2015-12-01

    One of the main obstacles to high penetrations of solar power is the variable nature of solar power generation. To mitigate variability, grid operators have to schedule additional reliability resources, at considerable expense, to ensure that load requirements are met by generation. Thus despite the cost of solar PV decreasing, the cost of integrating solar power will increase as penetration of solar resources onto the electric grid increases. There are three principal tools currently available to mitigate variability impacts: (i) flexible generation, (ii) storage, either virtual (demand response) or physical devices and (iii) solar forecasting. Storage devices are a powerful tool capable of ensuring smooth power output from renewable resources. However, the high cost of storage is prohibitive and markets are still being designed to leverage their full potential and mitigate their limitation (e.g. empty storage). Solar forecasting provides valuable information on the daily net load profile and upcoming ramps (increasing or decreasing solar power output) thereby providing the grid advance warning to schedule ancillary generation more accurately, or curtail solar power output. In order to develop solar forecasting as a tool that can be utilized by the grid operators we identified two focus areas: (i) develop solar forecast technology and improve solar forecast accuracy and (ii) develop forecasts that can be incorporated within existing grid planning and operation infrastructure. The first issue required atmospheric science and engineering research, while the second required detailed knowledge of energy markets, and power engineering. Motivated by this background we will emphasize area (i) in this talk and provide an overview of recent advancements in solar forecasting especially in two areas: (a) Numerical modeling tools for coastal stratocumulus to improve scheduling in the day-ahead California energy market. (b) Development of a sky imager to provide short term

  14. ESA Science Archives, VO tools and remote Scientific Data reduction in Grid Architectures

    Science.gov (United States)

    Arviset, C.; Barbarisi, I.; de La Calle, I.; Fajersztejn, N.; Freschi, M.; Gabriel, C.; Gomez, P.; Guainazzi, M.; Ibarra, A.; Laruelo, A.; Leon, I.; Micol, A.; Parrilla, E.; Ortiz, I.; Osuna, P.; Salgado, J.; Stebe, A.; Tapiador, D.

    2008-08-01

    This paper presents the latest functionalities of the ESA Science Archives located at ESAC, Spain, in particular, the following archives : the ISO Data Archive (IDA {http://iso.esac.esa.int/ida}), the XMM-Newton Science Archive (XSA {http://xmm.esac.esa.int/xsa}), the Integral SOC Science Data Archive (ISDA {http://integral.esac.esa.int/isda}) and the Planetary Science Archive (PSA {http://www.rssd.esa.int/psa}), both the classical and the map-based Mars Express interfaces. Furthermore, the ESA VOSpec {http://esavo.esac.esa.int/vospecapp} spectra analysis tool is described, which allows to access and display spectral information from VO resources (both real observational and theoretical spectra), including access to Lines database and recent analysis functionalities. In addition, we detail the first implementation of RISA (Remote Interface for Science Analysis), a web service providing remote users the ability to create fully configurable XMM-Newton data analysis workflows, and to deploy and run them on the ESAC Grid. RISA makes fully use of the inter-operability provided by the SIAP (Simple Image Access Protocol) services as data input, and at the same time its VO-compatible output can directly be used by general VO-tools.
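
    A SIAP (Simple Image Access Protocol) query of the kind these archives expose is an ordinary HTTP GET with standard parameters; the sketch below is illustrative only, with an invented base URL standing in for a real archive endpoint.

        import requests

        params = {"POS": "83.82,-5.39",       # RA,Dec in decimal degrees (Orion Nebula region)
                  "SIZE": "0.2",              # search box size in degrees
                  "FORMAT": "image/fits"}
        resp = requests.get("https://example.esac.esa.int/siap/query",   # placeholder endpoint
                            params=params, timeout=60)
        resp.raise_for_status()
        votable_xml = resp.text               # a VOTable listing matching image products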

  15. Transparency: the emerging third dimension of Open Science and Open Data

    Directory of Open Access Journals (Sweden)

    Liz Lyon

    2016-03-01

    Full Text Available This paper presents an exploration of the concept of research transparency. The policy context is described and situated within the broader arena of open science. This is followed by commentary on transparency within the research process, which includes a brief overview of the related concept of reproducibility and the associated elements of research integrity, fraud and retractions. A two-dimensional model or continuum of open science is considered and the paper builds on this foundation by presenting a three-dimensional model, which includes the additional axis of ‘transparency’. The concept is further unpacked and preliminary definitions of key terms are introduced: transparency, transparency action, transparency agent and transparency tool.  An important linkage is made to the research lifecycle as a setting for potential transparency interventions by libraries. Four areas are highlighted as foci for enhanced engagement with transparency goals: Leadership and Policy, Advocacy and Training, Research Infrastructures and Workforce Development.

  16. Science Gateways, Scientific Workflows and Open Community Software

    Science.gov (United States)

    Pierce, M. E.; Marru, S.

    2014-12-01

    Science gateways and scientific workflows occupy different ends of the spectrum of user-focused cyberinfrastructure. Gateways, sometimes called science portals, provide a way for enabling large numbers of users to take advantage of advanced computing resources (supercomputers, advanced storage systems, science clouds) by providing Web and desktop interfaces and supporting services. Scientific workflows, at the other end of the spectrum, support advanced usage of cyberinfrastructure that enable "power users" to undertake computational experiments that are not easily done through the usual mechanisms (managing simulations across multiple sites, for example). Despite these different target communities, gateways and workflows share many similarities and can potentially be accommodated by the same software system. For example, pipelines to process InSAR imagery sets or to datamine GPS time series data are workflows. The results and the ability to make downstream products may be made available through a gateway, and power users may want to provide their own custom pipelines. In this abstract, we discuss our efforts to build an open source software system, Apache Airavata, that can accommodate both gateway and workflow use cases. Our approach is general, and we have applied the software to problems in a number of scientific domains. In this talk, we discuss our applications to usage scenarios specific to earth science, focusing on earthquake physics examples drawn from the QuakSim.org and GeoGateway.org efforts. We also examine the role of the Apache Software Foundation's open community model as a way to build up common community codes that do not depend upon a single "owner" to sustain. Pushing beyond open source software, we also see the need to provide gateways and workflow systems as cloud services. These services centralize operations, provide well-defined programming interfaces, scale elastically, and have global-scale fault tolerance. We discuss our work providing

  17. Knowledge sharing in public-private partnerships in life science: An open science perspective

    OpenAIRE

    Sánchez Jiménez, Óscar David; Aibar Puentes, Eduard

    2016-01-01

    Preliminary results on the adoption of open science practices in public-private partnerships in Life Sciences.

  18. Grid-enabled measures: using Science 2.0 to standardize measures and share data.

    Science.gov (United States)

    Moser, Richard P; Hesse, Bradford W; Shaikh, Abdul R; Courtney, Paul; Morgan, Glen; Augustson, Erik; Kobrin, Sarah; Levin, Kerry Y; Helba, Cynthia; Garner, David; Dunn, Marsha; Coa, Kisha

    2011-05-01

    Scientists are taking advantage of the Internet and collaborative web technology to accelerate discovery in a massively connected, participative environment--a phenomenon referred to by some as Science 2.0. As a new way of doing science, this phenomenon has the potential to push science forward in a more efficient manner than was previously possible. The Grid-Enabled Measures (GEM) database has been conceptualized as an instantiation of Science 2.0 principles by the National Cancer Institute (NCI) with two overarching goals: (1) promote the use of standardized measures, which are tied to theoretically based constructs; and (2) facilitate the ability to share harmonized data resulting from the use of standardized measures. The first is accomplished by creating an online venue where a virtual community of researchers can collaborate together and come to consensus on measures by rating, commenting on, and viewing meta-data about the measures and associated constructs. The second is accomplished by connecting the constructs and measures to an ontological framework with data standards and common data elements such as the NCI Enterprise Vocabulary System (EVS) and the cancer Data Standards Repository (caDSR). This paper will describe the web 2.0 principles on which the GEM database is based, describe its functionality, and discuss some of the important issues involved with creating the GEM database such as the role of mutually agreed-on ontologies (i.e., knowledge categories and the relationships among these categories--for data sharing).

  19. LAMMPS Project Report for the Trinity KNL Open Science Period.

    Energy Technology Data Exchange (ETDEWEB)

    Moore, Stan Gerald [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Thompson, Aidan P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wood, Mitchell [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-08-01

    LAMMPS is a classical molecular dynamics code (lammps.sandia.gov) used to model materials science problems at Sandia National Laboratories and around the world. LAMMPS was one of three Sandia codes selected to participate in the Trinity KNL (TR2) Open Science period. During this period, three different problems of interest were investigated using LAMMPS. The first was benchmarking KNL performance using different force field models. The second was simulating void collapse in shocked HNS energetic material using an all-atom model. The third was simulating shock propagation through poly-crystalline RDX energetic material using a coarse-grain model, the results of which were used in an ACM Gordon Bell Prize submission. This report describes the results of these simulations, lessons learned, and some hardware issues found on Trinity KNL as part of this work.

  20. Open Access to Scientific Data: Promoting Science and Innovation

    Directory of Open Access Journals (Sweden)

    Guan-Hua Xu

    2007-06-01

    Full Text Available As an important part of the science and technology infrastructure platform of China, the Ministry of Science and Technology launched the Scientific Data Sharing Program in 2002. Twenty-four government agencies now participate in the Program. After five years of hard work, great progress has been achieved in the policy and legal framework, data standards, pilot projects, and international cooperation. By the end of 2005, one-third of the existing public-interest and basic scientific databases in China had been integrated and upgraded. By 2020, China is expected to build a more user-friendly scientific data management and sharing system, with 80 percent of scientific data available to the general public. In order to realize this objective, the emphases of the project are to perfect the policy and legislation system, improve the quality of data resources, expand and establish national scientific data centers, and strengthen international cooperation. It is believed that with the opening up of access to scientific data in China, the Program will play a bigger role in promoting science and national innovation.

  1. Semantic Web-based Vocabulary Broker for Open Science

    Science.gov (United States)

    Ritschel, B.; Neher, G.; Iyemori, T.; Murayama, Y.; Kondo, Y.; Koyama, Y.; King, T. A.; Galkin, I. A.; Fung, S. F.; Wharton, S.; Cecconi, B.

    2016-12-01

    Keyword vocabularies are used to tag and to identify data of science data repositories. Such vocabularies consist of controlled terms and the appropriate concepts, such as GCMD [1] keywords or the ESPAS [2] keyword ontology. The Semantic Web-based mash-up of domain-specific, cross- or even trans-domain vocabularies provides unique capabilities in the network of appropriate data resources. Based on a collaboration between GFZ [3], the FHP [4], the WDC for Geomagnetism [5] and the NICT [6], we developed the concept of a vocabulary broker for inter- and trans-disciplinary data detection and integration. Our prototype of the Semantic Web-based vocabulary broker uses OSF [7] for the mash-up of geo and space research vocabularies, such as GCMD keywords, the ESPAS keyword ontology and the SPASE [8] keyword vocabulary. The vocabulary broker starts the search with "free" keywords or terms of a specific vocabulary scheme. The vocabulary broker almost automatically connects the different science data repositories which are tagged by terms of the aforementioned vocabularies. Therefore the mash-up of the SKOS [9] based vocabularies with appropriate metadata from different domains can be realized by addressing LOD [10] resources or virtual SPARQL [11] endpoints which map relational structures into the RDF format [12]. In order to demonstrate such a mash-up approach in real life, we installed and use a D2RQ [13] server for the integration of IUGONET [14] data which are managed by a relational database. The OSF based vocabulary broker and the D2RQ platform are installed on virtual Linux machines at Kyoto University. The vocabulary broker meets the standard of a main component of the WDS [15] knowledge network. The Web address of the vocabulary broker is http://wdcosf.kugi.kyoto-u.ac.jp [1] Global Change Master Directory [2] Near earth space data infrastructure for e-science [3] German Research Centre for Geosciences [4] University of Applied Sciences Potsdam [5] World Data Center for Geomagnetism Kyoto [6] National Institute of Information and
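
    Since the broker works by querying SKOS-based vocabularies exposed at SPARQL endpoints (including D2RQ-mapped relational sources), a minimal sketch of such a lookup is shown below; the endpoint URL and the search term are placeholders, and only the SKOS predicates are standard:

```python
# Minimal sketch: look up SKOS concepts whose preferred label matches a free
# keyword, the kind of query a vocabulary broker might issue. The endpoint URL
# is a placeholder; the SKOS predicate is standard.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "http://example.org/sparql"  # placeholder SPARQL endpoint

query = """
PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
SELECT ?concept ?label WHERE {
  ?concept skos:prefLabel ?label .
  FILTER (CONTAINS(LCASE(STR(?label)), "magnetosphere"))
} LIMIT 20
"""

sparql = SPARQLWrapper(ENDPOINT)
sparql.setQuery(query)
sparql.setReturnFormat(JSON)
results = sparql.query().convert()

for row in results["results"]["bindings"]:
    print(row["concept"]["value"], "-", row["label"]["value"])
```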

  2. Science with the Virtual Observatory: the AstroGrid VO Desktop

    CERN Document Server

    Tedds, Jonathan A

    2009-01-01

    We introduce a general range of science drivers for using the Virtual Observatory (VO) and identify some common aspects to these as well as the advantages of VO data access. We then illustrate the use of existing VO tools to tackle multi-wavelength science problems. We demonstrate the ease of multi-mission data access using the VOExplorer resource browser, as provided by AstroGrid (http://www.astrogrid.org), and show how to pass the various results into any VO-enabled tool such as TopCat for catalogue correlation. VOExplorer offers a powerful data-centric visualisation for browsing and filtering the entire VO registry using an iTunes-type interface. This allows the user to bookmark their own personalised lists of resources and to run tasks on the selected resources as desired. We introduce an example of how more advanced querying can be performed to access existing X-ray cluster-of-galaxies catalogues and then select extended-only X-ray sources as candidate clusters of galaxies in the 2XMMi catalogue. Finally ...

  3. Developing Healthcare Data Analytics APPs with Open Data Science Tools.

    Science.gov (United States)

    Hao, Bibo; Sun, Wen; Yu, Yiqin; Xie, Guotong

    2017-01-01

    Recent advances in big data analytics provide more flexible, efficient, and open tools for researchers to gain insight from healthcare data. Many of these tools, however, require researchers to develop programs in languages such as Python or R, a skill set that many researchers in the healthcare data analytics area have not acquired. To make data science more approachable, we explored existing tools and developed a practice that helps data scientists convert existing analytics pipelines into user-friendly analytics APPs with rich interactions and real-time analysis features. With this practice, data scientists can develop customized analytics pipelines as APPs in Jupyter Notebook and disseminate them to other researchers easily, and researchers can use the shared notebooks to perform analysis tasks or reproduce research results much more easily.
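
    A minimal sketch of the kind of notebook-based "APP" described above, assuming a Jupyter environment with ipywidgets and pandas installed; the toy dataset and column names are invented for illustration:

```python
# Minimal sketch of wrapping an analysis step as an interactive notebook "app"
# with ipywidgets; the dataset and column names are illustrative only.
import pandas as pd
from ipywidgets import interact, IntSlider

# toy cohort data standing in for a real healthcare dataset
df = pd.DataFrame({
    "age": [34, 52, 47, 61, 29, 73],
    "systolic_bp": [118, 135, 128, 142, 110, 150],
})

def summarize(min_age=40):
    """Summarize blood pressure for patients at or above a chosen age."""
    subset = df[df["age"] >= min_age]
    return subset["systolic_bp"].describe()

# renders a slider in the notebook and re-runs the analysis on each change
interact(summarize, min_age=IntSlider(min=20, max=80, step=5, value=40))
```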

  4. LLNL Mercury Project Trinity Open Science Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Brantley, Patrick [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dawson, Shawn [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); McKinley, Scott [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); O' Brien, Matt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Peters, Doug [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Pozulp, Mike [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Becker, Greg [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mohror, Kathryn [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Moody, Adam [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-04-20

    The Mercury Monte Carlo particle transport code developed at Lawrence Livermore National Laboratory (LLNL) is used to simulate the transport of radiation through urban environments. These challenging calculations include complicated geometries and require significant computational resources to complete. As a result, a question arises as to the level of convergence of the calculations with Monte Carlo simulation particle count. In the Trinity Open Science calculations, one main focus was to investigate convergence of the relevant simulation quantities with Monte Carlo particle count to assess the current simulation methodology. Both for this application space and for more general applicability, we also investigated the impact of code algorithms on parallel scaling on the Trinity machine, as well as the use of the Trinity DataWarp burst-buffer technology in Mercury via the LLNL Scalable Checkpoint/Restart (SCR) library.
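
    As a generic illustration of the convergence question raised above (not the Mercury code or its tallies), the statistical error of a simple Monte Carlo estimate falls off roughly as 1/sqrt(N) with particle count:

```python
# Generic illustration (not the Mercury code itself) of how the statistical
# error of a Monte Carlo estimate shrinks roughly as 1/sqrt(N) with the
# number of simulated particles/histories.
import numpy as np

rng = np.random.default_rng(0)
true_value = 0.5  # mean of a uniform(0, 1) stand-in "tally"

for n in [10**3, 10**4, 10**5, 10**6]:
    samples = rng.random(n)                    # stand-in for per-history tallies
    estimate = samples.mean()
    std_error = samples.std(ddof=1) / np.sqrt(n)
    print(f"N={n:>8}: estimate={estimate:.5f}  "
          f"abs error={abs(estimate - true_value):.5f}  std error={std_error:.5f}")
```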

  5. A Bibliometric Study of Scholarly Articles Published by Library and Information Science Authors about Open Access

    Science.gov (United States)

    Grandbois, Jennifer; Beheshti, Jamshid

    2014-01-01

    Introduction: This study aims to gain a greater understanding of the development of open access practices amongst library and information science authors, since their role is integral to the success of the broader open access movement. Method: Data were collected from scholarly articles about open access by library and information science authors…

  6. Future opportunities and future trends for e-infrastructures and life sciences: going beyond grid to enable life science data analysis

    Directory of Open Access Journals (Sweden)

    Fotis ePsomopoulos

    2015-06-01

    Full Text Available With the increasingly rapid growth of data in Life Sciences we are witnessing a major transition in the way research is conducted, from hypothesis-driven studies to data-driven simulations of whole systems. In the context of the European Grid Infrastructure Community Forum 2014 (Helsinki, 19–23 May 2014), a workshop was held aimed at understanding the state of the art of Grid/Cloud computing in EU research as viewed from within the field of Life Sciences. The workshop brought together Life Science researchers and infrastructure providers from around Europe and facilitated networking between them within the context of EGI. The first part of the workshop included talks from key infrastructures and projects within the Life Sciences community. This was complemented by technical talks that established the key aspects present in major research approaches. Finally, the discussion phase provided significant insights into the road ahead with proposals for possible collaborations and suggestions for future actions.

  7. How FOSTER supports training Open Science in the GeoSciences

    Science.gov (United States)

    Orth, Astrid

    2016-04-01

    FOSTER (1) is about promoting and facilitating the adoption of Open Science by the European research community, and fostering compliance with the open access policies set out in Horizon 2020 (H2020). FOSTER aims to reach out and provide training to the wide range of disciplines and countries involved in the European Research Area (ERA) by offering and supporting face-to-face as well as distance training. Different stakeholders, mainly young researchers, are trained to integrate Open Science in their daily workflow, supporting researchers to optimise their research visibility and impact. Strengthening institutional training capacity is achieved through a train-the-trainers approach. The two-and-a-half-year project started in February 2014 with identifying, enriching and providing training content on all relevant topics in the area of Open Science. One of the main elements was to support two rounds of training, conducted during 2014 and 2015, organizing more than 100 training events with around 3000 participants. The presentation will explain the project objectives and results and will look into best practice training examples, among them successful training series in the GeoSciences. The FOSTER portal that now holds a collection of training resources (e.g. slides and PDFs, schedules and design of training events dedicated to different audiences, video captures of complete events) is presented. It provides easy ways to identify learning materials and to create one's own e-learning courses based on the materials and examples. (1) FOSTER is funded through the European Union's Seventh Framework Programme for research, technological development and demonstration under grant agreement no 612425. http://fosteropenscience.eu

  8. Cyberinfrastructure for Open Science at the Montreal Neurological Institute.

    Science.gov (United States)

    Das, Samir; Glatard, Tristan; Rogers, Christine; Saigle, John; Paiva, Santiago; MacIntyre, Leigh; Safi-Harab, Mouna; Rousseau, Marc-Etienne; Stirling, Jordan; Khalili-Mahani, Najmeh; MacFarlane, David; Kostopoulos, Penelope; Rioux, Pierre; Madjar, Cecile; Lecours-Boucher, Xavier; Vanamala, Sandeep; Adalat, Reza; Mohaddes, Zia; Fonov, Vladimir S; Milot, Sylvain; Leppert, Ilana; Degroot, Clotilde; Durcan, Thomas M; Campbell, Tara; Moreau, Jeremy; Dagher, Alain; Collins, D Louis; Karamchandani, Jason; Bar-Or, Amit; Fon, Edward A; Hoge, Rick; Baillet, Sylvain; Rouleau, Guy; Evans, Alan C

    2016-01-01

    Data sharing is becoming more of a requirement as technologies mature and as global research and communications diversify. As a result, researchers are looking for practical solutions, not only to enhance scientific collaborations, but also to acquire larger amounts of data, and to access specialized datasets. In many cases, the realities of data acquisition present a significant burden; therefore, gaining access to public datasets allows for more robust analyses and broadly enriched data exploration. To answer this demand, the Montreal Neurological Institute has announced its commitment to Open Science, harnessing the power of making both clinical and research data available to the world (Owens, 2016a,b). As such, the LORIS and CBRAIN (Das et al., 2016) platforms have been tasked with the technical challenges specific to the institutional-level implementation of open data sharing, including: comprehensive linking of multimodal data (phenotypic, clinical, neuroimaging, biobanking, genomics, etc.); secure database encryption, specifically designed for institutional and multi-project data sharing, ensuring subject confidentiality (using multi-tiered identifiers); querying capabilities with multiple levels of single-study and institutional permissions, allowing public data sharing for all consented and de-identified subject data; configurable pipelines and flags to facilitate acquisition and analysis, as well as access to High Performance Computing clusters for rapid data processing and sharing of software tools; robust workflow and quality control mechanisms ensuring transparency and consistency in best practices; long-term storage (and web access) of data, reducing loss of institutional data assets; enhanced web-based visualization of imaging, genomic, and phenotypic data, allowing for real-time viewing and manipulation of data from anywhere in the world; and numerous modules for data filtering, summary statistics, and personalized and configurable dashboards. Implementing

  9. Cyberinfrastructure for Open Science at the Montreal Neurological Institute

    Science.gov (United States)

    Das, Samir; Glatard, Tristan; Rogers, Christine; Saigle, John; Paiva, Santiago; MacIntyre, Leigh; Safi-Harab, Mouna; Rousseau, Marc-Etienne; Stirling, Jordan; Khalili-Mahani, Najmeh; MacFarlane, David; Kostopoulos, Penelope; Rioux, Pierre; Madjar, Cecile; Lecours-Boucher, Xavier; Vanamala, Sandeep; Adalat, Reza; Mohaddes, Zia; Fonov, Vladimir S.; Milot, Sylvain; Leppert, Ilana; Degroot, Clotilde; Durcan, Thomas M.; Campbell, Tara; Moreau, Jeremy; Dagher, Alain; Collins, D. Louis; Karamchandani, Jason; Bar-Or, Amit; Fon, Edward A.; Hoge, Rick; Baillet, Sylvain; Rouleau, Guy; Evans, Alan C.

    2017-01-01

    Data sharing is becoming more of a requirement as technologies mature and as global research and communications diversify. As a result, researchers are looking for practical solutions, not only to enhance scientific collaborations, but also to acquire larger amounts of data, and to access specialized datasets. In many cases, the realities of data acquisition present a significant burden; therefore, gaining access to public datasets allows for more robust analyses and broadly enriched data exploration. To answer this demand, the Montreal Neurological Institute has announced its commitment to Open Science, harnessing the power of making both clinical and research data available to the world (Owens, 2016a,b). As such, the LORIS and CBRAIN (Das et al., 2016) platforms have been tasked with the technical challenges specific to the institutional-level implementation of open data sharing, including: comprehensive linking of multimodal data (phenotypic, clinical, neuroimaging, biobanking, genomics, etc.); secure database encryption, specifically designed for institutional and multi-project data sharing, ensuring subject confidentiality (using multi-tiered identifiers); querying capabilities with multiple levels of single-study and institutional permissions, allowing public data sharing for all consented and de-identified subject data; configurable pipelines and flags to facilitate acquisition and analysis, as well as access to High Performance Computing clusters for rapid data processing and sharing of software tools; robust workflow and quality control mechanisms ensuring transparency and consistency in best practices; long-term storage (and web access) of data, reducing loss of institutional data assets; enhanced web-based visualization of imaging, genomic, and phenotypic data, allowing for real-time viewing and manipulation of data from anywhere in the world; and numerous modules for data filtering, summary statistics, and personalized and configurable dashboards. Implementing

  10. Informatics in radiology: An open-source and open-access cancer biomedical informatics grid annotation and image markup template builder.

    Science.gov (United States)

    Mongkolwat, Pattanasak; Channin, David S; Kleper, Vladimir; Rubin, Daniel L

    2012-01-01

    In a routine clinical environment or clinical trial, a case report form or structured reporting template can be used to quickly generate uniform and consistent reports. Annotation and image markup (AIM), a project supported by the National Cancer Institute's cancer biomedical informatics grid, can be used to collect information for a case report form or structured reporting template. AIM is designed to store, in a single information source, (a) the description of pixel data with use of markups or graphical drawings placed on the image, (b) calculation results (which may or may not be directly related to the markups), and (c) supplemental information. To facilitate the creation of AIM annotations with data entry templates, an AIM template schema and an open-source template creation application were developed to assist clinicians, image researchers, and designers of clinical trials to quickly create a set of data collection items, thereby ultimately making image information more readily accessible.

  11. Evolution of stellar collision products in open clusters : II. A grid of low-mass collisions

    NARCIS (Netherlands)

    Glebbeek, E.; Pols, O.R.

    2008-01-01

    In a companion paper we studied the detailed evolution of stellar collision products that occurred in an N-body simulation of the old open cluster M 67 and compared our detailed models to simple prescriptions. In this paper we extend this work by studying the evolution of the collision products in o

  12. Making USGS Science Data more Open, Accessible, and Usable: Leveraging ScienceBase for Success

    Science.gov (United States)

    Chang, M.; Ignizio, D.; Langseth, M. L.; Norkin, T.

    2016-12-01

    In 2013, the White House released initiatives requiring federally funded research to be made publicly available and machine readable. In response, the U.S. Geological Survey (USGS) has been developing a unified approach to make USGS data available and open. This effort has involved the establishment of internal policies and the release of a Public Access Plan, which outlines a strategy for the USGS to move forward into the modern era in scientific data management. Originally designed as a catalog and collaborative data management platform, ScienceBase (www.sciencebase.gov) is being leveraged to serve as a robust data hosting solution for USGS researchers to make scientific data accessible. With the goal of maintaining persistent access to formal data products and developing a management approach to facilitate stable data citation, the ScienceBase Data Release Team was established to ensure the quality, consistency, and meaningful organization of USGS data through standardized workflows and best practices. These practices include the creation and maintenance of persistent identifiers for data, improving the use of open data formats, establishing permissions for read/write access, validating the quality of standards compliant metadata, verifying that data have been reviewed and approved prior to release, and connecting to external search catalogs such as the USGS Science Data Catalog (data.usgs.gov) and data.gov. The ScienceBase team is actively building features to support this effort by automating steps to streamline the process, building metrics to track site visits and downloads, and connecting published digital resources in line with USGS and Federal policy. By utilizing ScienceBase to achieve stewardship quality and employing a dedicated team to help USGS scientists improve the quality of their data, the USGS is helping to meet today's data quality management challenges and ensure that reliable USGS data are available to and reusable for the public.
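
    For readers who want to script against the catalog described above, a hedged sketch of a keyword search over the public ScienceBase JSON interface follows; the query parameters and response fields reflect the public catalog as commonly used and should be treated as assumptions to verify against current documentation:

```python
# Hedged sketch of searching the public ScienceBase catalog over HTTP and
# listing matching item titles. The query parameters (q, format, max) and the
# response layout are assumptions based on the public JSON interface, not
# authoritative documentation.
import requests

CATALOG = "https://www.sciencebase.gov/catalog/items"

resp = requests.get(CATALOG,
                    params={"q": "streamflow", "format": "json", "max": 5},
                    timeout=30)
resp.raise_for_status()

for item in resp.json().get("items", []):
    print(item.get("title"), "->", item.get("link", {}).get("url"))
```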

  13. Further validation of liquid metal MHD code for unstructured grid based on OpenFOAM

    Energy Technology Data Exchange (ETDEWEB)

    Feng, Jingchao; Chen, Hongli, E-mail: hlchen1@ustc.edu.cn; He, Qingyun; Ye, Minyou

    2015-11-15

    Highlights: • A specific correction scheme has been adopted to revise the computed results on non-orthogonal meshes. • The developed MHD code based on the OpenFOAM platform has been validated against benchmark cases under uniform and non-uniform magnetic fields in round and rectangular ducts. • ALEX experimental results have been used to validate the MHD code based on OpenFOAM. - Abstract: In fusion liquid metal blankets, complex geometries involving contractions, expansions, bends, and manifolds are very common. The characteristics of liquid metal flow in these geometries are significant. In order to extend the magnetohydrodynamic (MHD) solver developed on the OpenFOAM platform to complex geometries, an MHD solver based on unstructured meshes has been implemented. The adoption of non-orthogonal correction techniques in the solver makes it possible to handle the non-orthogonal meshes arising in complex geometries. The present paper focuses on the validation of the code under critical conditions. An analytical benchmark case and two experimental benchmark cases were used to validate the code. Benchmark case I is MHD flow in a circular pipe with arbitrary electric conductivity of the walls in a uniform magnetic field. Benchmark cases II and III are experimental cases of 3D laminar steady MHD flow under a fringing magnetic field. In all these cases, the numerical results match the benchmarks well.
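
    The non-orthogonal correction mentioned in the highlights rests on splitting each face area vector into an orthogonal part (treated implicitly) and a correction (treated explicitly). The following is a generic finite-volume sketch of the over-relaxed split, not code from the OpenFOAM solver itself:

```python
# Generic finite-volume illustration (not the OpenFOAM solver itself) of the
# "over-relaxed" split of a face area vector Sf into an orthogonal part along
# the line connecting the cell centres and a non-orthogonal correction.
import numpy as np

def over_relaxed_split(Sf, d):
    """Sf: face area vector; d: vector between owner and neighbour cell centres."""
    d_hat = d / np.linalg.norm(d)
    # orthogonal contribution, scaled so that delta + k sums back to Sf
    delta = (np.dot(Sf, Sf) / np.dot(Sf, d_hat)) * d_hat
    k = Sf - delta          # non-orthogonal correction, treated explicitly
    return delta, k

Sf = np.array([1.0, 0.2, 0.0])   # skewed face
d = np.array([1.0, 0.0, 0.0])    # cell-centre connection
delta, k = over_relaxed_split(Sf, d)
print("orthogonal part:", delta, "non-orthogonal correction:", k)
```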

  14. ISOGA: Integrated Services Optical Grid Architecture for Emerging E-Science Collaborative Applications

    Energy Technology Data Exchange (ETDEWEB)

    Oliver Yu

    2008-11-28

    This final report describes the accomplishments of the ISOGA (Integrated Services Optical Grid Architecture) project. ISOGA enables efficient deployment of existing and emerging collaborative grid applications with increasingly diverse multimedia communication requirements over a wide-area multi-domain optical network grid, and provides collaborative scientists with fast retrieval and seamless browsing of distributed scientific multimedia datasets over a wide-area optical network grid. The project focuses on research and development in the following areas: polymorphic optical network control planes to enable multiple switching and communication services simultaneously; an intelligent optical grid user-network interface to enable user-centric network control and monitoring; and a seamless optical grid dataset browsing interface to enable fast retrieval of local/remote datasets for visualization and manipulation.

  15. A New Open Access Journal of Marine Science and Engineering

    Directory of Open Access Journals (Sweden)

    Anthony S. Clare

    2013-03-01

    Full Text Available The oceans cover approximately 71% of the Earth’s surface and contain more than 97% of the planet’s water, representing over 100 times more liveable volume than the terrestrial habitat. Approximately fifty percent of the species on the planet occupy this ocean biome, much of which remains unexplored. The health and sustainability of the oceans are threatened by a combination of pressures associated with climate change and the ever-increasing demands we place on them for food, recreation, trade, energy and minerals. The biggest threat, however, is the pace of change to the oceans, e.g., ocean acidification, which is unprecedented in human history. Consequently, there has never been a greater need for the rapid and widespread dissemination of the outcomes of research aimed at improving our understanding of how the oceans work and solutions to their sustainable use. It is our hope that this new online, open-access Journal of Marine Science and Engineering will go some way to fulfilling this need. [...

  16. Open-phase operating modes of power flow control topologies in a Smart Grid Distribution Network

    Science.gov (United States)

    Astashev, M. G.; Novikov, M. A.; Panfilov, D. I.; Rashitov, P. A.; Remizevich, T. V.; Fedorova, M. I.

    2015-12-01

    The power-flow-regulating circuit node in an alternating current system is reviewed. The circuit node is implemented with a thyristor controlled phase angle regulator (TCPAR) with a controlled thyristor switch. Research results on the individual phase control of the output voltage for the TCPAR are presented. Analytical expressions for calculating the overvoltage factor in the thyristor switch circuit under open-phase operating modes are derived. Based on an evaluation of overvoltages in operational and emergency modes, the conditions under which individual phase control of the output voltage can be implemented are determined. Under these conditions, maximal performance and complete controllability are provided.

  17. Authentic school science knowing and learning in open-inquiry science laboratories

    CERN Document Server

    Roth, Wolff-Michael

    1995-01-01

    According to John Dewey, Seymour Papert, Donald Schon, and Allan Collins, school activities, to be authentic, need to share key features with those worlds about which they teach. This book documents learning and teaching in open-inquiry learning environments, designed with the precepts of these educational thinkers in mind. The book is thus a first-hand report of knowing and learning by individuals and groups in complex open-inquiry learning environments in science. As such, it contributes to the emerging literature in this field. Secondly, it exemplifies research methods for studying such complex learning environments. The reader is thus encouraged not only to take the research findings as such, but to reflect on the process of arriving at these findings. Finally, the book is also an example of knowledge constructed by a teacher-researcher, and thus a model for teacher-researcher activity.

  18. A national upgrade of the climate monitoring grid in Sri Lanka. The place of Open Design, OSHW and FOSS.

    Science.gov (United States)

    Chemin, Yann; Bandara, Niroshan; Eriyagama, Nishadi

    2015-04-01

    The National Climate Observatory of Sri Lanka is a proposal prepared for the Government of Sri Lanka in September 2014 and discussed with private and public stakeholders in November 2014. The initial idea was to install a networked grid of weather instruments built from locally made open source hardware, on land and at sea, that report the state of the climate live. After initial stakeholder meetings, it was agreed to first try to connect existing weather stations from different governmental and private-sector agencies. This would bring existing information to a common ground through the Internet. At this point, it was realized that extracting information from the various vendors' setups would take a large amount of effort, although it remains the best and fastest option, since ownership and maintenance are the most important issues in a humid tropical country such as Sri Lanka. Thus, the question of Open Design, open source hardware (OSHW) and free and open source software (FOSS) became a pivotal element in considering the operationalization of any future elements of a national grid. Reasons range from ownership to low cost and customization, but prominently it is about technology ownership, royalty-free designs, and local availability. Building on previous work (Chemin and Bandara, 2014), we proposed open design specifications and prototypes for weather monitoring for various kinds of needs; the Meteorological Department clearly specified that the highest spatial variability observed in Sri Lanka is in rainfall, and expressed its willingness to investigate OSHW electronics using its new team of electronics and sensor specialists. A local manufacturer is providing an OSHW micro-controller product, a start-up is providing additional sensor boards under OSHW specifications, and local manufacture of the sensors (tipping-bucket and wind sensors) is under development, with blueprints made available in the Public Domain for CNC machine, 3D printing or Plastic
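
    As a loosely related illustration of the OSHW instrumentation discussed above, a tipping-bucket gauge can be read by counting reed-switch pulses; the sketch below assumes a Raspberry Pi style setup, and the pin number and millimetres-per-tip calibration are invented for illustration:

```python
# Hedged sketch of reading an open-hardware tipping-bucket rain gauge by
# counting reed-switch pulses on a Raspberry Pi GPIO pin. The pin number and
# the millimetres-of-rain-per-tip calibration are assumptions for illustration.
import time
import RPi.GPIO as GPIO

BUCKET_PIN = 17          # assumed wiring
MM_PER_TIP = 0.2         # assumed gauge calibration
tips = 0

def on_tip(channel):
    global tips
    tips += 1

GPIO.setmode(GPIO.BCM)
GPIO.setup(BUCKET_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)
GPIO.add_event_detect(BUCKET_PIN, GPIO.FALLING, callback=on_tip, bouncetime=250)

try:
    while True:
        time.sleep(600)  # report accumulated rainfall every 10 minutes
        print(f"rainfall last 10 min: {tips * MM_PER_TIP:.1f} mm")
        tips = 0
finally:
    GPIO.cleanup()
```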

  19. Open Science Strategies in Research Policies: A Comparative Exploration of Canada, the US and the UK

    Science.gov (United States)

    Lasthiotakis, Helen; Kretz, Andrew; Sá, Creso

    2015-01-01

    Several movements have emerged related to the general idea of promoting "openness" in science. Research councils are key institutions in bringing about changes proposed by these movements, as sponsors and facilitators of research. In this paper we identify the approaches used in Canada, the US and the UK to advance open science, as a…

  1. Open Data in Biomedical Science: Policy Drivers and Recent Progress

    Science.gov (United States)

    EPA's progress in implementing the open data initiatives first outlined in the 2009 Presidential memorandum on open government and more specifically regarding publications and data from publications in the 2013 Holdren memorandum. The presentation outlines the major points in bo...

  2. Grid Security

    CERN Document Server

    CERN. Geneva

    2004-01-01

    The aim of Grid computing is to enable the easy and open sharing of resources between large and highly distributed communities of scientists and institutes across many independent administrative domains. Convincing site security officers and computer centre managers to allow this to happen in view of today's ever-increasing Internet security problems is a major challenge. Convincing users and application developers to take security seriously is equally difficult. This paper will describe the main Grid security issues, both in terms of technology and policy, that have been tackled over recent years in LCG and related Grid projects. Achievements to date will be described and opportunities for future improvements will be addressed.

  3. CMS Monte Carlo production in the WLCG computing grid

    CERN Document Server

    Hernández, J M; Mohapatra, A; Filippis, N D; Weirdt, S D; Hof, C; Wakefield, S; Guan, W; Khomitch, A; Fanfani, A; Evans, D; Flossdorf, A; Maes, J; van Mulders, P; Villella, I; Pompili, A; My, S; Abbrescia, M; Maggi, G; Donvito, G; Caballero, J; Sanches, J A; Kavka, C; Van Lingen, F; Bacchi, W; Codispoti, G; Elmer, P; Eulisse, G; Lazaridis, C; Kalini, S; Sarkar, S; Hammad, G

    2008-01-01

    Monte Carlo production in CMS has received a major boost in performance and scale since the past CHEP06 conference. The production system has been re-engineered in order to incorporate the experience gained in running the previous system and to integrate production with the new CMS event data model, data management system and data processing framework. The system is interfaced to the two major computing Grids used by CMS, the LHC Computing Grid (LCG) and the Open Science Grid (OSG).

  4. Mathematics and Science: Uneasy Truce or Open Hostilities?

    Science.gov (United States)

    Bausor, John

    1974-01-01

    The present relationship between mathematics and science instruction in schools, and the mathematics in the Schools Council Integrated Science Project (SCISP), are discussed; complaints from science teachers about emphases in mathematics instruction are presented, and encouraging features of modern mathematics are listed. (DT)

  5. Perspectives on Open Science and scientific data sharing: an interdisciplinary workshop.

    Science.gov (United States)

    Destro Bisol, Giovanni; Anagnostou, Paolo; Capocasa, Marco; Bencivelli, Silvia; Cerroni, Andrea; Contreras, Jorge; Enke, Neela; Fantini, Bernardino; Greco, Pietro; Heeney, Catherine; Luzi, Daniela; Manghi, Paolo; Mascalzoni, Deborah; Molloy, Jennifer; Parenti, Fabio; Wicherts, Jelte; Boulton, Geoffrey

    2014-01-01

    Looking at Open Science and Open Data from a broad perspective. This is the idea behind "Scientific data sharing: an interdisciplinary workshop", an initiative designed to foster dialogue between scholars from different scientific domains which was organized by the Istituto Italiano di Antropologia in Anagni, Italy, 2-4 September 2013. We here report summaries of the presentations and discussions at the meeting. They deal with four sets of issues: (i) setting a common framework, a general discussion of open data principles, values and opportunities; (ii) insights into scientific practices, a view of the way in which the open data movement is developing in a variety of scientific domains (biology, psychology, epidemiology and archaeology); (iii) a case study of human genomics, which was a trail-blazer in data sharing, and which encapsulates the tension that can occur between large-scale data sharing and one of the boundaries of openness, the protection of individual data; (iv) open science and the public, based on a round table discussion about the public communication of science and the societal implications of open science. There were three proposals for the planning of further interdisciplinary initiatives on open science. Firstly, there is a need to integrate top-down initiatives by governments, institutions and journals with bottom-up approaches from the scientific community. Secondly, more should be done to popularize the societal benefits of open science, not only in providing the evidence needed by citizens to draw their own conclusions on scientific issues that are of concern to them, but also in explaining the direct benefits of data sharing in areas such as the control of infectious disease. Finally, introducing arguments from social sciences and humanities in the educational dissemination of open data may help students become more profoundly engaged with Open Science and look at science from a broader perspective.

  6. Grid Computing

    Science.gov (United States)

    Foster, Ian

    2001-08-01

    The term "Grid Computing" refers to the use, for computational purposes, of emerging distributed Grid infrastructures: that is, network and middleware services designed to provide on-demand and high-performance access to all important computational resources within an organization or community. Grid computing promises to enable both evolutionary and revolutionary changes in the practice of computational science and engineering based on new application modalities such as high-speed distributed analysis of large datasets, collaborative engineering and visualization, desktop access to computation via "science portals," rapid parameter studies and Monte Carlo simulations that use all available resources within an organization, and online analysis of data from scientific instruments. In this article, I examine the status of Grid computing circa 2000, briefly reviewing some relevant history, outlining major current Grid research and development activities, and pointing out likely directions for future work. I also present a number of case studies, selected to illustrate the potential of Grid computing in various areas of science.

  7. Indiana University receives grant from National Science Foundation to help build global grid network

    CERN Multimedia

    2001-01-01

    The NSF awarded a consortium of 15 universities $13.65 million to build the International Virtual Data Grid Laboratory, or iVDGL. The iVDGL will consist of a seamless network of thousands of computers at 40 locations in the US, Europe and Asia. These computers will work together as a powerful grid capable of handling petabytes of data. Indiana University will make significant contributions to this project by providing a prototype Tier-2 Data Center for the ATLAS high energy physics experiment and the International Grid Operations Center.

  8. Reliable multicast for the Grid: a case study in experimental computer science.

    Science.gov (United States)

    Nekovee, Maziar; Barcellos, Marinho P; Daw, Michael

    2005-08-15

    In its simplest form, multicast communication is the process of sending data packets from a source to multiple destinations in the same logical multicast group. IP multicast allows the efficient transport of data through wide-area networks, and its potentially great value for the Grid has been highlighted recently by a number of research groups. In this paper, we focus on the use of IP multicast in Grid applications, which require high-throughput reliable multicast. These include Grid-enabled computational steering and collaborative visualization applications, and wide-area distributed computing. We describe the results of our extensive evaluation studies of state-of-the-art reliable-multicast protocols, which were performed on the UK's high-speed academic networks. Based on these studies, we examine the ability of current reliable multicast technology to meet the Grid's requirements and discuss future directions.
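
    For readers unfamiliar with the underlying mechanism, the sketch below joins an IP multicast group using the standard socket API; it shows only the plain, unreliable UDP substrate that the reliable-multicast protocols evaluated in the paper build on, and the group address and port are placeholders:

```python
# Minimal receiver that joins an IP multicast group with the standard socket
# API. This shows the plain (unreliable) UDP substrate that reliable-multicast
# protocols build on; the group address and port are placeholders.
import socket
import struct

GROUP, PORT = "239.192.0.1", 5007   # placeholder multicast group/port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))

# ask the kernel to join the group on the default interface
mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

while True:
    data, sender = sock.recvfrom(65535)
    print(f"{len(data)} bytes from {sender}")
```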

  9. Open Data in Biomedical Science: Policy Drivers and Recent ...

    Science.gov (United States)

    EPA's progress in implementing the open data initiatives first outlined in the 2009 Presidential memorandum on open government and more specifically regarding publications and data from publications in the 2013 Holdren memorandum. The presentation outlines the major points in both memorandums regarding open data and presents several (but not exhaustive) EPA initiatives on open data, some of which occurred well before both policy memorandums. The presentation concludes by outlining the initiatives to ensure public access to all EPA publications through PubMed Central and all publication-associated data through the Environmental Data Gateway and Data.gov. The purpose of this presentation is to present EPA's progress in implementing the open data initiatives first outlined in the 2009 Presidential memorandum on open government and more specifically regarding publications and data from publications in the 2013 Holdren memorandum.

  10. Diagnosis method of an open-switch fault for a grid-connected T-type three-level inverter system

    DEFF Research Database (Denmark)

    Choi, U. M.; Lee, K. B.; Blaabjerg, Frede

    2012-01-01

    This paper proposes a diagnosis method for an open-switch fault and a fault-tolerant control algorithm for a grid-connected T-type three-level inverter. The location of the faulty switch is identified using the changes in the average phase current and the neutral-point voltage. The fault-tolerant control algorithm can be used when the open-switch fault occurs in the middle switches. It is achieved by simply modifying the conventional SVM method. The proposed methods are advantageous as they do not require additional sensors and do not involve complex calculations. Therefore, this method...
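
    As a generic illustration of the idea behind average-current diagnosis (not the authors' algorithm), an open switch clips one polarity of a phase current, so the per-period average of that phase drifts away from zero; the signals, clipping model, and threshold below are invented for the sketch:

```python
# Generic illustration (not the paper's algorithm) of average-current based
# diagnosis: an open-switch fault blocks one polarity of a phase current, so
# its average over a fundamental period deviates from zero. The signals,
# clipping model, and threshold are made up for this sketch.
import numpy as np

f, fs = 50.0, 10_000.0                      # fundamental and sampling frequency
t = np.arange(0, 1.0 / f, 1.0 / fs)         # one fundamental period

healthy = np.sin(2 * np.pi * f * t)         # normalized healthy phase current
faulty = np.clip(healthy, 0.0, None)        # open switch blocks the negative half-wave

def diagnose(i_phase, threshold=0.1):
    """Flag a phase whose per-period average current deviates from zero."""
    avg = np.mean(i_phase) / (np.max(np.abs(i_phase)) + 1e-12)
    return avg, abs(avg) > threshold

for name, i in [("healthy", healthy), ("faulty", faulty)]:
    avg, flag = diagnose(i)
    print(f"{name}: normalized average = {avg:+.3f}, fault suspected = {flag}")
```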

  11. Mobile Open-Source Solar-Powered 3-D Printers for Distributed Manufacturing in Off-Grid Communities

    Directory of Open Access Journals (Sweden)

    Debbie L. King

    2014-04-01

    Full Text Available Manufacturing in areas of the developing world that lack electricity severely restricts the technical sophistication of what is produced. More than a billion people with no access to electricity still have access to some imported higher technologies; however, these often lack customization and appropriateness for their community. Open source appropriate technology (OSAT) can overcome this challenge, but one of the key impediments to the more rapid development and distribution of OSAT is the lack of means of production beyond a specific technical complexity. This study designs and demonstrates the technical viability of two open-source mobile digital manufacturing facilities powered with solar photovoltaics, and capable of printing customizable OSAT in any community with access to sunlight. The first, designed for community use, such as in schools or makerspaces, is semi-mobile and capable of nearly continuous 3-D printing using RepRap technology, while also powering multiple computers. The second design, which can be completely packed into a standard suitcase, allows for specialist travel from community to community to provide the ability to custom manufacture OSAT as needed, anywhere. These designs not only bring the possibility of complex manufacturing and replacement part fabrication to isolated rural communities lacking access to the electric grid, but they also offer the opportunity to leap-frog the entire conventional manufacturing supply chain, while radically reducing both the cost and the environmental impact of products for developing communities.

  12. Open Source Software Alternatives in Higher Education Following Computer Science Curricula 2013

    Directory of Open Access Journals (Sweden)

    Borislav Stoyanov

    2014-09-01

    Full Text Available In this study we give an overview of open source software and describe the advantages and disadvantages of using open source software in modern Higher Education following Computer Science Curricula 2013. The study's main purposes are to clarify the understanding of open source software, to present alternatives to commercial software, and to demonstrate the potential benefits of integrating open source software in Higher Education.

  13. Virtual Labs (Science Gateways) as platforms for Free and Open Source Science

    Science.gov (United States)

    Lescinsky, David; Car, Nicholas; Fraser, Ryan; Friedrich, Carsten; Kemp, Carina; Squire, Geoffrey

    2016-04-01

    The Free and Open Source Software (FOSS) movement promotes community engagement in software development, as well as provides access to a range of sophisticated technologies that would be prohibitively expensive if obtained commercially. However, as geoinformatics and eResearch tools and services become more dispersed, it becomes more complicated to identify and interface between the many required components. Virtual Laboratories (VLs, also known as Science Gateways) simplify the management and coordination of these components by providing a platform linking many, if not all, of the steps in particular scientific processes. These enable scientists to focus on their science, rather than the underlying supporting technologies. We describe a modular, open source, VL infrastructure that can be reconfigured to create VLs for a wide range of disciplines. Development of this infrastructure has been led by CSIRO in collaboration with Geoscience Australia and the National Computational Infrastructure (NCI) with support from the National eResearch Collaboration Tools and Resources (NeCTAR) and the Australian National Data Service (ANDS). Initially, the infrastructure was developed to support the Virtual Geophysical Laboratory (VGL), and has subsequently been repurposed to create the Virtual Hazards Impact and Risk Laboratory (VHIRL) and the reconfigured Australian National Virtual Geophysics Laboratory (ANVGL). During each step of development, new capabilities and services have been added and/or enhanced. We plan on continuing to follow this model using a shared, community code base. The VL platform facilitates transparent and reproducible science by providing access to both the data and methodologies used during scientific investigations. This is further enhanced by the ability to set up and run investigations using computational resources accessed through the VL. Data is accessed using registries pointing to catalogues within public data repositories (notably including the

  14. Future opportunities and trends for e-infrastructures and life sciences: going beyond the grid to enable life science data analysis.

    Science.gov (United States)

    Duarte, Afonso M S; Psomopoulos, Fotis E; Blanchet, Christophe; Bonvin, Alexandre M J J; Corpas, Manuel; Franc, Alain; Jimenez, Rafael C; de Lucas, Jesus M; Nyrönen, Tommi; Sipos, Gergely; Suhr, Stephanie B

    2015-01-01

    With the increasingly rapid growth of data in life sciences we are witnessing a major transition in the way research is conducted, from hypothesis-driven studies to data-driven simulations of whole systems. Such approaches necessitate the use of large-scale computational resources and e-infrastructures, such as the European Grid Infrastructure (EGI). EGI, one of the key enablers of the digital European Research Area, is a federation of resource providers set up to deliver sustainable, integrated and secure computing services to European researchers and their international partners. Here we aim to provide the state of the art of Grid/Cloud computing in EU research as viewed from within the field of life sciences, focusing on key infrastructures and projects within the life sciences community. Rather than focusing purely on the technical aspects underlying the currently provided solutions, we outline the design aspects and key characteristics that can be identified across major research approaches. Overall, we aim to provide significant insights into the road ahead by establishing ever-strengthening connections between EGI as a whole and the life sciences community.

  15. The Earth System Grid Federation (ESGF): Climate Science Infrastructure for Large-scale Data Management and Dissemination

    Science.gov (United States)

    Williams, D. N.

    2015-12-01

    Progress in understanding and predicting climate change requires advanced tools to securely store, manage, access, process, analyze, and visualize enormous and distributed data sets. Only then can climate researchers understand the effects of climate change across all scales and use this information to inform policy decisions. With the advent of major international climate modeling intercomparisons, a need emerged within the climate-change research community to develop efficient, community-based tools to obtain relevant meteorological and other observational data, develop custom computational models, and export analysis tools for climate-change simulations. While many nascent efforts to fill these gaps appeared, they were not integrated and therefore did not benefit from collaborative development. Sharing huge data sets was difficult, and the lack of data standards prevented the merger of output data from different modeling groups. Thus began one of the largest-ever collaborative data efforts in climate science, resulting in the Earth System Grid Federation (ESGF), which is now used to disseminate model, observational, and reanalysis data for research assessed by the Intergovernmental Panel on Climate Change (IPCC). Today, ESGF is an open-source petabyte-level data storage and dissemination operational code-base that manages secure resources essential for climate change study. It is designed to remain robust even as data volumes grow exponentially. The internationally distributed, peer-to-peer ESGF "data cloud" archive represents the culmination of an effort that began in the late 1990s. ESGF portals are gateways to scientific data collections hosted at sites around the globe that allow the user to register and potentially access the entire ESGF network of data and services. The growing international interest in ESGF development efforts has attracted many others who want to make their data more widely available and easy to use. For example, the World Climate
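
    Programmatic access to ESGF data typically starts with the federation's search REST API. The hedged sketch below queries one index node for monthly near-surface air temperature datasets; the node URL, facet names, and response layout follow the public API as commonly documented and should be verified before use:

```python
# Hedged sketch of querying the ESGF search REST API for datasets; the node
# URL, facet names, and response layout follow the public API as commonly
# documented and should be checked against current ESGF documentation.
import requests

SEARCH = "https://esgf-node.llnl.gov/esg-search/search"  # one ESGF index node

params = {
    "project": "CMIP6",
    "variable": "tas",          # near-surface air temperature
    "frequency": "mon",
    "latest": "true",
    "limit": 5,
    "format": "application/solr+json",
}

resp = requests.get(SEARCH, params=params, timeout=60)
resp.raise_for_status()

for doc in resp.json()["response"]["docs"]:
    print(doc.get("id"))
```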

  16. Open university of the Netherlands & Centre for Learning Sciences and Technologies

    NARCIS (Netherlands)

    Janssen, José; Stoyanov, Slavi

    2012-01-01

    Janssen, J., & Stoyanov, S. (2011, 14 December). Open University of the Netherlands & Centre for Learning Sciences and Technologies. Presentation for The Institute for Prospective Technological Studies (IPTS), Seville, Spain.

  17. Open-access databases as unprecedented resources and drivers of cultural change in fisheries science

    Energy Technology Data Exchange (ETDEWEB)

    McManamay, Ryan A [ORNL; Utz, Ryan [National Ecological Observatory Network

    2014-01-01

    Open-access databases with utility in fisheries science have grown exponentially in quantity and scope over the past decade, with profound impacts on our discipline. The management, distillation, and sharing of an exponentially growing stream of open-access data present several fundamental challenges in fisheries science. Many of the currently available open-access resources may not be universally known among fisheries scientists. We therefore introduce many national- and global-scale open-access databases with applications in fisheries science and provide an example of how they can be harnessed to perform valuable analyses without additional field efforts. We also discuss how the development, maintenance, and utilization of open-access data are likely to pose technical, financial, and educational challenges to fisheries scientists. The cultural implications that will coincide with the rapidly increasing availability of free data should compel the American Fisheries Society to actively address these problems now to help ease the forthcoming cultural transition.

  18. On the evolving open peer review culture for chemical information science.

    Science.gov (United States)

    Walters, W Patrick; Bajorath, Jürgen

    2015-01-01

    Compared to the traditional anonymous peer review process, open post-publication peer review provides additional opportunities (and challenges) for reviewers to judge scientific studies. In this editorial, we comment on the open peer review culture and provide some guidance for reviewers of manuscripts submitted to the Chemical Information Science channel of F1000Research.

  19. Perspectives on open science and scientific data sharing: An interdisciplinary workshop

    NARCIS (Netherlands)

    Destro Bisol, G.; Anagnostou, P.; Capocasa, M.; Bencivelli, S.; Cerroni, A.; Contreras, J.; Enke, N.; Fantini, B.; Greco, P.; Heeney, C.; Luzi, D.; Manghi, P.; Mascalzoni, D.; Molloy, J.; Parenti, F.; Wicherts, J.M.; Boulton, G.

    2014-01-01

    Looking at Open Science and Open Data from a broad perspective. This is the idea behind “Scientific data sharing: an interdisciplinary workshop”, an initiative designed to foster dialogue between scholars from different scientific domains which was organized by the Istituto Italiano di Antropologia

  20. Open Access and the Transformation of Science-the Time is Ripe

    Institute of Scientific and Technical Information of China (English)

    Jan Velterop

    2005-01-01

    We are experiencing a 'climate change' in science publishing and we will witness major changes in the way it is being done, resulting in more and more open access to the scientific research literature. The Internet makes it all possible, and the impact on science will be phenomenal.

  1. 75 FR 9876 - Science Advisory Board; Notice of Open Meeting

    Science.gov (United States)

    2010-03-04

    ..., education, and application of science to operations and information services. SAB activities and advice... meeting agenda. Place: The meeting will be held both days at the Dupont Hotel, 1500 New Hampshire Avenue... to the SAB Climate Service Options Report and (10) NOAA Next Generation Strategic Plan. FOR FURTHER...

  2. 'Open SESAME' for science in the Middle-East

    CERN Multimedia

    2003-01-01

    A memorandum of understanding has just been signed between CERN, SESAME and Jordan. SESAME, the international centre for Synchrotron light for Experimental Science and Applications in the Middle East, is currently being built in Jordan. Its President of Council is none other than CERN's former Director-General, Herwig Schopper.

  3. OGC and Grid Interoperability in enviroGRIDS Project

    Science.gov (United States)

    Gorgan, Dorian; Rodila, Denisa; Bacu, Victor; Giuliani, Gregory; Ray, Nicolas

    2010-05-01

    EnviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is a 4-year FP7 project aiming to address the subjects of ecologically unsustainable development and inadequate resource management. The project develops a Spatial Data Infrastructure of the Black Sea Catchment region. Geospatial technologies offer very specialized functionality for Earth Science oriented applications, while Grid-oriented technology is able to support distributed and parallel processing. One challenge of the enviroGRIDS project is the interoperability between geospatial and Grid infrastructures, providing the basic and extended features of both technologies. Geospatial interoperability technology has been promoted as a way of dealing with large volumes of geospatial data in distributed environments through the development of interoperable Web service specifications proposed by the Open Geospatial Consortium (OGC), with applications spread across multiple fields but especially in Earth observation research. Due to the huge volumes of data available in the geospatial domain and the additional issues this introduces (data management, secure data transfer, data distribution and data computation), the need for an infrastructure capable of managing all those problems becomes an important aspect. The Grid promotes and facilitates the secure interoperation of heterogeneous, distributed geospatial data within a distributed environment, supports the creation and management of large distributed computational jobs, and assures a level of security for communication and transfer of messages based on certificates. This presentation analyses and discusses the most significant use cases for enabling OGC Web services interoperability with the Grid environment and focuses on the description and implementation of the most promising one. In these use cases we give special attention to issues such as: the relations between computational grid and
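
    On the OGC side of the interoperability discussed above, a client might consume a Web Map Service as in the sketch below (using the OWSLib library); the service URL, layer name, and bounding box are placeholders rather than actual enviroGRIDS endpoints:

```python
# Minimal sketch of consuming an OGC Web Map Service with OWSLib; the service
# URL, layer name, and bounding box are placeholders for a real catchment WMS.
from owslib.wms import WebMapService

wms = WebMapService("https://maps.example.org/wms", version="1.1.1")  # placeholder
print("available layers:", list(wms.contents)[:10])

img = wms.getmap(
    layers=["black_sea_catchment"],        # placeholder layer name
    srs="EPSG:4326",
    bbox=(27.0, 40.0, 42.0, 53.0),         # rough lon/lat window, illustrative
    size=(600, 400),
    format="image/png",
    transparent=True,
)
with open("catchment.png", "wb") as out:
    out.write(img.read())
```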

  4. OSG Director reports on grid progress

    CERN Multimedia

    Pordes, Ruth

    2006-01-01

    "In this Q&A from the Open Science Grid (OSG), executive director Ruth Prodes provides a brief history of the OSG, an overview of current projects and partners, and a glimpse at future plans, including how the recent $30 million award from the ODE's office of Science and the NSF will be employed. She also shares her thoughts of SC, saying the personal contacts are the best part."(4,5 pages)

  5. The Open Microscopy Environment: open image informatics for the biological sciences

    Science.gov (United States)

    Blackburn, Colin; Allan, Chris; Besson, Sébastien; Burel, Jean-Marie; Carroll, Mark; Ferguson, Richard K.; Flynn, Helen; Gault, David; Gillen, Kenneth; Leigh, Roger; Leo, Simone; Li, Simon; Lindner, Dominik; Linkert, Melissa; Moore, Josh; Moore, William J.; Ramalingam, Balaji; Rozbicki, Emil; Rustici, Gabriella; Tarkowska, Aleksandra; Walczysko, Petr; Williams, Eleanor; Swedlow, Jason R.

    2016-07-01

    Despite significant advances in biological imaging and analysis, major informatics challenges remain unsolved: file formats are proprietary, storage and analysis facilities are lacking, as are standards for sharing image data and results. While the open FITS file format is ubiquitous in astronomy, astronomical imaging shares many challenges with biological imaging, including the need to share large image sets using secure, cross-platform APIs, and the need for scalable applications for processing and visualization. The Open Microscopy Environment (OME) is an open-source software framework developed to address these challenges. OME tools include: an open data model for multidimensional imaging (OME Data Model); an open file format (OME-TIFF) and library (Bio-Formats) enabling free access to images (5D+) written in more than 145 formats from many imaging domains, including FITS; and a data management server (OMERO). The Java-based OMERO client-server platform comprises an image metadata store, an image repository, and remote-access visualization and analysis, allowing sharing and publishing of image data. OMERO provides a means to manage the data through a multi-platform API. OMERO's model-based architecture has enabled its extension into a range of imaging domains, including light and electron microscopy, high content screening, digital pathology and, more recently, applications using non-image data from clinical and genomic studies. This is made possible using the Bio-Formats library. The current release includes a single mechanism for accessing image data of all types, regardless of original file format, via Java, C/C++ and Python and a variety of applications and environments (e.g. ImageJ, Matlab and R).
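
    As a small, hedged illustration of the remote Python access mentioned in the record, the sketch below connects to an OMERO server with the omero-py BlitzGateway and lists projects; the host, credentials and port are placeholder assumptions.

```python
# Minimal OMERO remote-access sketch using omero-py; connection details are placeholders.
from omero.gateway import BlitzGateway

conn = BlitzGateway("username", "password", host="omero.example.org", port=4064)
if conn.connect():
    # Iterate over the projects visible to this user and print basic metadata.
    for project in conn.getObjects("Project"):
        print(project.getId(), project.getName())
    conn.close()
else:
    print("Could not connect to the OMERO server")
```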

  6. Successful Massive Open Online Climate Course on Climate Science and Psychology

    Science.gov (United States)

    Nuccitelli, D. A.; Cook, J.

    2015-12-01

    In 2015, the University of Queensland and edX launched a Massive Open Online Course (MOOC), 'Making Sense of Climate Science Denial.' The MOOC debunked approximately 50 common climate myths using elements of both physical science and psychology. Students learned how to recognise the social and psychological drivers of climate science denial, how to better understand climate change, how to identify the techniques and fallacies that climate myths employ to distort climate science, and how to effectively debunk climate misinformation. Contributors to the website Skeptical Science delivered the lectures, which were reinforced via interviews with climate science and psychology experts. Over 15,000 students from 167 countries enrolled in the course, and student feedback was overwhelmingly positive. This MOOC provides a model for effective climate science education.

  7. The OpenEarth Framework (OEF) for the 3D Visualization of Integrated Earth Science Data

    Science.gov (United States)

    Nadeau, David; Moreland, John; Baru, Chaitan; Crosby, Chris

    2010-05-01

    Data integration is increasingly important as we strive to combine data from disparate sources and assemble better models of the complex processes operating at the Earth's surface and within its interior. These data are often large, multi-dimensional, and subject to differing conventions for data structures, file formats, coordinate spaces, and units of measure. When visualized, these data require differing, and sometimes conflicting, conventions for visual representations, dimensionality, symbology, and interaction. All of this makes the visualization of integrated Earth science data particularly difficult. The OpenEarth Framework (OEF) is an open-source data integration and visualization suite of applications and libraries being developed by the GEON project at the University of California, San Diego, USA. Funded by the NSF, the project is leveraging virtual globe technology from NASA's WorldWind to create interactive 3D visualization tools that combine and layer data from a wide variety of sources to create a holistic view of features at, above, and beneath the Earth's surface. The OEF architecture is open, cross-platform, modular, and based upon Java. The OEF's modular approach to software architecture yields an array of mix-and-match software components for assembling custom applications. Available modules support file format handling, web service communications, data management, user interaction, and 3D visualization. File parsers handle a variety of formal and de facto standard file formats used in the field. Each one imports data into a general-purpose common data model supporting multidimensional regular and irregular grids, topography, feature geometry, and more. Data within these data models may be manipulated, combined, reprojected, and visualized. The OEF's visualization features support a variety of conventional and new visualization techniques for looking at topography, tomography, point clouds, imagery, maps, and feature geometry. 3D data such as

  8. SEE-GRID eInfrastructure for Regional eScience

    Science.gov (United States)

    Prnjat, Ognjen; Balaz, Antun; Vudragovic, Dusan; Liabotis, Ioannis; Sener, Cevat; Marovic, Branko; Kozlovszky, Miklos; Neagu, Gabriel

    In the past 6 years, a number of targeted initiatives, funded by the European Commission via its information society and RTD programmes and by Greek infrastructure development actions, have articulated successful regional development actions in South East Europe that can be used as a role model for other international developments. The SEEREN (South-East European Research and Education Networking initiative) project, through its two phases, established the SEE segment of the pan-European GÉANT network and successfully connected the research and scientific communities in the region. Currently, the SEE-LIGHT project is working towards establishing a dark-fiber backbone that will interconnect most national Research and Education networks in the region. On the distributed computing and storage provisioning (i.e. Grid) plane, the SEE-GRID (South-East European GRID e-Infrastructure Development) project, similarly through its two phases, has established a strong human network in the area of scientific computing, has set up a powerful regional Grid infrastructure, and has attracted a number of applications from different fields from countries throughout South-East Europe. The current SEE-GRID-SCI project, ending in April 2010, empowers the regional user communities from the fields of meteorology, seismology and environmental protection in the common use and sharing of the regional e-Infrastructure. Current technical initiatives in formulation are focusing on a set of coordinated actions in the area of HPC and on application fields making use of HPC initiatives. Finally, the current SEERA-EI project brings together policy makers - programme managers from 10 countries in the region. The project aims to establish a communication platform between programme managers, pave the way towards a common e-Infrastructure strategy and vision, and implement concrete actions for common funding of electronic infrastructures at the regional level. The regional vision on establishing an e

  9. Open Access Citation Advantage in selected Information Science journals: an extended analysis to altmetrics indicators

    Directory of Open Access Journals (Sweden)

    Paulo Roberto Cintra

    2017-04-01

    Full Text Available Introduction: Open access refers to scientific literature available free of charge and free of copyright restrictions and licensing for its reuse. An increase in the total number of citations received by articles available in open access in relation to those of restricted, pay-walled access is expected, according to the Open Access Citation Advantage hypothesis. Objective: Assess the possible citation advantages and mentions on the social web that open access can offer to the Information Science area. Methodology: Bibliometric and altmetric indicators were analyzed in two journals: Journal of the American Society for Information Science and Scientometrics. Data collection was conducted in the Web of Science, Google Scholar, Altmetric.com and Mendeley. Results: The results indicated that for both journals, open access offers an advantage in the number of citations received by articles. It was also demonstrated that the advantage is maintained over time. Conclusions: This research confirmed the hypothesis of an Open Access Citation Advantage for the journals analyzed in the area of Information Science. This pattern was also observed for the altmetric data.

  10. Opening back up a path to participation in exoplanet science

    Science.gov (United States)

    Taylor, Stuart F.

    2015-08-01

    We present a long pursuit of participation in exoplanet science that, after making good progress, has been blocked, while others have been caused by supervisors to misrepresent a group of authors as being one person fewer than the actual contributors. We present first a long period of preparation to join a project such as the private global telescope observatory, followed by setting up observational programs that have been presented as successes by those allowed to finish these projects while leaving out the first astronomer. We present subsequent efforts to recover from being ostracized, both by seeking alternative routes to participation and by seeking means to take back the participation cut off without cause. This is a campaign for support from the community to go around the obstructive group by restoring memberships to those groups from which the target of ostracism has been kept out. We present the ideas and contributions given to colleagues to support the observatory being a member institution of the Kepler project, including starting the observatory's first planet confirmation observations and first transit timing observations. Contributed techniques for which credit was taken include weighting the reference stars. Contributions include demonstrating the importance of a wider FOV camera and obtaining better photometric stability. Replacement efforts include transients from planet destruction and using the location of the falloff to measure the rate of planets migrating into stars. We specifically ask the planet-finding groups supported by this observatory to support restoring the opportunity for membership in their collaborations. The long effort to join the Kepler and TESS science teams is well documented. We publicly campaign for these groups to not tolerate ostracism and discrimination, by requiring this observatory to provide due access to its due members in order to restore allowing the target of ostracism to take back earned roles in confirming and characterizing the

  11. New Source Code: Spelman Women Transforming the Grid of Science and Technology

    Science.gov (United States)

    Okonkwo, Holly

    From a seminary for newly freedwomen in the 19th century "Deep South" of the United States to a "Model Institution for Excellence" in undergraduate science, technology, engineering, and math education, the narrative of Spelman College is a critical piece to understanding the overall history and socially constructed nature of science and higher education in the U.S. Making a place for science at Spelman College disrupts and redefines the presumed and acceptable roles of African American women in science and their social, political and economic engagements in U.S. society as a whole. Over the course of 16 months, I explore the narrative experiences of members of the Spelman campus community and immerse myself in the environment to experience becoming a member of a scientific community that asserts a place for women of African descent in science and technology and perceives this positionality as positive, powerful and the locus of agency. My intention is to offer this research as an in-depth ethnographic presentation of intentional science learning, knowledge production and practice as lived experiences at the multiple intersections of the constructs of race, gender, positionality and U.S. science itself. In this research, I am motivated to move the contemporary discourse on diversifying science, technology, engineering and mathematics fields in the U.S. academy beyond the chronicling of women of African descent as statistical rarities over time, as subjectivities, and the deficit frameworks that theoretically encapsulate their narratives. The findings of this research demonstrate that Spelman students, staff and alumni are themselves the cultural capital that validates Spelman's identity as a place and its institutional mission, and are at the core of the institutional success of the college. It is a personal mission as much as it is an institutional mission, which is precisely what makes it powerful.

  12. Research, collaboration, and open science using web 2.0.

    Science.gov (United States)

    Shee, Kevin; Strong, Michael; Guido, Nicholas J; Lue, Robert A; Church, George M; Viel, Alain

    2010-01-01

    There is little doubt that the Internet has transformed the world in which we live. Information that was once archived in bricks and mortar libraries is now only a click away, and people across the globe have become connected in a manner inconceivable only 20 years ago. Although many scientists and educators have embraced the Internet as an invaluable tool for research, education and data sharing, some have been somewhat slower to take full advantage of emerging Web 2.0 technologies. Here we discuss the benefits and challenges of integrating Web 2.0 applications into undergraduate research and education programs, based on our experience utilizing these technologies in a summer undergraduate research program in synthetic biology at Harvard University. We discuss the use of applications including wiki-based documentation, digital brainstorming, and open data sharing via the Web, to facilitate the educational aspects and collaborative progress of undergraduate research projects. We hope to inspire others to integrate these technologies into their own coursework or research projects.

  13. Research, Collaboration, and Open Science Using Web 2.0

    Directory of Open Access Journals (Sweden)

    Kevin Shee

    2010-10-01

    Full Text Available There is little doubt that the Internet has transformed the world in which we live. Information that was once archived in bricks and mortar libraries is now only a click away, and people across the globe have become connected in a manner inconceivable only 20 years ago. Although many scientists and educators have embraced the Internet as an invaluable tool for research, education and data sharing, some have been somewhat slower to take full advantage of emerging Web 2.0 technologies. Here we discuss the benefits and challenges of integrating Web 2.0 applications into undergraduate research and education programs, based on our experience utilizing these technologies in a summer undergraduate research program in synthetic biology at Harvard University. We discuss the use of applications including wiki-based documentation, digital brainstorming, and open data sharing via the Web, to facilitate the educational aspects and collaborative progress of undergraduate research projects. We hope to inspire others to integrate these technologies into their own coursework or research projects.

  14. Towards a Post Reductionist Science: The Open Universe

    CERN Document Server

    Kauffman, Stuart

    2009-01-01

    In this paper I discuss the reality that deductive inference is not the only way we explain in science. I discuss the role of the opportunity for an adaptation in the biosphere and claim that such an opportunity is a 'blind final cause', not an efficient cause, yet shapes evolution. I also argue that Darwinian exaptations are not describable by sufficient natural law. Based on an argument of Sir Karl Popper, I claim that no law, or function, f, maps a decoherence process in a Special Relativity setting from a specific space-time slice into its future. If true this suggests there can be no theory of everything entailing all that happens. I then discuss whether we can view laws as 'enabling constraints' and what they enable. Finally, in place of the weak Anthropic principle in a multiverse, I suggest that we might consider Darwin all the way down. It is not impossible that a single universe has an abiotic natural selection process for laws as enabling constraints and that the single universe that 'wins' is ours...

  15. Distributed Geant4 simulation in medical and space science applications using DIANE framework and the GRID

    CERN Document Server

    Moscicki, J T; Mantero, A; Pia, M G

    2003-01-01

    Distributed computing is one of the most important trends in IT, which has recently gained significance for large-scale scientific applications. The Distributed Analysis Environment (DIANE) is an R&D study, focusing on semi-interactive parallel and remote data analysis and simulation, which has been conducted at CERN. DIANE provides the necessary software infrastructure for parallel scientific applications in the master-worker model. Advanced error recovery policies, automatic book-keeping of distributed jobs and on-line monitoring and control tools are provided. DIANE makes transparent use of a number of different middleware implementations, such as load balancing services (LSF, PBS, GRID Resource Broker, Condor) and security services (GSI, Kerberos, openssh). A number of distributed Geant4 simulations have been deployed and tested, ranging from interactive radiotherapy treatment planning using dedicated clusters in hospitals, to globally-distributed simulations of astrophysics experiments using the European data g...
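
    The record above describes DIANE's master-worker model rather than its API; as a generic, hedged illustration of that model only (plain Python multiprocessing, not DIANE), the sketch below shows a master splitting a simulation into independent work units and collecting the workers' results.

```python
# Generic master-worker sketch (illustrative only; not the DIANE API).
from multiprocessing import Pool

def run_simulation(work_unit):
    """Worker task: stand-in for simulating n_events events with a given seed."""
    seed, n_events = work_unit
    # A real worker would invoke a Geant4 simulation here; we just return a summary.
    return {"seed": seed, "events": n_events}

if __name__ == "__main__":
    # Master: define independent work units and distribute them to a worker pool.
    work_units = [(seed, 10_000) for seed in range(8)]
    with Pool(processes=4) as pool:
        results = pool.map(run_simulation, work_units)

    # Master: merge partial results; book-keeping, error recovery and monitoring
    # are the pieces a framework like DIANE layers on top of this basic pattern.
    total_events = sum(r["events"] for r in results)
    print(f"Completed {len(results)} work units, {total_events} events in total")
```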

  16. Analysis gets on the starting Grid

    CERN Multimedia

    Roger Jones

    It is vital for ATLAS to have a functioning distributed analysis system to analyse its data. There are three major Grid deployments in ATLAS (Enabling Grids for E-sciencE, EGEE; the US Open Science Grid, OSG; and the Nordic DataGrid Facility, NDGF), and our data and jobs need to work across all of them, as well as on local machines and batch systems. Users must also be able to locate the data they want and register new small datasets so they can be used later. ATLAS has a suite of products to meet these needs, and a series of Distributed Analysis tutorials are training an increasing number of brave early adopters to use the system. Real users are vital to make sure that the tools are fit for their purpose and to refine our computing model. One such tutorial happened on the 1st and 2nd of February at the National eScience Centre in Edinburgh, UK, sponsored by the GridPP Collaboration. The first day introduced an international set of tutees to the basic tools for Grid-based distributed analysis. The architecture...

  17. Design and Implementation of a Library and Information Science Open Access Journal Union Catalogue System

    Directory of Open Access Journals (Sweden)

    Sinn-Cheng Lin

    2017-03-01

    Full Text Available Open access is a mode of academic communication that has been on the rise in recent years, but open access academic resources are widely dispersed across the internet, making them occasionally inconvenient to use. This research is focused on library and information science, using the OAIS reference model as the system framework and two open access platforms, DOAJ and E-LIS, as the data sources, and through system implementation develops a "library and information science open access journal union catalogue" system. Using the OAI-PMH protocol as the data interoperability standard and LAMP as the development environment, four major functionalities (ingest, archiving, management and access of information) were designed, developed, and integrated into the system. Actual testing and verification showed this system is able to successfully collect data from DOAJ and E-LIS open journal resources related to library and information science. The system is now active and functional, and can be used by researchers in the library and information science field.

  18. Open science: Investigating precipitation cycles in dynamically downscaled data using openly available radar data and open source software

    Science.gov (United States)

    Collis, Scott; helmus, Jonathan; Kotamarthi, Rao; Wang, Jiali; Feng, Yan; Ghate, Virendra

    2016-04-01

    In order to assess infrastructure resilience to climate change in urban centers, climate model output is needed at the spatial resolutions required for urban planning. This is most commonly achieved at present using either empirical or dynamic downscaling. The utility of these downscaling methods for assessments depends on having estimates of the biases in the models' estimates of climate variables and their extremes (surface temperature and precipitation, for example), developed using historical data sets. Since precipitation is a multi-scale stochastic process, direct comparison with observations is challenging, and even modern data sets work at scales too coarse to capture extreme events. Gauge data require a direct hit by a storm to see the highest rain rates, often leading to an underestimation of the 1-100 year rainfall. This is exacerbated by phenomena such as training that can cause very high gradients in accumulation. This presentation details a long-term (multi-year) study of precipitation derived from open data from the NOAA Next-Generation Radar (NEXRAD) network. Two locations are studied: Portland, Maine, the location of a pilot study conducted by the US Department of Homeland Security on regional resilience to climate change, and the Southern Great Plains of Oklahoma, home to the Department of Energy's ARM program. Both are located within 40 km of a NEXRAD radar, allowing retrievals of rainfall rates on the order of one kilometer using the Python-ARM Radar Toolkit (Py-ART). Both the diurnal and seasonal cycles of precipitation are studied and compared to WRF dynamically downscaled precipitation rates. This project makes heavy use of open source community tools such as Project Jupyter and the Scientific Python ecosystem to manage and process tens of TB of data on midrange cluster infrastructure. Both the meteorological aspects and the data infrastructure and architecture will be discussed.
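
    As a hedged sketch of the kind of Py-ART retrieval described above (not code from the study), the snippet below reads a NEXRAD Level II volume and maps reflectivity onto a roughly 1 km Cartesian grid; the file name and grid parameters are illustrative assumptions.

```python
# Minimal Py-ART sketch: read a NEXRAD archive file and grid reflectivity.
import pyart

# Hypothetical NEXRAD Level II archive file (e.g. from the KGYX radar near Portland, ME).
radar = pyart.io.read_nexrad_archive("KGYX20140715_000347_V06.gz")

# Map reflectivity onto a single-level Cartesian grid at ~1 km horizontal spacing.
grid = pyart.map.grid_from_radars(
    (radar,),
    grid_shape=(1, 241, 241),
    grid_limits=((0.0, 2000.0), (-120000.0, 120000.0), (-120000.0, 120000.0)),
    fields=["reflectivity"],
)

# The gridded reflectivity could then be converted to rain rate and accumulated
# over diurnal and seasonal cycles for comparison with downscaled WRF output.
print(grid.fields["reflectivity"]["data"].shape)
```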

  19. Federated Space-Time Query for Earth Science Data Using OpenSearch Conventions

    Science.gov (United States)

    Lynnes, Chris; Beaumont, Bruce; Duerr, Ruth; Hua, Hook

    2009-01-01

    This slide presentation reviews a space-time query system that has been developed to assist the user in finding Earth science data that fulfills the researcher's needs. It reviews the reasons why finding Earth science data can be so difficult, and explains the workings of the space-time query with OpenSearch and how this system can assist researchers in finding the required data. It also reviews developments with client-server systems.

  20. Alchemy & algorithms: perspectives on the philosophy and history of open science

    Directory of Open Access Journals (Sweden)

    Leo Lahti

    2017-05-01

    Full Text Available This paper gives the reader a chance to experience, or revisit, PHOS16: a conference on the History and Philosophy of Open Science. In the winter of 2016, we invited a varied international group to engage with these topics at the University of Helsinki, Finland. Our aim was a critical assessment of the defining features, underlying narratives, and overall objectives of the contemporary open science movement. The event brought together contemporary open science scholars, publishers, and advocates to discuss the philosophical foundations and historical roots of openness in academic research. The eight sessions combined historical views with more contemporary perspectives on topics such as transparency, reproducibility, collaboration, publishing, peer review, research ethics, as well as societal impact and engagement. We gathered together expert panelists and 15 invited speakers who have published extensively on these topics, which allowed us to engage in a thorough and multifaceted discussion. Together with our involved audience we charted the role and foundations of openness of research in our time, considered the accumulation and dissemination of scientific knowledge, and debated the various technical, legal, and ethical challenges of the past and present. In this article, we provide an overview of the topics covered at the conference as well as individual video interviews with each speaker. In addition to this, all the talks were recorded and they are offered here as an openly licensed community resource in both video and audio form.

  1. Editorial for special section of grid computing journal on "Cloud Computing and Services Science"

    OpenAIRE

    Sinderen, van, Marten; Ivanov, Ivan I.

    2014-01-01

    This editorial briefly discusses characteristics, technology developments and challenges of cloud computing. It then introduces the papers included in the special issue on "Cloud Computing and Services Science" and positions the work reported in these papers with respect to the previously mentioned challenges.

  2. Editorial for special section of grid computing journal on "Cloud Computing and Services Science"

    NARCIS (Netherlands)

    Sinderen, van Marten J.; Ivanov, Ivan I.

    2014-01-01

    This editorial briefly discusses characteristics, technology developments and challenges of cloud computing. It then introduces the papers included in the special issue on "Cloud Computing and Services Science" and positions the work reported in these papers with respect to the previously mentioned

  3. Extending Binary Large Object Support to Open Grid Services Architecture-Data Access and Integration Middleware Client Toolkit

    Directory of Open Access Journals (Sweden)

    Kiran K. Patnaik

    2011-01-01

    Full Text Available Problem statement: OGSA-DAI middleware allows data resources to be federated and accessed via web services on the web or within grids or clouds. It provides a client API for writing programs that access the exposed databases. Migrating existing applications to the new technology and using a new API to access the data of a DBMS with BLOBs is difficult and discouraging. A JDBC driver is a much more convenient alternative to the existing mechanism; it provides an extension to the OGSA-DAI middleware and allows applications to use databases exposed in a grid through OGSA-DAI 3.0. However, the driver does not support Binary Large Objects (BLOBs). Approach: The driver is enhanced to support BLOBs using the OGSA-DAI Client API. It transforms the JDBC calls into an OGSA-DAI workflow request and sends it to the server using Web Services (WS). The client API of OGSA-DAI uses activities that are connected to form a workflow and executed using a pipeline. This workflow mechanism is embedded into the driver. The WS container dispatches the request to the OGSA-DAI middleware for processing, and the result is then transformed back into an instance of the ResultSet implementation using the OGSA-DAI Client API before it is returned to the user. Results: Tests on the handling of BLOBs (images, flash files and videos) ranging from 1 KB to 2 GB in size were carried out on Oracle, MySQL and PostgreSQL databases using our enhanced JDBC driver, and it performed well. Conclusion: The enhanced JDBC driver now offers users with no experience in Grid computing, specifically in OGSA-DAI, the possibility of giving their applications the ability to access databases exposed on the grid with minimal effort.

  4. Unlocking the Full Potential of Open Innovation in the Life Sciences through Classification Standards (under review)

    DEFF Research Database (Denmark)

    Nilsson, Niclas; Minssen, Timo

    2017-01-01

    ecosystem. The need to put such policies into practice is also acknowledged in international collaborations, such as the Innovative Medicines Initiative (IMI) exploring new models for public-private partnerships [3]. Open Innovation enables a more efficient dialogue between early and late stage research...... the identification of research-driven business opportunities. Transparent communication requires a common definition and standard for open innovation, to align both parties' expectations. In this paper we suggest a 5-level classification system for the level of openness, to reduce the contract negotiation...... complexity and times, between two parties looking to engage in open innovation. The intention is to systematize definitions and contractual terms for open innovation from an operational aspect in the life science industry to reduce entry barriers and boost collaborative value generation....

  5. Introducing Enabling Computational Tools to the Climate Sciences: Multi-Resolution Climate Modeling with Adaptive Cubed-Sphere Grids

    Energy Technology Data Exchange (ETDEWEB)

    Jablonowski, Christiane [Univ. of Michigan, Ann Arbor, MI (United States)

    2015-07-14

    The research investigates and advances strategies for bridging the scale discrepancies between local, regional and global phenomena in climate models without the prohibitive computational costs of global cloud-resolving simulations. In particular, the research explores new frontiers in computational geoscience by introducing high-order Adaptive Mesh Refinement (AMR) techniques into climate research. AMR and statically-adapted variable-resolution approaches represent an emerging trend for atmospheric models and are likely to become the new norm in future-generation weather and climate models. The research advances the understanding of multi-scale interactions in the climate system and showcases a pathway for modeling these interactions effectively with advanced computational tools, like the Chombo AMR library developed at the Lawrence Berkeley National Laboratory. The research is interdisciplinary and combines applied mathematics, scientific computing and the atmospheric sciences. In this research project, a hierarchy of high-order atmospheric models on cubed-sphere computational grids has been developed that serves as an algorithmic prototype for the finite-volume solution-adaptive Chombo-AMR approach. The foci of the investigations have been the characteristics of both static mesh adaptations and dynamically-adaptive grids that can capture flow fields of interest, like tropical cyclones. Six research themes have been chosen. These are (1) the introduction of adaptive mesh refinement techniques into the climate sciences, (2) advanced algorithms for nonhydrostatic atmospheric dynamical cores, (3) an assessment of the interplay between resolved-scale dynamical motions and subgrid-scale physical parameterizations, (4) evaluation techniques for atmospheric model hierarchies, (5) the comparison of AMR refinement strategies and (6) tropical cyclone studies with a focus on multi-scale interactions and variable-resolution modeling. The results of this research project

  6. INDIGO: Building a DataCloud Framework to support Open Science

    Science.gov (United States)

    Chen, Yin; de Lucas, Jesus Marco; Aguilar, Fenando; Fiore, Sandro; Rossi, Massimiliano; Ferrari, Tiziana

    2016-04-01

    New solutions are required to support Data Intensive Science in the emerging panorama of e-infrastructures, including Grid, Cloud and HPC services. The architecture proposed by the INDIGO-DataCloud (INtegrating Distributed data Infrastructures for Global ExplOitation) (https://www.indigo-datacloud.eu/) H2020 project, provides the path to integrate IaaS resources and PaaS platforms to provide SaaS solutions, while satisfying the requirements posed by different Research Communities, including several in Earth Science. This contribution introduces the INDIGO DataCloud architecture, describes the methodology followed to assure the integration of the requirements from different research communities, including examples like ENES, LifeWatch or EMSO, and how they will build their solutions using different INDIGO components.

  7. Grid Architecture 2

    Energy Technology Data Exchange (ETDEWEB)

    Taft, Jeffrey D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-01-01

    The report describes work done on Grid Architecture under the auspices of the Department of Energy's Office of Electricity Delivery and Energy Reliability in 2015. As described in the first Grid Architecture report, the primary purpose of this work is to provide stakeholder insight about grid issues so as to enable superior decision making on their part. Doing this requires the creation of various work products, including oft-times complex diagrams, analyses, and explanations. This report provides architectural insights into several important grid topics and also describes work done to advance the science of Grid Architecture.

  8. A Survey of Physical Sciences, Engineering and Mathematics Faculty Regarding Author Fees in Open Access Journals

    Science.gov (United States)

    Cusker, Jeremy; Rauh, Anne E.

    2014-01-01

    Discussions of the potential of open access publishing frequently must contend with the skepticism of research authors regarding the need to pay author fees (also known as publication fees). With that in mind, the authors undertook a survey of faculty, postdocs, and graduate students in physical science, mathematics, and engineering fields at two…

  9. Polymers – A New Open Access Scientific Journal on Polymer Science

    OpenAIRE

    Shu-Kun Lin

    2009-01-01

    Polymers is a new interdisciplinary, Open Access scientific journal on polymer science, published by Molecular Diversity Preservation International (MDPI). This journal welcomes manuscript submissions on polymer chemistry, macromolecular chemistry, polymer physics, polymer characterization and all related topics. Both synthetic polymers and natural polymers, including biopolymers, are considered. Manuscripts will be thoroughly peer-reviewed in a timely fashion, and papers will be published, i...

  10. A Survey of Physical Sciences, Engineering and Mathematics Faculty Regarding Author Fees in Open Access Journals

    Science.gov (United States)

    Cusker, Jeremy; Rauh, Anne E.

    2014-01-01

    Discussions of the potential of open access publishing frequently must contend with the skepticism of research authors regarding the need to pay author fees (also known as publication fees). With that in mind, the authors undertook a survey of faculty, postdocs, and graduate students in physical science, mathematics, and engineering fields at two…

  11. The integration of open access journals in the scholarly communication system: Three science fields

    DEFF Research Database (Denmark)

    Faber Frandsen, Tove

    2009-01-01

    The greatest number of open access journals (OAJs) is found in the sciences and their influence is growing. However, there are only a few studies on the acceptance and thereby integration of these OAJs in the scholarly communication system. Even fewer studies provide insight into the differences...

  12. A Bright Spark: Open Teaching of Science Using Faraday's Lectures on Candles

    Science.gov (United States)

    Walker, Mark; Groger, Martin; Schutler, Kirsten; Mosler, Bernd

    2008-01-01

    As well as being a founding father of modern chemistry and physics, Michael Faraday was also a skilled lecturer, able to explain scientific principles and ideas simply and concisely to nonscientific audiences. However, science didactics today emphasizes the use of open and student-centered methods of teaching in which students find and develop…

  13. Dancing on the Grid: using e-Science tools to extend choreographic research.

    Science.gov (United States)

    Bailey, Helen; Bachler, Michelle; Buckingham Shum, Simon; Le Blanc, Anja; Popat, Sita; Rowley, Andrew; Turner, Martin

    2009-07-13

    This paper considers the role and impact of new and emerging e-Science tools on practice-led research in dance. Specifically, it draws on findings from the e-Dance project. This 2-year project brings together an interdisciplinary team combining research aspects of choreography, next-generation videoconferencing, and human-computer interaction analysis incorporating hypermedia and nonlinear annotations for recording and documentation.

  14. Impact of problem finding on the quality of authentic open inquiry science research projects

    Science.gov (United States)

    Labanca, Frank

    2008-11-01

    Problem finding is a creative process whereby individuals develop original ideas for study. Secondary science students who successfully participate in authentic, novel, open inquiry studies must engage in problem finding to determine viable and suitable topics. This study examined problem finding strategies employed by students who successfully completed and presented the results of their open inquiry research at the 2007 Connecticut Science Fair and the 2007 International Science and Engineering Fair. A multicase qualitative study was framed through the lenses of creativity, inquiry strategies, and situated cognition learning theory. Data were triangulated by methods (interviews, document analysis, surveys) and sources (students, teachers, mentors, fair directors, documents). The data demonstrated that the quality of student projects was directly impacted by the quality of their problem finding. Effective problem finding was a result of students using resources from previous, specialized experiences. They had a positive self-concept and a temperament for both the creative and logical perspectives of science research. Successful problem finding was derived from an idiosyncratic, nonlinear, and flexible use and understanding of inquiry. Finally, problem finding was influenced and assisted by the community of practicing scientists, with whom the students had an exceptional ability to communicate effectively. As a result, there appears to be a juxtaposition of creative and logical/analytical thought for open inquiry that may not be present in other forms of inquiry. Instructional strategies are suggested for teachers of science research students to improve the quality of problem finding for their students and their subsequent research projects.

  15. Documentation in Otolaryngology. Sharing Otolaryngology research data in an open science ecosystem

    Directory of Open Access Journals (Sweden)

    Fernanda PESET

    2016-11-01

    Full Text Available Introduction and objective: The present text addresses the most significant aspects of sharing research data in otolaryngology in the context of open science as an ecosystem. Its aim is to offer a panoramic view that helps the researcher to manage their data as part of enriched science. Method: A bibliographical review was performed, drawing also on the authors' own experience in the field of research data. Results: The basic pillars for success are presented: the necessary political and technical capacities. Discussion: The task of making data available should be recognized as part of the researcher's curriculum, because documenting data so that they are reusable is a highly specialized and time-consuming task. Conclusions: We are at a crucial moment to begin to share data. This is being considered in all scientific policy scenarios, as in the EU through the European Open Science Cloud.

  16. SMART GRIDS LABORATORIES INVENTORY 2015

    OpenAIRE

    PONCELA BLANCO MARTA; PRETTICO GIUSEPPE; ANDREADOU NIKOLETA; OLARIAGA-GUARDIOLA Miguel; FULLI Gianluca; COVRIG CATALIN-FELIX

    2015-01-01

    A smart electricity grid opens the door to a myriad of new applications aimed at enhancing security of supply, sustainability and market competitiveness. Gathering detailed information about smart grid laboratories activities represents a primary need. In order to obtain a better picture of the ongoing Smart Grid developments, after the successful smart grid project survey initiated in 2011, we recently launched a focused on-line survey addressed to organisations owning or running Smart Grid ...

  17. Free and Open Source Software for Geospatial in the field of planetary science

    Science.gov (United States)

    Frigeri, A.

    2012-12-01

    Information technology applied to geospatial analyses has spread quickly in the last ten years. The availability of open data and data from collaborative mapping projects has increased the interest in tools, procedures and methods to handle spatially-related information. Free and Open Source Software projects devoted to geospatial data handling are enjoying considerable success, as the use of interoperable formats and protocols allows the user to choose the pipeline of tools and libraries needed to solve a particular task, adapting the software scene to the specific problem. In particular, the Free and Open Source model of development mimics the scientific method very well, and researchers should be naturally encouraged to take part in the development process of these software projects, as this represents a very agile way to interact among several institutions. When it comes to planetary sciences, geospatial Free and Open Source Software is gaining a key role in projects that commonly involve different subjects in an international scenario. Very popular software suites for processing scientific mission data (for example, ISIS) and for navigation/planning (SPICE) are distributed along with their source code, and the interaction between user and developer is often very close, creating a continuum between these two figures. A very widely used library for handling geospatial data (GDAL) has started to support planetary data from the Planetary Data System, and recent contributions have enabled support for other popular data formats used in planetary science, such as the VICAR one. The use of Geographic Information Systems in planetary science is now widespread, and Free and Open Source GIS, open GIS formats and network protocols allow existing tools and methods developed to solve Earth-based problems to be extended to the study of other solar system bodies. A day in the working life of a researcher using Free and Open Source Software for geospatial will be presented, as well as benefits and
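
    As a brief, hedged illustration of the GDAL planetary-data support mentioned above (the file name is hypothetical), the sketch below opens a raster product with GDAL's Python bindings and reads one band into a NumPy array.

```python
# Minimal GDAL sketch for reading a planetary raster; the file name is a placeholder.
from osgeo import gdal

gdal.UseExceptions()

# GDAL selects the appropriate driver (e.g. PDS, VICAR, GeoTIFF) from the file itself.
dataset = gdal.Open("mola_topography.img")

print("Driver:", dataset.GetDriver().ShortName)
print("Size:", dataset.RasterXSize, "x", dataset.RasterYSize)
print("Projection:", dataset.GetProjection())

# Read the first band as a NumPy array for further geospatial analysis.
elevation = dataset.GetRasterBand(1).ReadAsArray()
print("Elevation range:", elevation.min(), "to", elevation.max())
```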

  18. Physical Science Informatics: Providing Open Science Access to Microheater Array Boiling Experiment Data

    Science.gov (United States)

    McQuillen, John; Green, Robert D.; Henrie, Ben; Miller, Teresa; Chiaramonte, Fran

    2014-01-01

    The Physical Science Informatics (PSI) system is the next step in an effort to make NASA-sponsored flight data available to the scientific and engineering community, along with the general public. The experimental data, from six overall disciplines (Combustion Science, Fluid Physics, Complex Fluids, Fundamental Physics, and Materials Science), will present some unique challenges. Besides data in textual or numerical format, large portions of both the raw and analyzed data for many of these experiments are digital images and video, imposing large data storage requirements. In addition, the accessible data will include experiment design and engineering data (including applicable drawings), any analytical or numerical models, publications, reports, and patents, and any commercial products developed as a result of the research. The objectives of this paper include the following: present the preliminary layout (Figure 2) of MABE data within the PSI database; obtain feedback on the layout; and present the procedure for obtaining access to this database.

  19. Mobile Open-Source Solar-Powered 3-D Printers for Distributed Manufacturing in Off-Grid Communities

    National Research Council Canada - National Science Library

    Debbie L. King; Adegboyega Babasola; Joseph Rozario; Joshua M. Pearce

    2014-01-01

    .... This study designs and demonstrates the technical viability of two open-source mobile digital manufacturing facilities powered with solar photovoltaics, and capable of printing customizable OSAT in any...

  20. Harnessing the Use of Open Learning Exchange to Support Basic Education in Science and Mathematics in the Philippines

    Science.gov (United States)

    Feliciano, Josephine S.; Mandapat, Louie Carl R.; Khan, Concepcion L.

    2013-01-01

    This paper presents the open learning initiatives of the Science Education Institute of the Department of Science and Technology to overcome certain barriers, such as enabling access, cost of replication, timely feedback, monitoring and continuous improvement of learning modules. Using an open-education model, like MIT's (Massachusetts Institute…

  1. BOOK REVIEW: OPENING SCIENCE, THE EVOLVING GUIDE ON HOW THE INTERNET IS CHANGING RESEARCH, COLLABORATION, AND SCHOLARLY PUBLISHING

    Science.gov (United States)

    The way we get our funding, collaborate, do our research, and get the word out has evolved over hundreds of years but we can imagine a more open science world, largely facilitated by the internet. The movement towards this more open way of doing and presenting science is coming, ...

  2. The Grid : english version

    CERN Multimedia

    Rosy Mondardini Producer

    2003-01-01

    The Grid: sharing resources owned by many different organizations to access remote computers, software, and data efficiently and automatically; secure access to establish the identity of a user or resource, after defining conditions under which sharing occurs; bridging distance using high-speed connections between computers to create a global Grid; open standards to allow applications designed for one Grid to run on all others.

  3. Maintaining the momentum of Open Search in Earth Science Data discovery

    Science.gov (United States)

    Newman, D. J.; Lynnes, C.

    2013-12-01

    Federated search for Earth observation data has been a hallmark of EOSDIS (Earth Observing System Data and Information System) for two decades. Originally, the EOSDIS Version 0 system provided both data-collection-level and granule/file-level search in the mid 1990s with EOSDIS-specific socket protocols and message formats. Since that time, the advent of several standards has helped to simplify EOSDIS federated search, beginning with HTTP as the transfer protocol. Most recently, OpenSearch (www.opensearch.org) was employed for the EOS Clearinghouse (ECHO), based on a set of conventions that had been developed within the Earth Science Information Partners (ESIP) Federation. The ECHO OpenSearch API has evolved to encompass the ESIP RFC and the Open Geospatial Consortium (OGC) OpenSearch standard. Uptake of the ECHO OpenSearch API has been significant and has made ECHO accessible to client developers that found the previous ECHO SOAP API and current REST API too complex. Client adoption of the OpenSearch API appears to be largely driven by the simplicity of the OpenSearch convention. This simplicity is thus important to retain as the standard and convention evolve. For example, ECHO metrics indicate that the vast majority of ECHO users favor the following search criteria when using the REST API: spatial (bounding box, polygon, line and point), temporal (start and end time), and keywords (free text). Fewer than 10% of searches use additional constraints, particularly those requiring a controlled vocabulary, such as instrument, sensor, etc. This suggests that ongoing standardization efforts around OpenSearch usage for Earth observation data may be more productive if oriented toward improving support for the spatial, temporal and keyword search aspects. Areas still requiring improvement include support for concrete requirements for keyword constraints, phrasal search for keyword constraints, temporal constraint relations, and terminological symmetry between search URLs
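
    To make the favored search criteria concrete, the sketch below issues a spatial + temporal + keyword OpenSearch-style request with Python; the endpoint URL and parameter names follow common OpenSearch conventions for Earth observation data but are illustrative assumptions, not the documented ECHO API.

```python
# Hypothetical OpenSearch granule query combining spatial, temporal and keyword constraints.
import requests

ENDPOINT = "https://example.gov/opensearch/granules.atom"  # placeholder endpoint

params = {
    "boundingBox": "-71.5,43.0,-69.5,44.5",   # west,south,east,north (illustrative)
    "startTime": "2009-06-01T00:00:00Z",
    "endTime": "2009-06-30T23:59:59Z",
    "keyword": "precipitation",
    "numberOfResults": 20,
}

response = requests.get(ENDPOINT, params=params, timeout=30)
response.raise_for_status()

# The Atom/XML response would then be parsed for granule links and metadata.
print(response.url)
print(len(response.text), "bytes returned")
```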

  4. Open access behaviours and perceptions of health sciences faculty and roles of information professionals.

    Science.gov (United States)

    Lwoga, Edda T; Questier, Frederik

    2015-03-01

    This study sought to investigate the faculty's awareness, attitudes and use of open access, and the role of information professionals in supporting open access (OA) scholarly communication in Tanzanian health sciences universities. A cross-sectional survey was conducted. Semi-structured interviews were conducted with 16 librarians, while questionnaires were physically distributed to 415 faculty members in all eight Tanzanian health sciences universities, with a response rate of 71.1%. The study found that most faculty members were aware about OA issues. However, the high level of OA awareness among faculty members did not translate into actual dissemination of faculty's research outputs through OA web avenues. A small proportion of faculty's research materials was made available as OA. Faculty were more engaged with OA journal publishing than with self-archiving practices. Senior faculty with proficient technical skills were more likely to use open access than junior faculty. Major barriers to OA usage were related to ICT infrastructure, awareness, skills, author-pay model, and copyright and plagiarism concerns. Interviews with librarians revealed that there was a strong support for promoting OA issues on campus; however, this positive support with various open access-related tasks did not translate into actual action. It is thus important for librarians and OA administrators to consider all these factors for effective implementation of OA projects in research and academic institutions. This is the first comprehensive and detailed study focusing on the health sciences faculty's and librarians' behaviours and perceptions of open access initiatives in Tanzania and reveals findings that are useful for planning and implementing open access initiatives in other institutions with similar conditions. © 2015 Health Libraries Journal.

  5. Openness to Experience and Intellect Differentially Predict Creative Achievement in the Arts and Sciences.

    Science.gov (United States)

    Kaufman, Scott Barry; Quilty, Lena C; Grazioplene, Rachael G; Hirsh, Jacob B; Gray, Jeremy R; Peterson, Jordan B; DeYoung, Colin G

    2016-04-01

    The Big Five personality dimension Openness/Intellect is the trait most closely associated with creativity and creative achievement. Little is known, however, regarding the discriminant validity of its two aspects-Openness to Experience (reflecting cognitive engagement with perception, fantasy, aesthetics, and emotions) and Intellect (reflecting cognitive engagement with abstract and semantic information, primarily through reasoning)-in relation to creativity. In four demographically diverse samples totaling 1,035 participants, we investigated the independent predictive validity of Openness and Intellect by assessing the relations among cognitive ability, divergent thinking, personality, and creative achievement across the arts and sciences. We confirmed the hypothesis that whereas Openness predicts creative achievement in the arts, Intellect predicts creative achievement in the sciences. Inclusion of performance measures of general cognitive ability and divergent thinking indicated that the relation of Intellect to scientific creativity may be due at least in part to these abilities. Lastly, we found that Extraversion additionally predicted creative achievement in the arts, independently of Openness. Results are discussed in the context of dual-process theory.

  6. Open Access Research Via Collaborative Educational Blogging: A Case Study from Library & Information Science

    Directory of Open Access Journals (Sweden)

    Kristen Radsliff Rebmann

    2017-09-01

    Full Text Available This article charts the development of activities for online graduate students in library and information science. Project goals include helping students develop competencies in understanding open access publishing, synthesizing research in the field, and engaging in scholarly communication via collaborative educational blogging. Using a design experiment approach as a research strategy, focus is placed on the design of the collaborative blogging activity, open access research as a knowledge domain, and analyses of four iterations of the project. Findings from this iterative learning design suggest several benefits of implementing collaborative educational blogging activities in distance contexts.

  7. Wikipedia as Open Science: non-expert involvement in controversial scientific issues

    OpenAIRE

    Aibar Puentes, Eduard; Lerga Felip, Maura

    2016-01-01

    This study considers Wikipedia as a sui generis instance of Open Science and analyses how the non-expert or lay character of the average Wikipedia editor and the open and collaborative model of this free encyclopaedia are actually shaping the way controversial scientific issues are presented.

  8. Global forces and local currents in Argentina's science policy crossroads: restricted access or open knowledge

    Directory of Open Access Journals (Sweden)

    Horacio Javier Etchichury

    2014-11-01

    Full Text Available The article describes the tensions between two competing approaches to scientific policy in Argentina. The traditional vision favors autonomous research. The neoliberal conception fosters the link between science and markets. In the past few years, a neodevelopmentalist current has also tried to stress the relevance of scientific research. Finally, the article describes how the Open Access movement has entered the debate. The World Bank's intervention and the human rights dimension of the question are discussed in depth. The article introduces the notion of open knowledge as a guiding criterion for designing a human-rights-based scientific policy.

  9. Collaboration at International, National and Institutional Level – Vital in Fostering Open Science

    Directory of Open Access Journals (Sweden)

    Kristiina Hormia-Poutanen

    2016-05-01

    Full Text Available Open science and open research provide potential for new discoveries and solutions to global problems, and thus automatically extend beyond the boundaries of an individual research laboratory. By nature they imply and lead to collaboration among researchers. This collaboration should be established at all possible levels: institutional, national and international. The present paper looks at the situation in Finland and shows how these collaborations are organized at the various levels. The special role played by LIBER is described. The advantages of these collaborations are highlighted.

  10. Multi-Grid Boron-10 detector for large area applications in neutron scattering science

    CERN Document Server

    Andersen, Ken; Birch, Jens; Buffet, Jean-Claude; Correa, Jonathan; van Esch, Patrick; Guerard, Bruno; Hall-Wilton, Richard; Hultman, Lars; Höglund, Carina; Jensen, Jens; Khaplanov, Anton; Kirstein, Oliver; Piscitelli, Francesco; Vettier, Christian

    2012-01-01

    The present supply of 3He can no longer meet the detector demands of the upcoming ESS facility and continued detector upgrades at current neutron sources. Therefore, viable alternative technologies are required to support the development of cutting-edge instrumentation for neutron scattering science. In this context, 10B-based detectors are being developed by a collaboration between the ESS, ILL, and Linköping University. This paper reports on the progress of this technology and the prospects of applying it in modern neutron scattering experiments. The detector is made up of multiple rectangular gas counter tubes coated with B4C enriched in 10B. An anode wire reads out each tube, thereby giving the position of conversion in one of the lateral co-ordinates as well as in the depth of the detector. Position resolution in the remaining co-ordinate is obtained by segmenting the cathode tube itself. Boron carbide films have been produced at Linköping University and a detector built at ILL. The characterization study is pres...

  11. The ICTJA-CSIC Science Week 2016: an open door to Earth Sciences for secondary education students

    Science.gov (United States)

    Cortes-Picas, Jordi; Diaz, Jordi; Fernandez-Turiel, Jose-Luis; Garcia-Castellanos, Daniel; Geyer, Adelina; Jurado, Maria-Jose; Montoya, Encarni; Rejas Alejos, Marta; Sánchez-Pastor, Pilar; Valverde-Perez, Angel

    2017-04-01

    The Science Week is one of the main scientific outreach events held every year in Spain. The Institute of Earth Sciences Jaume Almera of CSIC (ICTJA-CSIC) has participated in it for many years, opening its doors and proposing several activities that show what kind of multidisciplinary research is being carried out at the Institute and in the Geosciences. The activities, developed as workshops, are designed and conducted by scientific and technical personnel of the centre, who participate in the Science Week voluntarily. The activities proposed by the ICTJA-CSIC staff are designed for a target audience composed of secondary school students (12-18 years). The ICTJA-CSIC joined Science Week 2016 in the framework of the activity entitled "What do we investigate in Earth Sciences?". The aim is to show society what is being investigated at the ICTJA-CSIC. In addition, through the contact and interaction between the public and the Institute's researchers, it is intended to increase interest in scientific activity and, if possible, to generate new vocations in the field of the Earth Sciences among secondary school pupils. We present in this communication the experience of the Science Week 2016 at the ICTJA-CSIC, carried out with the effort and commitment of the Institute's personnel to the outreach of Earth Sciences research. Between November 14th and 19th 2016, more than 100 students from four secondary schools in the Barcelona area visited the Institute and took part in the Science Week. A total of six interactive workshops were prepared, showing different features of seismology, geophysical borehole logging, analog and digital modelling, paleoecology, volcanology and geochemistry. As a novelty, this year a new workshop based on an augmented-reality sandbox was offered to show and simulate the processes of creation and evolution of topographic relief. In addition, within the workshop dedicated to geophysical borehole logging, six exact replicates of

  12. An Open, Large-Scale, Collaborative Effort to Estimate the Reproducibility of Psychological Science.

    Science.gov (United States)

    2012-11-01

    Reproducibility is a defining feature of science. However, because of strong incentives for innovation and weak incentives for confirmation, direct replication is rarely practiced or published. The Reproducibility Project is an open, large-scale, collaborative effort to systematically examine the rate and predictors of reproducibility in psychological science. So far, 72 volunteer researchers from 41 institutions have organized to openly and transparently replicate studies published in three prominent psychological journals in 2008. Multiple methods will be used to evaluate the findings, calculate an empirical rate of replication, and investigate factors that predict reproducibility. Whatever the result, a better understanding of reproducibility will ultimately improve confidence in scientific methodology and findings. © The Author(s) 2012.

  13. Challenges of Virtual and Open Distance Science Teacher Education in Zimbabwe

    Directory of Open Access Journals (Sweden)

    Vongai Mpofu

    2012-01-01

    Full Text Available This paper reports on a study of the implementation of science teacher education through virtual and open distance learning in the Mashonaland Central Province, Zimbabwe. The study provides insight into challenges faced by students and lecturers at the inception of the program at four centres. Data were collected from completed evaluation survey forms of the forty-two lecturers who were directly involved at the launch of the program, and from in-depth interviews. Qualitative data analysis revealed that the programme faces potential threats from centre-, institution-, lecturer-, and student-related factors. These include limited resources, large classes, inadequate expertise in open and distance education, inappropriate science teacher education qualifications, implementer conflict of interest in program participation, students’ low self-esteem, lack of awareness of quality parameters of delivery systems among staff, and lack of standard criteria to measure the quality of services. The paper recommends that the issues raised be addressed in order to produce quality teachers.

  14. Introduction; open access in the social and political sciences : threat or opportunity?

    OpenAIRE

    Bull, MJ

    2016-01-01

    This article introduces a Symposium which brings together the academic and publishing industry in two key countries (the UK and the US) to analyse and assess the implications of Open Access (OA) journal publishing in the social and political sciences, as well as its different formats and developments to date. With articles by three academics (all involved in academic associations) and three publishers, the Symposium represents an exchange of views which help each of the two sectors understand...

  15. Polymers – A New Open Access Scientific Journal on Polymer Science

    Directory of Open Access Journals (Sweden)

    Shu-Kun Lin

    2009-12-01

    Full Text Available Polymers is a new interdisciplinary, Open Access scientific journal on polymer science, published by Molecular Diversity Preservation International (MDPI. This journal welcomes manuscript submissions on polymer chemistry, macromolecular chemistry, polymer physics, polymer characterization and all related topics. Both synthetic polymers and natural polymers, including biopolymers, are considered. Manuscripts will be thoroughly peer-reviewed in a timely fashion, and papers will be published, if accepted, within 6 to 8 weeks after submission. [...

  16. Creating a clinical video-conferencing facility in a security-constrained environment using open-source AccessGrid software and consumer hardware.

    Science.gov (United States)

    Terrazas, Enrique; Hamill, Timothy R; Wang, Ye; Channing Rodgers, R P

    2007-10-11

    The Department of Laboratory Medicine at the University of California, San Francisco (UCSF) has been split into widely separated facilities, leading to much time being spent traveling between facilities for meetings. We installed an open-source AccessGrid multi-media-conferencing system using (largely) consumer-grade equipment, connecting 6 sites at 5 separate facilities. The system was accepted rapidly and enthusiastically, and was inexpensive compared to alternative approaches. Security was addressed by aspects of the AG software and by local network administrative practices. The chief obstacles to deployment arose from security restrictions imposed by multiple independent network administration regimes, requiring a drastically reduced list of network ports employed by AG components.
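
    A minimal sketch of the kind of pre-deployment check such port restrictions call for, in Python: it verifies that a small agreed-upon set of TCP ports is reachable from a client site. The host name and port numbers are placeholders, not the ports used by the UCSF AccessGrid deployment.

```python
# Minimal sketch of a pre-deployment connectivity check like the one implied
# above: verifying that the (drastically reduced) set of ports agreed with the
# network administrators is actually reachable between two sites. The host and
# port numbers are placeholders, not the ports used by the UCSF deployment.
import socket

VENUE_HOST = "conference.example.org"   # hypothetical venue server
REQUIRED_TCP_PORTS = [8000, 8002]       # hypothetical reduced port list

def tcp_port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port can be established."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for port in REQUIRED_TCP_PORTS:
        state = "open" if tcp_port_open(VENUE_HOST, port) else "blocked"
        print(f"{VENUE_HOST}:{port} -> {state}")
```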

  17. An open data repository and a data processing software toolset of an equivalent Nordic grid model matched to historical electricity market data.

    Science.gov (United States)

    Vanfretti, Luigi; Olsen, Svein H; Arava, V S Narasimham; Laera, Giuseppe; Bidadfar, Ali; Rabuzin, Tin; Jakobsen, Sigurd H; Lavenius, Jan; Baudette, Maxime; Gómez-López, Francisco J

    2017-04-01

    This article presents an open data repository, the methodology used to generate it, and the associated data processing software developed to consolidate an hourly snapshot historical data set for the year 2015 into an equivalent Nordic power grid model (aka Nordic 44). The consolidation was achieved by matching the model's physical response w.r.t. historical power flow records in the bidding regions of the Nordic grid that are available from the Nordic electricity market agent, Nord Pool. The model is made available in the form of CIM v14, Modelica and PSS/E (Siemens PTI) files. The Nordic 44 model in Modelica and PSS/E was first presented in the paper titled "iTesla Power Systems Library (iPSL): A Modelica library for phasor time-domain simulations" (Vanfretti et al., 2016) [1] for a single snapshot. In the digital repository being made available with the submission of this paper (SmarTSLab_Nordic44 Repository at Github, 2016) [2], a total of 8760 snapshots (for the year 2015) that can be used to initialize and execute dynamic simulations using tools compatible with CIM v14, the Modelica language and the proprietary PSS/E tool are provided. The Python scripts used to generate the snapshots (processed data) are also available, with all the data, in the GitHub repository (SmarTSLab_Nordic44 Repository at Github, 2016) [2]. This Nordic 44 equivalent model was also used in the iTesla project (iTesla) [3] to carry out simulations within a dynamic security assessment toolset (iTesla, 2016) [4], and has been further enhanced during the ITEA3 OpenCPS project (iTEA3) [5]. The raw data, processed data and output models utilized within the iTesla platform (iTesla, 2016) [4] are also available in the repository. The CIM and Modelica snapshots of the "Nordic 44" model for the year 2015 are available in a Zenodo repository.
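
    As a rough illustration of how the hourly snapshots could be consumed, the Python sketch below lists Modelica snapshot files from a local clone of the repository and selects one by hour of the year. The directory layout and file pattern are assumptions for illustration, not the repository's documented structure.

```python
# Minimal sketch (hypothetical file layout): selecting one hourly Nordic 44
# snapshot from a local clone of the repository. The directory name, the file
# pattern and the assumption that sorted order follows hour-of-year are
# illustrative only.
from pathlib import Path

def list_snapshots(repo_root: str, pattern: str = "*.mo"):
    """Return sorted Modelica snapshot files found under repo_root."""
    return sorted(Path(repo_root).rglob(pattern))

def pick_snapshot(snapshots, hour_of_year: int):
    """Pick the snapshot for a given hour of 2015 (0..8759), if present."""
    if not 0 <= hour_of_year < 8760:
        raise ValueError("2015 has 8760 hourly snapshots (0..8759)")
    if hour_of_year >= len(snapshots):
        raise IndexError("snapshot not found in local clone")
    return snapshots[hour_of_year]

if __name__ == "__main__":
    files = list_snapshots("SmarTSLab_Nordic44")  # hypothetical local path
    print(f"found {len(files)} Modelica snapshots")
    if files:
        print("hour 0 snapshot:", pick_snapshot(files, 0))
```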

  18. Spatial Data Infrastructures and Grid Computing: the GDI-Grid project

    Science.gov (United States)

    Padberg, A.; Kiehle, C.

    2009-04-01

    Distribution of spatial data through standards-compliant spatial data infrastructures (SDI) is a fairly common practice nowadays. The Open Geospatial Consortium (OGC) offers a broad range of implementation specifications for accessing and presenting spatial data. In December 2007 the OGC published the Web Processing Service (WPS) specification for extending the capabilities of an SDI to include the processing of distributed data. By utilizing a WPS it is possible to shift the workload from the client to the server. Furthermore, it is possible to create automated workflows that include data processing without the need for user interaction or manual computation of data via a desktop GIS. When complex processes are offered or large amounts of data are processed by a WPS, the computational power of the server might not suffice. Especially when such processes are invoked by a multitude of users, the server might not be able to provide the desired performance. In this case, Grid Computing is one way to provide the required computational power by accessing large numbers of worker nodes in an existing Grid infrastructure through a Grid middleware. Due to their respective origins, the paradigms of SDIs and Grid infrastructures differ significantly in several important matters. For instance, security is handled differently in the scope of OWS and Grid Computing. While the OGC does not yet specify a comprehensive security concept, strict security rules are a top priority in Grid Computing, where providers need a certain degree of control over their resources and users want to process sensitive data on external resources. To create an SDI that is able to benefit from the computational power and the vast storage capacities of a Grid infrastructure it is necessary to overcome all the conceptual differences between OWS and Grid Computing. GDI-Grid (English: SDI-Grid) is a project funded by the German Ministry for Science and Education. It aims at bridging the aforementioned gaps between
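
    To make the WPS interaction concrete, the sketch below issues the standard OGC WPS 1.0.0 GetCapabilities and DescribeProcess requests over the key-value-pair binding. The endpoint URL and the process identifier are hypothetical, not actual GDI-Grid services.

```python
# Minimal sketch of talking to an OGC Web Processing Service (WPS 1.0.0) over
# its standard key-value-pair interface. The endpoint URL and the process
# identifier are placeholders, not actual GDI-Grid services.
import requests

WPS_ENDPOINT = "https://example.org/wps"  # hypothetical endpoint

def get_capabilities(endpoint: str) -> str:
    """Ask the WPS which processes it offers (standard GetCapabilities)."""
    params = {"service": "WPS", "version": "1.0.0", "request": "GetCapabilities"}
    resp = requests.get(endpoint, params=params, timeout=30)
    resp.raise_for_status()
    return resp.text  # XML capabilities document

def describe_process(endpoint: str, identifier: str) -> str:
    """Fetch the input/output description of one process (DescribeProcess)."""
    params = {
        "service": "WPS",
        "version": "1.0.0",
        "request": "DescribeProcess",
        "identifier": identifier,
    }
    resp = requests.get(endpoint, params=params, timeout=30)
    resp.raise_for_status()
    return resp.text

if __name__ == "__main__":
    print(get_capabilities(WPS_ENDPOINT)[:200])
    print(describe_process(WPS_ENDPOINT, "terrain:viewshed")[:200])  # hypothetical id
```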

  19. Federated Space-Time Query for Earth Science Data Using OpenSearch Conventions

    Science.gov (United States)

    Lynnes, C.; Beaumont, B.; Duerr, R. E.; Hua, H.

    2009-12-01

    The past decade has seen a burgeoning of remote sensing and Earth science data providers, as evidenced by the growth of the Earth Science Information Partner (ESIP) federation. At the same time, the need to combine diverse data sets to enable understanding of the Earth as a system has also grown. While the expansion of data providers is in general a boon to such studies, the diversity presents a challenge to finding useful data for a given study. Locating all the data files with aerosol information for a particular volcanic eruption, for example, may involve learning and using several different search tools to execute the requisite space-time queries. To address this issue, the ESIP federation is developing a federated space-time query framework, based on the OpenSearch convention (www.opensearch.org) with its Geo and Time extensions. In this framework, data providers publish OpenSearch Description Documents that describe in a machine-readable form how to execute queries against the provider. The novelty of OpenSearch is that the space-time query interface becomes both machine callable and easy enough to integrate into the web browser's search box. This flexibility, together with a simple REST (HTTP GET) interface, should allow a variety of data providers to participate in the federated search framework, from large institutional data centers to individual scientists. The simple interface enables trivial querying of multiple data sources and participation in recursive-like federated searches--all using the same common OpenSearch interface. This simplicity also makes the construction of clients easy, as do existing OpenSearch client libraries in a variety of languages. Moreover, a number of clients and aggregation services already exist, and OpenSearch is already supported by a number of web browsers such as Firefox and Internet Explorer.
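
    A minimal sketch of the client side of such a federated query: fill an OpenSearch URL template that uses the Geo ({geo:box}) and Time ({time:start}, {time:end}) extension parameters and issue a plain HTTP GET. The template URL is a placeholder; a real client would read it from a provider's OpenSearch Description Document.

```python
# Minimal sketch of a client-side space-time query against an OpenSearch
# endpoint that supports the Geo and Time extensions. The template URL is a
# placeholder; a real client would read it from the provider's OpenSearch
# Description Document.
import requests

# Hypothetical template with Geo ({geo:box}) and Time extension parameters.
TEMPLATE = ("https://example.org/opensearch?q={searchTerms}"
            "&bbox={geo:box}&start={time:start}&end={time:end}&format=atom")

def space_time_query(terms, bbox, start, end):
    """Fill the OpenSearch template and run a simple HTTP GET query."""
    url = (TEMPLATE
           .replace("{searchTerms}", terms)
           .replace("{geo:box}", ",".join(str(v) for v in bbox))  # west,south,east,north
           .replace("{time:start}", start)
           .replace("{time:end}", end))
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    return resp.text  # Atom/RSS feed of matching granules

if __name__ == "__main__":
    # Aerosol data over an (illustrative) eruption region and time window.
    feed = space_time_query("aerosol", (-20.0, 63.0, -18.0, 65.0),
                            "2010-04-14", "2010-04-20")
    print(feed[:300])
```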

  20. Life science research and drug discovery at the turn of the 21st century: the experience of SwissBioGrid.

    Science.gov (United States)

    den Besten, Matthijs; Thomas, Arthur J; Schroeder, Ralph

    2009-04-22

    It is often said that the life sciences are transforming into an information science. As laboratory experiments are starting to yield ever increasing amounts of data and the capacity to deal with those data is catching up, an increasing share of scientific activity is seen to be taking place outside the laboratories, sifting through the data and modelling "in silico" the processes observed "in vitro." The transformation of the life sciences and similar developments in other disciplines have inspired a variety of initiatives around the world to create technical infrastructure to support the new scientific practices that are emerging. The e-Science programme in the United Kingdom and the NSF Office for Cyberinfrastructure are examples of these. In Switzerland there have been no such national initiatives. Yet, this has not prevented scientists from exploring the development of similar types of computing infrastructures. In 2004, a group of researchers in Switzerland established a project, SwissBioGrid, to explore whether Grid computing technologies could be successfully deployed within the life sciences. This paper presents their experiences as a case study of how the life sciences are currently operating as an information science and presents the lessons learned about how existing institutional and technical arrangements facilitate or impede this operation. SwissBioGrid gave rise to two pilot projects: one for proteomics data analysis and the other for high-throughput molecular docking ("virtual screening") to find new drugs for neglected diseases (specifically, for dengue fever). The proteomics project was an example of a data management problem, applying many different analysis algorithms to Terabyte-sized datasets from mass spectrometry, involving comparisons with many different reference databases; the virtual screening project was more a purely computational problem, modelling the interactions of millions of small molecules with a limited number of protein

  1. GridAPPS-D

    Energy Technology Data Exchange (ETDEWEB)

    2017-03-28

    GridAPPS-D is an open-source, open-architecture, standards-based platform for the development of advanced electric power system planning and operations applications. GridAPPS-D provides a documented data abstraction for the application developer, enabling the creation of applications that can be run on any compliant system or platform. This enables development of applications that are platform- and vendor-independent, and that take advantage of the possibility of data-rich, data-driven applications based on the deployment of smart grid devices and systems.

  2. An 11-year global gridded aerosol optical thickness reanalysis (v1.0) for atmospheric and climate sciences

    Science.gov (United States)

    Lynch, Peng; Reid, Jeffrey S.; Westphal, Douglas L.; Zhang, Jianglong; Hogan, Timothy F.; Hyer, Edward J.; Curtis, Cynthia A.; Hegg, Dean A.; Shi, Yingxi; Campbell, James R.; Rubin, Juli I.; Sessions, Walter R.; Turk, F. Joseph; Walker, Annette L.

    2016-04-01

    While stand-alone satellite and model aerosol products see wide utilization, there is a significant need in numerous atmospheric and climate applications for a fused product on a regular grid. Aerosol data assimilation is an operational reality at numerous centers, and like meteorological reanalyses, aerosol reanalyses will see significant use in the near future. Here we present a standardized 2003-2013 global 1 × 1° and 6-hourly modal aerosol optical thickness (AOT) reanalysis product. This data set can be applied to basic and applied Earth system science studies of significant aerosol events, aerosol impacts on numerical weather prediction, and electro-optical propagation and sensor performance, among other uses. This paper describes the science of how to develop and score an aerosol reanalysis product. This reanalysis utilizes a modified Navy Aerosol Analysis and Prediction System (NAAPS) at its core and assimilates quality-controlled retrievals of AOT from the Moderate Resolution Imaging Spectroradiometer (MODIS) on Terra and Aqua and the Multi-angle Imaging SpectroRadiometer (MISR) on Terra. The aerosol source functions, including dust and smoke, were regionally tuned to obtain the best match between the model fine- and coarse-mode AOTs and the Aerosol Robotic Network (AERONET) AOTs. Other model processes, including deposition, were tuned to minimize the AOT difference between the model and satellite AOT. Aerosol wet deposition in the tropics is driven with satellite-retrieved precipitation, rather than the model field. The final reanalyzed fine- and coarse-mode AOT at 550 nm is shown to have good agreement with AERONET observations, with a global mean root mean square error around 0.1 for both fine- and coarse-mode AOTs. This paper includes a discussion of issues particular to aerosol reanalyses that make them distinct from standard meteorological reanalyses, considerations for extending such a reanalysis outside of the NASA A-Train era, and examples of how
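
    The agreement figure quoted above is a root mean square error; the short sketch below shows how such a score is computed from collocated model and AERONET AOT pairs. The numbers are made up for illustration and are not data from the reanalysis.

```python
# Minimal sketch of the verification metric quoted in the abstract: the root
# mean square error between reanalyzed AOT and collocated AERONET AOT.
# The arrays below are made-up illustrative values, not data from the paper.
import numpy as np

def rmse(model_aot: np.ndarray, aeronet_aot: np.ndarray) -> float:
    """RMSE = sqrt(mean((model - observation)^2)) over collocated pairs."""
    diff = np.asarray(model_aot) - np.asarray(aeronet_aot)
    return float(np.sqrt(np.mean(diff ** 2)))

if __name__ == "__main__":
    model = np.array([0.12, 0.35, 0.08, 0.50])    # hypothetical 550 nm AOT
    aeronet = np.array([0.10, 0.28, 0.15, 0.42])  # hypothetical observations
    print(f"RMSE = {rmse(model, aeronet):.3f}")
```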

  3. Massive open online courses in health sciences from Latin American institutions: A need for improvement?

    Science.gov (United States)

    Culquichicón, Carlos; Helguero-Santin, Luis M; Labán-Seminario, L Max; Cardona-Ospina, Jaime A; Aboshady, Omar A; Correa, Ricardo

    2017-01-01

    Background: Massive open online courses (MOOCs) have undergone exponential growth over the past few years, offering free and worldwide access to high-quality education. We identified the characteristics of MOOCs in the health sciences offered by Latin American institutions (LAIs). Methods: We screened the eight leading MOOC platforms to gather their lists of offerings. The MOOCs were classified by region and subject. Then, we obtained the following information: the Scopus H-index of each institution and course instructor, the QS World University Ranking® 2015/16 of the LAI, and the official language of the course. Results: Our search identified 4170 MOOCs worldwide. Of these, 205 MOOCs were offered by LAIs, and six were related to the health sciences. Most of these courses (n = 115) were offered through Coursera. One health science MOOC was taught by three instructors, of whom only one was registered in Scopus (H-index = 0). The remaining five health science MOOCs each had a single instructor (H-index = 4 [0-17]). The Latin American country with the highest participation was Brazil (n = 11). Conclusion: The contribution of LAIs to MOOCs in the health sciences is low.

  4. When open data is a Trojan Horse: The weaponization of transparency in science and governance

    Directory of Open Access Journals (Sweden)

    Karen EC Levy

    2016-03-01

    Full Text Available Openness and transparency are becoming hallmarks of responsible data practice in science and governance. Concerns about data falsification, erroneous analysis, and misleading presentation of research results have recently strengthened the call for new procedures that ensure public accountability for data-driven decisions. Though we generally count ourselves in favor of increased transparency in data practice, this Commentary highlights a caveat. We suggest that legislative efforts that invoke the language of data transparency can sometimes function as “Trojan Horses” through which other political goals are pursued. Framing these maneuvers in the language of transparency can be strategic, because approaches that emphasize open access to data carry tremendous appeal, particularly in current political and technological contexts. We illustrate our argument through two examples of pro-transparency policy efforts, one historical and one current: industry-backed “sound science” initiatives in the 1990s, and contemporary legislative efforts to open environmental data to public inspection. Rules that exist mainly to impede science-based policy processes weaponize the concept of data transparency. The discussion illustrates that, much as Big Data itself requires critical assessment, the processes and principles that attend it—like transparency—also carry political valence, and, as such, warrant careful analysis.

  6. Open Source Science: The Gravitational Wave Processing-Enabled Archive for NANOGrav

    Science.gov (United States)

    Brazier, Adam; Cordes, James M.; Dieng, Awa; Ferdman, Robert; Garver-Daniels, Nathaniel; Hawkins, Steven; Hendrick, Justin; Huerta, Eliu; Lam, Michael T.; Lazio, T. Joseph W.; Lynch, Ryan S.; NANOGrav Consortium

    2016-01-01

    The North American Nanohertz Gravitational Wave Observatory (NANOGrav) dataset comprises pulsar timing data and data products from a continuing decades-long campaign of observations and high-precision analysis of over 40 millisecond pulsars conducted with the intent to detect nanohertz gravitational waves. Employing a team of developers, researchers and undergraduates, we have built an open source interface based on iPython/Jupyter notebooks allowing programmatic access and processing of the archived raw data and data products in order to greatly enhance science throughput. This is accomplished by: allowing instant access to the current dataset, subject to proprietary periods; providing an intuitive sandbox environment with a growing standard suite of analysis software to enhance learning opportunities for students in the NANOGrav science areas; and driving the development and sharing of new open source analysis tools. We also provide a separate web visualization interface, primarily developed by undergraduates, that allows the user to perform natural queries for data table construction and download, providing an environment for plotting both primary science and diagnostic data, with the next iteration allowing for real-time analysis tools such as model fitting and high-precision timing.

  7. Smart Grid Innovation Based on Service Science

    Institute of Scientific and Technical Information of China (English)

    张海燕; 张涛; 王荣

    2012-01-01

    At present, research on the "smart grid" focuses mainly on technological innovation and application; few researchers or institutions have placed the smart grid directly within the broader context of the developing service economy. This article attempts to approach smart grid innovation using "service science" as its theoretical basis. It aims to provide a reference for smart grid research and development and for the future evolution of the power grid, placing smart grid construction in the context of today's service economy so that it can fully adapt to a future development direction led by global markets and the service economy.

  8. Creation of the new Open Access journal: "Library and Information Science Critique: Journal of the Sciences of Information Recorded in Documents"

    OpenAIRE

    Muela-Meza, Zapopan Martín; Torres-Reyes, José Antonio

    2008-01-01

    This document explains the creation of the new Open Access journal "Library and Information Science Critique: Journal of the Sciences of Information Recorded in Documents," and it also includes guidelines for authors' contributions. The journal has been created at the Research Centre in Information Recorded in Documents at the School of Philosophy and Letters at the Nuevo Leon Autonomous University, Monterrey, Nuevo Leon, Mexico.

  9. CERN readies world's biggest science grid The computing network now encompasses more than 100 sites in 31 countries

    CERN Multimedia

    Niccolai, James

    2005-01-01

    If the Large Hadron Collider (LHC) at CERN is to yield miraculous discoveries in particle physics, it may also require a small miracle in grid computing. Faced with a lack of suitable tools from commercial vendors, engineers at the famed Geneva laboratory are hard at work building a giant grid to store and process the vast amount of data the collider is expected to produce when it begins operations in mid-2007 (2 pages)

  10. Open Science CBS Neuroimaging Repository: Sharing ultra-high-field MR images of the brain.

    Science.gov (United States)

    Tardif, Christine Lucas; Schäfer, Andreas; Trampel, Robert; Villringer, Arno; Turner, Robert; Bazin, Pierre-Louis

    2016-01-01

    Magnetic resonance imaging at ultra high field opens the door to quantitative brain imaging at sub-millimeter isotropic resolutions. However, novel image processing tools to analyze these new rich datasets are lacking. In this article, we introduce the Open Science CBS Neuroimaging Repository: a unique repository of high-resolution and quantitative images acquired at 7 T. The motivation for this project is to increase interest for high-resolution and quantitative imaging and stimulate the development of image processing tools developed specifically for high-field data. Our growing repository currently includes datasets from MP2RAGE and multi-echo FLASH sequences from 28 and 20 healthy subjects respectively. These datasets represent the current state-of-the-art in in-vivo relaxometry at 7 T, and are now fully available to the entire neuroimaging community. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. Open data used in water sciences - Review of access, licenses and understandability

    Science.gov (United States)

    Falkenroth, Esa; Lagerbäck Adolphi, Emma; Arheimer, Berit

    2016-04-01

    The amount of open data available for hydrology research is continually growing. In the EU-funded project SWITCH-ON (Sharing Water-related Information to Tackle Changes in the Hydrosphere - for Operational Needs: www.water-switch-on.eu), we are addressing water concerns by exploring and exploiting the untapped potential of these new open data. This work is enabled by many ongoing efforts to facilitate the use of open data. For instance, a number of portals provide the means to search for open data sets and open spatial data services (such as the GEOSS Portal, the INSPIRE community geoportal or various Climate Services and public portals). However, in general, many research groups in the water sciences still hesitate to use these open data. We therefore examined some limiting factors. Factors that limit the usability of a dataset include: (1) accessibility, (2) understandability and (3) licences. In the SWITCH-ON project we have developed a search tool for finding and accessing data relevant to water science in Europe, as existing tools do not specifically address the data needs of the water sciences. The tool is filled with some 9000 sets of metadata, each linked to water-related keywords. The keywords are based on those developed within the CUAHSI community in the USA, but are extended with non-hydrosphere topics and additional subclasses, and only keywords that actually have data are shown. Access to data sets: 78% of the data is directly accessible, while the rest is either available after registration and request, or through a web client for visualisation but without direct download. However, several data sets were found to be inaccessible due to server downtime, incorrect links or problems with the host database management system. One possible explanation for this could be that many datasets have been assembled by research projects that are no longer funded. Hence, their server infrastructure would be less maintained compared to large-scale operational services

  12. Geosciences: An Open Access Journal on Earth and Planetary Sciences and Their Interdisciplinary Approaches

    Directory of Open Access Journals (Sweden)

    Jesus Martinez-Frias

    2011-05-01

    Full Text Available On behalf of the Editorial Board and the editorial management staff of MDPI, it is my great pleasure to introduce this new journal Geosciences. Geosciences is an international, peer-reviewed open access journal, which publishes original papers, rapid communications, technical notes and review articles, and discussions about all interdisciplinary aspects of the earth and planetary sciences. Geosciences may also include papers presented at scientific conferences (proceedings) or articles on a well-defined topic assembled by individual editors or organizations/institutions (special publications).

  13. Toward Transparent and Reproducible Science: Using Open Source "Big Data" Tools for Water Resources Assessment

    Science.gov (United States)

    Buytaert, W.; Zulkafli, Z. D.; Vitolo, C.

    2014-12-01

    Transparency and reproducibility are fundamental properties of good science. In the current era of large and diverse datasets and long and complex workflows for data analysis and inference, ensuring such transparency and reproducibility is challenging. Hydrological science is a good case in point, because the discipline typically uses a large variety of datasets ranging from local observations to large-scale remotely sensed products. These data are often obtained from various different sources, and integrated using complex yet uncertain modelling tools. In this paper, we present and discuss methods of ensuring transparency and reproducibility in scientific workflows for hydrological data analysis for the purpose of water resources assessment, using relevant examples of emerging open source "big data" tools. First, we discuss standards for data storage, access, and processing that allow improving the modularity of a hydrological analysis workflow. In particular, standards emerging from the Open Geospatial Consortium, such as the Sensor Observation Service and the Web Coverage Service, hold promise. However, some bottlenecks, such as the availability of data models and the ability to work with spatio-temporal subsets of large datasets, need further development. Next, we focus on available methods to build transparent data processing workflows. Again, standards such as OGC's Web Processing Service are being developed to facilitate web-based analytics. Yet, in practice, the experimental nature of these standards and web services in general often requires a more pragmatic approach. The availability of web technologies in popular open source data analysis environments such as R and Python often makes them an attractive solution for workflow creation and sharing. Lastly, we elaborate on the potential that open source solutions hold in the context of participatory approaches to data collection and knowledge generation. Using examples from the tropical Andes and the Himalayas, we
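
    As one concrete, modular step of such a workflow, the sketch below queries a Sensor Observation Service for its capabilities document before any observations are pulled into Python or R for analysis; the endpoint URL is a placeholder rather than a service used in the paper.

```python
# Minimal sketch of one modular step in such a workflow: discovering what an
# OGC Sensor Observation Service offers before pulling observations into
# Python or R for analysis. The endpoint URL is a placeholder, not a service
# from the paper.
import requests

SOS_ENDPOINT = "https://example.org/sos"  # hypothetical SOS endpoint

def sos_capabilities(endpoint: str) -> str:
    """Standard SOS GetCapabilities request (key-value-pair encoding)."""
    params = {"service": "SOS", "request": "GetCapabilities"}
    resp = requests.get(endpoint, params=params, timeout=30)
    resp.raise_for_status()
    return resp.text  # XML listing offerings, observed properties, time ranges

if __name__ == "__main__":
    # Keeping data retrieval separate from analysis keeps the workflow modular
    # and easier to reproduce.
    xml = sos_capabilities(SOS_ENDPOINT)
    print(xml[:300])
```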

  14. Enabling a new Paradigm to Address Big Data and Open Science Challenges

    Science.gov (United States)

    Ramamurthy, Mohan; Fisher, Ward

    2017-04-01

    Data are not only the lifeblood of the geosciences but have also become the currency of the modern world in science and society. Rapid advances in computing, communications, and observational technologies — along with concomitant advances in high-resolution modeling, ensemble and coupled-systems predictions of the Earth system — are revolutionizing nearly every aspect of our field. Modern data volumes from high-resolution ensemble prediction/projection/simulation systems and next-generation remote-sensing systems like hyper-spectral satellite sensors and phased-array radars are staggering. For example, CMIP efforts alone will generate many petabytes of climate projection data for use in assessments of climate change. And NOAA's National Climatic Data Center projects that it will archive over 350 petabytes by 2030. For researchers and educators, this deluge and the increasing complexity of data bring challenges along with opportunities for discovery and scientific breakthroughs. The potential for big data to transform the geosciences is enormous, but realizing the next frontier depends on effectively managing, analyzing, and exploiting these heterogeneous data sources, extracting knowledge and useful information in ways that were previously impossible, to enable discoveries and gain new insights. At the same time, there is a growing focus on "Reproducibility or Replicability in Science" that has implications for Open Science. The advent of cloud computing has opened new avenues for addressing both big data and Open Science challenges and for accelerating scientific discoveries. However, to successfully leverage the enormous potential of cloud technologies, the data providers and the scientific communities will need to develop new paradigms to enable next-generation workflows and transform the conduct of science. Making data readily available is a necessary but not a sufficient condition. Data providers

  15. GSNL 2.0: leveraging on Open Science to promote science-based decision making in Disaster Risk Reduction

    Science.gov (United States)

    Salvi, Stefano; Rubbia, Giuliana; Abruzzese, Luigi

    2017-04-01

    In 2010 the GEO Geohazard Supersites and Natural Laboratories initiative (GSNL) launched the concept of a global partnership among the geophysical scientific community and the satellite and in situ data providers, aiming to promote scientific advancements in the knowledge of seismic and volcanic phenomena. The initial goal was successfully achieved, and many more new scientific results were obtained than would have been possible if the Supersites had not existed (http://www.earthobservations.org/gsnl.php). At the same time, the Supersites have demonstrated that they can effectively support the rapid transfer of useful scientific information to risk managers, exploiting the existing institutional relationships between the Supersite coordinators and the local decision makers. However, a more demanding call for action is given by the Sendai Framework 2015-2030 (the outcome of the 2015 UN World Conference on Disaster Risk Reduction), where for the first time knowledge of the risk components and science-based decision making are defined as top priorities for effective DRR. There are evident possible synergies between the Sendai Framework, GEO, CEOS (the Committee on Earth Observation Satellites), and GSNL, but for maximum benefit and effectiveness the latter needs to progress at a faster pace towards a full implementation of the Open Science approach to geohazard science. In the above global framework the Supersites can represent local test beds in which to experiment with coordination, collaboration and communication approaches and technological solutions tailored to the local situation, to ensure that the scientific community can contribute the information needed for the best possible decision making. This vision and the new developments of GSNL 2.0 have been approved by the GEO Program Board, and a clear roadmap has been set for the period 2017-2019. We will present the approach and the implementation plan at the conference.

  16. Data Science: History repeated? - The heritage of the Free and Open Source GIS community

    Science.gov (United States)

    Löwe, Peter; Neteler, Markus

    2014-05-01

    Data Science is described as the process of knowledge extraction from large data sets by means of scientific methods. The discipline draws heavily on techniques and theories from many fields, which are jointly used to further develop information retrieval on structured or unstructured very large datasets. While the term Data Science was already coined in 1960, the current perception of the field still places it in the first section of the hype cycle according to Gartner, well en route from the technology-trigger stage to the peak of inflated expectations. In our view the future development of Data Science could benefit from the analysis of experiences from related evolutionary processes. One predecessor is the area of Geographic Information Systems (GIS). The intrinsic scope of GIS is the integration and storage of spatial information from often heterogeneous sources, data analysis, and the sharing of reconstructed or aggregated results in visual form or via data transfer. GIS is successfully applied to process and analyse spatially referenced content in a wide and still expanding range of science areas, spanning from human and social sciences like archeology, politics and architecture to environmental and geoscientific applications, even including planetology. This paper presents proven patterns for innovation and organisation derived from the evolution of GIS, which can be ported to Data Science. Within the GIS landscape, three strategic, interacting tiers can be denoted: i) standardisation, ii) applications based on closed-source software, without the option of access to and analysis of the implemented algorithms, and iii) Free and Open Source Software (FOSS) based on freely accessible program code, enabling analysis, education and improvement by everyone. This paper focuses on patterns gained from the synthesis of three decades of FOSS development. We identified best practices which evolved from long-term FOSS projects, describe the role of community

  17. Afraid of Scooping – Case Study on Researcher Strategies against Fear of Scooping in the Context of Open Science

    Directory of Open Access Journals (Sweden)

    Heidi Laine

    2017-06-01

    Full Text Available The risk of scooping is often used as a counter-argument to open science, especially open data. In this case study I have examined the openness strategies, practices and attitudes in two open collaboration research projects created by Finnish researchers, in order to understand what made them resistant to the fear of scooping. The radically open approach of the projects includes open-by-default funding proposals, co-authorship and community membership. The primary sources used are interviews with the projects’ founding members. The analysis indicates that openness requires trust in close peers, but not necessarily in the research community or society at large. Based on the case study evidence, focusing on intrinsic goals, like new knowledge and bringing about ethical reform, instead of external goals such as publications, supports openness. Understanding the fundamentals of science, the philosophy of science and research ethics can also have a beneficial effect on willingness to share. Whether there are aspects of open sharing that make it seem riskier from the point of view of certain demographic groups within the research community, such as women, could be worth closer inspection.

  18. Citizen Science and Open Data: a model for Invasive Alien Species in Europe

    Directory of Open Access Journals (Sweden)

    Ana Cristina Cardoso

    2017-07-01

    Full Text Available Invasive Alien Species (IAS) are a growing threat to Europe's biodiversity. The implementation of the European Union Regulation on IAS can benefit from the involvement of the public in IAS recording and management through Citizen Science (CS) initiatives. Aiming to tackle issues related to the use of CS projects on IAS topics, a dedicated workshop titled “Citizen Science and Open Data: a model for Invasive Alien Species in Europe” was organized by the Joint Research Centre (JRC) and the European Cooperation in Science and Technology (COST) Association. Fifty key stakeholders from all over Europe, including two Members of the European Parliament, attended the workshop. With a clear focus on IAS, the workshop aimed at addressing the following issues: (a) CS and policy, (b) citizen engagement, and (c) CS data management. Nine short presentations provided input on CS and IAS issues. Participants discussed specific topics in several round tables (“world café” style) and reported their conclusions back to the audience in moderated discussions with the full assembly. Overall, the workshop enabled the sharing of ideas, approaches and best practices regarding CS and IAS. Specific opportunities and pitfalls of using CS data across the whole policy cycle for IAS were recognized. Concerning the implementation of the IAS Regulation, CS data could complement official surveillance systems and contribute to early warning for the IAS of Union concern, after appropriate validation by the Member States’ competent authorities. CS projects can additionally increase awareness and empower citizens. Attendees pointed out the importance of further public engagement in CS projects on IAS that demonstrate specific initiatives and approaches and analyze lessons learned from past experiences. In addition, the workshop noted that the data gathered from different CS projects on IAS are fragmented. It highlighted the need for an open and accessible platform to upload data originating

  19. Implementation of Computational Grid Services in Enterprise Grid Environments

    Directory of Open Access Journals (Sweden)

    R. J.A. Richard

    2008-01-01

    Full Text Available Grid Computing refers to the development of a high-performance computing environment, or virtual supercomputing environment, by utilizing available computing resources in a LAN, WAN and the Internet. This new and emerging research field offers enormous opportunities for e-Science applications such as astrophysics, bioinformatics, aerospace modeling, cancer research, etc. Grids involve the coordinated sharing of computing power, applications, data storage and network resources across dynamically and geographically dispersed organizations. Most Grid environments are developed using the Globus Toolkit, a UNIX/Linux-based middleware for integrating computational resources over the network. The emergence of the Global Grid concept provides an excellent opportunity for Grid-based e-Science applications to use high-performance supercomputing environments. Thus, Windows-based enterprise grid environments cannot be neglected in the development of Global Grids. This study discusses the basics of enterprise grids and the implementation of enterprise computational grids using the Alchemi toolkit. The review is organized into three parts: (i) an introduction to Grid technologies, (ii) design concepts of enterprise Grids and (iii) implementation of computational Grid services.

  20. Contributions to Desktop Grid Computing : From High Throughput Computing to Data-Intensive Sciences on Hybrid Distributed Computing Infrastructures

    OpenAIRE

    Fedak, Gilles

    2015-01-01

    Since the mid-90s, Desktop Grid Computing - i.e. the idea of using a large number of remote PCs distributed on the Internet to execute large parallel applications - has proved to be an efficient paradigm for providing large computational power at a fraction of the cost of a dedicated computing infrastructure. This document presents my contributions over the last decade to broadening the scope of Desktop Grid Computing. My research has followed three different directions. The first direction has ...

  1. Running an open experiment: transparency and reproducibility in soil and ecosystem science

    Science.gov (United States)

    Bond-Lamberty, Ben; Peyton Smith, A.; Bailey, Vanessa

    2016-08-01

    Researchers in soil and ecosystem science, and almost every other field, are being pushed—by funders, journals, governments, and their peers—to increase transparency and reproducibility of their work. A key part of this effort is a move towards open data as a way to fight post-publication data loss, improve data and code quality, enable powerful meta- and cross-disciplinary analyses, and increase trust in, and the efficiency of, publicly-funded research. Many scientists, however, lack experience in, and may be unsure of the benefits of, making their data and fully-reproducible analyses publicly available. Here we describe a recent ‘open experiment’, in which we documented every aspect of a soil incubation online, making all raw data, scripts, diagnostics, final analyses, and manuscripts available in real time. We found that using tools such as version control, issue tracking, and open-source statistical software improved data integrity, accelerated our team’s communication and productivity, and ensured transparency. There are many avenues to improve scientific reproducibility and data availability, of which this is only one example, and it is not an approach suited to every experiment or situation. Nonetheless, we encourage the communities in our respective fields to consider its advantages, and to lead rather than follow with respect to scientific reproducibility, transparency, and data availability.

  2. Running an open experiment: transparency and reproducibility in soil and ecosystem science

    Energy Technology Data Exchange (ETDEWEB)

    Bond-Lamberty, Benjamin; Smith, Ashly P.; Bailey, Vanessa L.

    2016-07-29

    Researchers in soil and ecosystem science, and almost every other field, are being pushed--by funders, journals, governments, and their peers--to increase transparency and reproducibility of their work. A key part of this effort is a move towards open data as a way to fight post-publication data loss, improve data and code quality, enable powerful meta- and cross-disciplinary analyses, and increase trust in, and the efficiency of, publicly-funded research. Many scientists, however, lack experience in, and may be unsure of the benefits of, making their data and fully-reproducible analyses publicly available. Here we describe a recent "open experiment", in which we documented every aspect of a soil incubation online, making all raw data, scripts, diagnostics, final analyses, and manuscripts available in real time. We found that using tools such as version control, issue tracking, and open-source statistical software improved data integrity, accelerated our team's communication and productivity, and ensured transparency. There are many avenues to improve scientific reproducibility and data availability, of which this is only one example, and it is not an approach suited to every experiment or situation. Nonetheless, we encourage the communities in our respective fields to consider its advantages, and to lead rather than follow with respect to scientific reproducibility, transparency, and data availability.

  3. Progress in Open-World, Integrative, Collaborative Science Data Platforms (Invited)

    Science.gov (United States)

    Fox, P. A.

    2013-12-01

    As collaborative, or network, science spreads into more Earth and space science fields, both the participants and their funders have expressed a very strong desire for highly functional data and information capabilities that are a) easy to use, b) integrated in a variety of ways, c) able to leverage prior investments and keep pace with rapid technical change, and d) not expensive or time-consuming to build or maintain. In response, and based on our accumulated experience over the last decade and a maturing of several key technical approaches, we have adapted, extended, and integrated several open source applications and frameworks that handle major portions of the functionality for these platforms. At a minimum, these functions include: an object-type repository, collaboration tools, the ability to identify and manage all key entities in the platform, and an integrated portal to manage diverse content and applications, with varied access levels and privacy options. At a conceptual level, science networks (even small ones) deal with people and the many intellectual artifacts produced or consumed in research, organizational and/or outreach activities, as well as the relations among them. Increasingly these networks are modeled as knowledge networks, i.e. graphs with named and typed relations among the 'nodes'. Nodes can be people, organizations, datasets, events, presentations, publications, videos, meetings, reports, groups, and more. In this heterogeneous ecosystem, it is also important to use a set of common informatics approaches to co-design and co-evolve the needed science data platforms based on what real people want to use them for. In this contribution, we present our methods and results for information modeling, and for adapting, integrating and evolving a networked data science and information architecture based on several open source technologies (Drupal, VIVO, the Comprehensive Knowledge Archive Network (CKAN), and the Global Handle System (GHS)). In particular we present both
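
    A minimal sketch of such a typed knowledge network, built with rdflib in Python: people, datasets and presentations become nodes, and named relations link them. The namespace and property names are illustrative, not the schema used by Drupal, VIVO, CKAN or the Handle system.

```python
# Minimal sketch of a typed knowledge network like the one described above:
# people, datasets and presentations as nodes, with named relations between
# them. Built with rdflib; the namespace and property names are illustrative,
# not the schema used by the platforms mentioned in the abstract.
from rdflib import Graph, Namespace, Literal
from rdflib.namespace import RDF, RDFS

EX = Namespace("http://example.org/net#")  # hypothetical vocabulary

g = Graph()
g.bind("ex", EX)

# Nodes of different types
g.add((EX.alice, RDF.type, EX.Person))
g.add((EX.soil_flux_2015, RDF.type, EX.Dataset))
g.add((EX.agu_talk_2013, RDF.type, EX.Presentation))

# Named, typed relations among the nodes
g.add((EX.alice, EX.authorOf, EX.agu_talk_2013))
g.add((EX.agu_talk_2013, EX.uses, EX.soil_flux_2015))
g.add((EX.soil_flux_2015, RDFS.label, Literal("Soil CO2 flux incubation, 2015")))

print(g.serialize(format="turtle"))
```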

  4. The pilot way to Grid resources using glideinWMS

    CERN Document Server

    Sfiligoi, Igor; Holzman, Burt; Mhashilkar, Parag; Padhi, Sanjay; Wurthwein, Frank

    Grid computing has become very popular in big and widespread scientific communities with high computing demands, like high energy physics. Computing resources are being distributed over many independent sites with only a thin layer of grid middleware shared between them. This deployment model has proven to be very convenient for computing resource providers, but has introduced several problems for the users of the system, the three major being the complexity of job scheduling, the non-uniformity of compute resources, and the lack of good job monitoring. Pilot jobs address all the above problems by creating a virtual private computing pool on top of grid resources. This paper presents both the general pilot concept, as well as a concrete implementation, called glideinWMS, deployed in the Open Science Grid.
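
    From the user's point of view, a glidein-provisioned pool behaves like an ordinary HTCondor pool; the sketch below writes a vanilla-universe submit description and hands it to condor_submit. The executable, file names and resource requests are illustrative placeholders, not part of glideinWMS itself.

```python
# Minimal sketch of the user's view of a glidein-based pool: once pilot jobs
# have built the virtual private pool, a payload job is submitted as an
# ordinary HTCondor vanilla-universe job. The executable, file names and
# resource requests below are illustrative placeholders.
from pathlib import Path
import subprocess

SUBMIT_TEXT = """\
universe   = vanilla
executable = analyze.sh
arguments  = run_001.dat
output     = run_001.out
error      = run_001.err
log        = run_001.log
request_cpus   = 1
request_memory = 2 GB
queue
"""

def submit_payload(workdir: str = "payload_job") -> None:
    """Write a submit description and hand it to condor_submit."""
    path = Path(workdir)
    path.mkdir(exist_ok=True)
    submit_file = path / "job.sub"
    submit_file.write_text(SUBMIT_TEXT)
    # The glidein layer is invisible here: condor_submit talks to the same
    # schedd whether the execute nodes are dedicated or pilot-provisioned.
    subprocess.run(["condor_submit", submit_file.name], cwd=path, check=True)

if __name__ == "__main__":
    submit_payload()
```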

  5. Integrating Grid Services into the Cray XT4 Environment

    Energy Technology Data Exchange (ETDEWEB)

    NERSC; Cholia, Shreyas; Lin, Hwa-Chun Wendy

    2009-05-01

    The 38,640-core Cray XT4 "Franklin" system at the National Energy Research Scientific Computing Center (NERSC) is a massively parallel resource available to Department of Energy researchers that also provides on-demand grid computing to the Open Science Grid. The integration of grid services on Franklin presented various challenges, including fundamental differences between the interactive and compute nodes, a stripped-down compute-node operating system without dynamic library support, a shared-root environment and idiosyncratic application launching. In our work, we describe how we resolved these challenges on a running, general-purpose production system to provide on-demand compute, storage, accounting and monitoring services through generic grid interfaces that mask the underlying system-specific details for the end user.

  6. Grid computing enhances standards-compatible geospatial catalogue service

    Science.gov (United States)

    Chen, Aijun; Di, Liping; Bai, Yuqi; Wei, Yaxing; Liu, Yang

    2010-04-01

    A catalogue service facilitates sharing, discovery, retrieval, management of, and access to large volumes of distributed geospatial resources, for example data, services, applications, and their replicas on the Internet. Grid computing provides an infrastructure for effective use of computing, storage, and other resources available online. The Open Geospatial Consortium has proposed a catalogue service specification and a series of profiles for promoting the interoperability of geospatial resources. By referring to the profile of the catalogue service for the Web, an innovative information model of a catalogue service is proposed to offer Grid-enabled registry, management, retrieval of and access to geospatial resources and their replicas. This information model extends the e-business registry information model by adopting several geospatial data and service metadata standards—the International Organization for Standardization (ISO)'s 19115/19119 standards and the US Federal Geographic Data Committee (FGDC) and US National Aeronautics and Space Administration (NASA) metadata standards for describing and indexing geospatial resources. In order to select the optimal geospatial resources and their replicas managed by the Grid, the Grid data management service and information service from the Globus Toolkit are closely integrated with the extended catalogue information model. Based on this new model, a catalogue service is implemented first as a Web service. Then, the catalogue service is further developed as a Grid service conforming to Grid service specifications. The catalogue service can be deployed in both the Web and Grid environments and accessed by standard Web services or authorized Grid services, respectively. The catalogue service has been implemented at the George Mason University/Center for Spatial Information Science and Systems (GMU/CSISS), managing more than 17 TB of geospatial data and geospatial Grid services. This service makes it easy to share and
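
    A minimal client-side sketch of querying such a catalogue through the standard CSW 2.0.2 key-value-pair interface (GetCapabilities and GetRecordById). The endpoint URL and record identifier are placeholders, not the GMU/CSISS deployment.

```python
# Minimal sketch of querying an OGC Catalogue Service for the Web (CSW 2.0.2)
# over its standard key-value-pair interface, as a client of the kind of
# catalogue described above. The endpoint URL and record identifier are
# placeholders, not the GMU/CSISS deployment.
import requests

CSW_ENDPOINT = "https://example.org/csw"  # hypothetical endpoint

def csw_capabilities(endpoint: str) -> str:
    """Discover the catalogue's supported operations (GetCapabilities)."""
    params = {"service": "CSW", "version": "2.0.2", "request": "GetCapabilities"}
    resp = requests.get(endpoint, params=params, timeout=30)
    resp.raise_for_status()
    return resp.text

def csw_record(endpoint: str, record_id: str) -> str:
    """Fetch one metadata record by identifier (GetRecordById)."""
    params = {
        "service": "CSW",
        "version": "2.0.2",
        "request": "GetRecordById",
        "id": record_id,
        "elementsetname": "full",
    }
    resp = requests.get(endpoint, params=params, timeout=30)
    resp.raise_for_status()
    return resp.text

if __name__ == "__main__":
    print(csw_capabilities(CSW_ENDPOINT)[:200])
    print(csw_record(CSW_ENDPOINT, "urn:example:dataset:1234")[:200])  # hypothetical id
```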

  7. Toward Open Science at the European Scale: Geospatial Semantic Array Programming for Integrated Environmental Modelling

    Science.gov (United States)

    de Rigo, Daniele; Corti, Paolo; Caudullo, Giovanni; McInerney, Daniel; Di Leo, Margherita; San-Miguel-Ayanz, Jesús

    2013-04-01

    Interfacing science and policy raises challenging issues when large spatial-scale (regional, continental, global) environmental problems need transdisciplinary integration within a context of modelling complexity and multiple sources of uncertainty [1]. This is characteristic of science-based support for environmental policy at the European scale [1], and key aspects have also long been investigated by European Commission transnational research [2-5]. Parameters of the needed data-transformations: θ = {θ1, …, θm} (a.5). Wide-scale transdisciplinary modelling for environment. Approaches (either of computational science or of policy-making) suitable at a given domain-specific scale may not be appropriate for wide-scale transdisciplinary modelling for environment (WSTMe) and corresponding policy-making [6-10]. In WSTMe, the characteristic heterogeneity of available spatial information (a) and the complexity of the required data-transformation modelling (D-TM) call for a paradigm shift in how computational science supports such peculiarly extensive integration processes. In particular, emerging wide-scale integration requirements of typical currently available domain-specific modelling strategies may include increased robustness and scalability along with enhanced transparency and reproducibility [11-15]. This challenging shift toward open data [16] and reproducible research [11] (open science) is also strongly suggested by the potential - sometimes neglected - huge impact of cascading effects of errors [1,14,17-19] within the impressively growing interconnection among domain-specific computational models and frameworks. From a computational science perspective, transdisciplinary approaches to integrated natural resources modelling and management (INRMM) [20] can exploit advanced geospatial modelling techniques with an awesome battery of free scientific software [21,22] for generating new information and knowledge from the plethora of composite data [23-26]. From the perspective
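
    A minimal sketch of the semantic-array-programming idea invoked above: concise array expressions guarded by lightweight semantic checks on inputs and outputs. The constraint names and the toy transformation are illustrative only, not the data-transformation models (D-TM) used in the paper.

```python
# Minimal sketch of the Semantic Array Programming idea referred to above:
# concise array expressions guarded by lightweight semantic checks on inputs
# and outputs. The constraint names and the toy transformation are
# illustrative only, not the data-transformation models used in the paper.
import numpy as np

def check(condition: bool, semantics: str) -> None:
    """Fail loudly when an array violates its declared semantics."""
    if not condition:
        raise ValueError(f"semantic constraint violated: {semantics}")

def normalise_layer(raster: np.ndarray) -> np.ndarray:
    """Toy D-TM step: rescale a non-negative raster layer to [0, 1]."""
    raster = np.asarray(raster, dtype=float)
    check(raster.ndim == 2, "::matrix:: input must be 2-D")
    check(np.all(np.isfinite(raster)), "::finite:: no NaN/Inf values")
    check(np.all(raster >= 0), "::non-negative:: values must be >= 0")
    out = raster / raster.max() if raster.max() > 0 else raster
    check(np.all((out >= 0) & (out <= 1)), "::in [0,1]:: output range")
    return out

if __name__ == "__main__":
    layer = np.array([[0.0, 2.0], [5.0, 10.0]])  # made-up raster values
    print(normalise_layer(layer))
```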

  8. Grid Computing

    Indian Academy of Sciences (India)

    2016-05-01

    A computing grid interconnects resources such as high performance computers, scientific databases, and computer-controlled scientific instruments of cooperating organizations, each of which is autonomous. It precedes and is quite different from cloud computing, which provides computing resources by vendors to customers on demand. In this article, we describe the grid computing model and enumerate the major differences between grid and cloud computing.

  9. Listmania. How lists can open up fresh possibilities for research in the history of science.

    Science.gov (United States)

    Delbourgo, James; Müller-Wille, Staffan

    2012-12-01

    Anthropologists, linguists, cultural historians, and literary scholars have long emphasized the value of examining writing as a material practice and have often invoked the list as a paradigmatic example thereof. This Focus section explores how lists can open up fresh possibilities for research in the history of science. Drawing on examples from the early modern period, the contributors argue that attention to practices of list making reveals important relations between mercantile, administrative, and scientific attempts to organize the contents of the world. Early modern lists projected both spatial and temporal visions of nature: they inventoried objects in the process of exchange and collection; they projected possible trajectories for future endeavor; they publicized the social identities of scientific practitioners; and they became research tools that transformed understandings of the natural order.

  10. Opening science the evolving guide on how the Internet is changing research, collaboration and scholarly publishing

    CERN Document Server

    Friesike, Sascha

    2014-01-01

    Modern information and communication technologies, together with a cultural upheaval within the research community, have profoundly changed research in nearly every aspect. Ranging from sharing and discussing ideas in social networks for scientists to new collaborative environments and novel publication formats, knowledge creation and dissemination as we know it is experiencing a vigorous shift towards increased transparency, collaboration and accessibility. Many assume that research workflows will change more in the next 20 years than they have in the last 200. This book provides researchers, decision makers, and other scientific stakeholders with a snapshot of the basics, the tools, and the underlying visions that drive the current scientific (r)evolution, often called ‘Open Science.’

  11. Supporting the advancement of science: Open access publishing and the role of mandates

    Directory of Open Access Journals (Sweden)

    Phelps Lisa

    2012-01-01

    Full Text Available In December 2011 the United States House of Representatives introduced a new bill, the Research Works Act (H.R.3699), which if passed could threaten the public's access to US government funded research. In a digital age when professional and lay parties alike look more and more to the online environment to keep up to date with developments in their fields, does this bill serve the best interests of the community? Those in support of the Research Works Act argue that government open access mandates undermine peer review and take intellectual property from publishers without compensation; however, journals like Journal of Translational Medicine show that this is not the case. Journal of Translational Medicine, in affiliation with the Society for Immunotherapy of Cancer, demonstrates how private and public organisations can work together for the advancement of science.

  12. Formatting Open Science: agilely creating multiple document formats for academic manuscripts with Pandoc Scholar

    Directory of Open Access Journals (Sweden)

    Albert Krewinkel

    2017-05-01

    Full Text Available The timely publication of scientific results is essential for dynamic advances in science. The ubiquitous availability of computers which are connected to a global network made the rapid and low-cost distribution of information through electronic channels possible. New concepts, such as Open Access publishing and preprint servers, are currently changing the traditional print media business towards a community-driven peer production. However, the cost of scientific literature generation, which is either charged to readers, authors or sponsors, is still high. The main active participants in the authoring and evaluation of scientific manuscripts are volunteers, and the cost for online publishing infrastructure is close to negligible. A major time and cost factor is the formatting of manuscripts in the production stage. In this article we demonstrate the feasibility of writing scientific manuscripts in plain Markdown (MD) text files, which can be easily converted into common publication formats, such as PDF, HTML or EPUB, using Pandoc. The simple syntax of Markdown assures the long-term readability of raw files and the development of software and workflows. We show the implementation of typical elements of scientific manuscripts—formulas, tables, code blocks and citations—and present tools for editing, collaborative writing and version control. We give an example of how to prepare a manuscript with distinct output formats, a DOCX file for submission to a journal, and a LaTeX/PDF version for deposition as a PeerJ preprint. Further, we implemented new features for supporting ‘semantic web’ applications, such as the ‘journal article tag suite’—JATS, and the ‘citation typing ontology’—CiTO standard. Reducing the work spent on manuscript formatting translates directly to time and cost savings for writers, publishers, readers and sponsors. Therefore, the adoption of the MD format contributes to the agile production of open science
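
    As a minimal sketch of the workflow described above, the following Python snippet drives Pandoc to turn a single Markdown master file into several publication formats. The file names are illustrative, and a recent Pandoc (2.11 or later, with built-in citation processing) is assumed to be installed and on the PATH; PDF output additionally requires a LaTeX engine.

        import subprocess

        # One plain-text Markdown master, several output formats.
        source = "paper.md"                      # hypothetical manuscript file
        common = ["--citeproc", "--bibliography", "refs.bib"]

        for target in ["paper.pdf", "paper.html", "paper.epub", "paper.docx"]:
            subprocess.run(["pandoc", source, *common, "-o", target], check=True)

    Keeping the manuscript in plain Markdown means the same source can be versioned with ordinary tools such as Git and re-rendered whenever a journal or preprint server requires a different format.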

  13. Grid Enabled Geospatial Catalogue Web Service

    Science.gov (United States)

    Chen, Ai-Jun; Di, Li-Ping; Wei, Ya-Xing; Liu, Yang; Bui, Yu-Qi; Hu, Chau-Min; Mehrotra, Piyush

    2004-01-01

    Geospatial Catalogue Web Service is a vital service for sharing and interoperating volumes of distributed heterogeneous geospatial resources, such as data, services, applications, and their replicas over the web. Based on Grid technology and the Open Geospatial Consortium (OGC)'s Catalogue Service - Web Information Model, this paper proposes a new information model for Geospatial Catalogue Web Service, named GCWS, which securely provides Grid-based publishing, managing and querying of geospatial data and services, and transparent access to replica data and related services under the Grid environment. This information model integrates the information model of the Grid Replica Location Service (RLS)/Monitoring & Discovery Service (MDS) with the information model of the OGC Catalogue Service (CSW), and refers to the geospatial data metadata standards from ISO 19115, FGDC and NASA EOS Core System and the service metadata standards from ISO 19119 to extend itself for expressing geospatial resources. Using GCWS, any valid geospatial user who belongs to an authorized Virtual Organization (VO) can securely publish and manage geospatial resources, and in particular query on-demand data in the virtual community and get it back through data-related services which provide functions such as subsetting, reformatting, reprojection, etc. This work facilitates geospatial resource sharing and interoperation under the Grid environment, and makes geospatial resources Grid-enabled and Grid technologies geospatial-enabled. It also lets researchers focus on science, and not on issues of computing capacity, data location, processing and management. GCWS is also a key component for workflow-based virtual geospatial data production.

  14. When data sharing gets close to 100%: what human paleogenetics can teach the open science movement.

    Science.gov (United States)

    Anagnostou, Paolo; Capocasa, Marco; Milia, Nicola; Sanna, Emanuele; Battaggia, Cinzia; Luzi, Daniela; Destro Bisol, Giovanni

    2015-01-01

    This study analyzes data sharing regarding mitochondrial, Y chromosomal and autosomal polymorphisms in a total of 162 papers on ancient human DNA published between 1988 and 2013. The estimated sharing rate was not far from totality (97.6% ± 2.1%) and substantially higher than observed in other fields of genetic research (evolutionary, medical and forensic genetics). Both a questionnaire-based survey and the examination of Journals' editorial policies suggest that this high sharing rate cannot be simply explained by the need to comply with stakeholders requests. Most data were made available through body text, but the use of primary databases increased in coincidence with the introduction of complete mitochondrial and next-generation sequencing methods. Our study highlights three important aspects. First, our results imply that researchers' awareness of the importance of openness and transparency for scientific progress may complement stakeholders' policies in achieving very high sharing rates. Second, widespread data sharing does not necessarily coincide with a prevalent use of practices which maximize data findability, accessibility, useability and preservation. A detailed look at the different ways in which data are released can be very useful to detect failures to adopt the best sharing modalities and understand how to correct them. Third and finally, the case of human paleogenetics tells us that a widespread awareness of the importance of Open Science may be important to build reliable scientific practices even in the presence of complex experimental challenges.

  15. When Data Sharing Gets Close to 100%: What Human Paleogenetics Can Teach the Open Science Movement

    Science.gov (United States)

    Anagnostou, Paolo; Capocasa, Marco; Milia, Nicola; Sanna, Emanuele; Battaggia, Cinzia; Luzi, Daniela; Destro Bisol, Giovanni

    2015-01-01

    This study analyzes data sharing regarding mitochondrial, Y chromosomal and autosomal polymorphisms in a total of 162 papers on ancient human DNA published between 1988 and 2013. The estimated sharing rate was not far from totality (97.6% ± 2.1%) and substantially higher than observed in other fields of genetic research (evolutionary, medical and forensic genetics). Both a questionnaire-based survey and the examination of Journals’ editorial policies suggest that this high sharing rate cannot be simply explained by the need to comply with stakeholders requests. Most data were made available through body text, but the use of primary databases increased in coincidence with the introduction of complete mitochondrial and next-generation sequencing methods. Our study highlights three important aspects. First, our results imply that researchers’ awareness of the importance of openness and transparency for scientific progress may complement stakeholders’ policies in achieving very high sharing rates. Second, widespread data sharing does not necessarily coincide with a prevalent use of practices which maximize data findability, accessibility, useability and preservation. A detailed look at the different ways in which data are released can be very useful to detect failures to adopt the best sharing modalities and understand how to correct them. Third and finally, the case of human paleogenetics tells us that a widespread awareness of the importance of Open Science may be important to build reliable scientific practices even in the presence of complex experimental challenges. PMID:25799293

  16. Social.Water--Open Source Citizen Science Software for CrowdHydrology

    Science.gov (United States)

    Fienen, M. N.; Lowry, C.

    2013-12-01

    CrowdHydrology is a crowd-sourced citizen science project in which passersby near streams are encouraged to read a gage and send an SMS (text) message with the water level to a number indicated on a sign. The project was initially started using free services such as Google Voice, Gmail, and Google Maps to acquire and present the data on the internet. Social.Water is open-source software, using Python and JavaScript, that automates the acquisition, categorization, and presentation of the data. Open-source objectives pervade both the project and the software: the code is hosted at Github, only free scripting languages are used, and any person or organization can install a gage and join the CrowdHydrology network. In the first year, 10 sites were deployed in upstate New York, USA. In the second year, expansion to 44 sites throughout the upper Midwest USA was achieved. Comparison with official USGS and academic measurements has shown low error rates. Citizen participation varies greatly from site to site, so surveys or other social information is sought for insight into why some sites experience higher rates of participation than others.
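
    A minimal sketch of the kind of message handling such a system needs is shown below; it is not the Social.Water code itself, and the station-identifier format and example messages are hypothetical.

        import re

        # Extract a station identifier and a water level (in feet) from a
        # crowdsourced text message such as "Water level at NY1001 is 3.25".
        MESSAGE_PATTERN = re.compile(
            r"(?P<station>[A-Z]{2}\d{4})\D+(?P<level>\d+(?:\.\d+)?)", re.I)

        def parse_report(body):
            """Return (station_id, water_level) or None if the text cannot be read."""
            match = MESSAGE_PATTERN.search(body)
            if match is None:
                return None
            return match.group("station").upper(), float(match.group("level"))

        print(parse_report("Water level at NY1001 is 3.25"))   # ('NY1001', 3.25)
        print(parse_report("thanks for the nice sign!"))        # None

    Real messages also contain misspellings, missing station codes and unit mistakes, so the categorization step in a production system needs more forgiving rules than this sketch.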

  17. Can Open Science save us from a solar-driven monsoon?

    Directory of Open Access Journals (Sweden)

    Laken Benjamin A.

    2016-01-01

    Full Text Available Numerous studies have been published claiming strong solar influences on the Earth’s weather and climate, many of which include documented errors and false-positives, yet are still frequently used to substantiate arguments of global warming denial. Recently, Badruddin & Aslam (2015) reported a highly significant relationship between the Indian monsoon and the cosmic ray flux. They found strong and opposing linear trends in the cosmic ray flux during composites of the strongest and weakest monsoons since 1964, and concluded that this relationship is causal. They further speculated that it could apply across the entire tropical and sub-tropical belt and be of global importance. However, examining the original data reveals the cause of this false-positive: an assumption that the data’s underlying distribution was Gaussian. Instead, due to the manner in which the composite samples were constructed, the correlations were biased towards high values. Incorrect or problematic statistical analyses such as this are typical in the field of solar-terrestrial studies, and consequently false-positives are frequently published. However, the widespread adoption of Open Science approaches, placing an emphasis on reproducible open-source analyses as demonstrated in this work, could remedy the situation.
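
    The statistical point can be made with a few lines of code. The sketch below is a toy illustration (not the analysis in the paper): it builds a "monsoon composite" from pure red noise, computes the trend of the composited "cosmic ray flux", and then compares that trend against a permutation null distribution instead of assuming a Gaussian sampling distribution.

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic, autocorrelated "cosmic ray flux": noise only, no real signal.
        n_years = 50
        flux = np.cumsum(rng.normal(size=n_years))
        years = np.arange(n_years)

        def composite_trend(members):
            """Linear trend of the flux over a composite of selected years."""
            return np.polyfit(years[members], flux[members], 1)[0]

        # A "strongest monsoon" composite chosen by an unrelated criterion
        # (here: at random), mimicking the composites criticised above.
        members = rng.choice(n_years, size=10, replace=False)
        observed = composite_trend(members)

        # Permutation null: rebuild the composite many times from the same record.
        null = np.array([composite_trend(rng.choice(n_years, 10, replace=False))
                         for _ in range(5000)])
        p_value = np.mean(np.abs(null) >= np.abs(observed))
        print(f"composite trend = {observed:.2f}, permutation p = {p_value:.3f}")

    Against such a resampling null, seemingly strong trends in autocorrelated data typically turn out to be unremarkable, which is exactly the kind of check an open, scripted analysis makes easy to rerun.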

  18. BioFed: federated query processing over life sciences linked open data.

    Science.gov (United States)

    Hasnain, Ali; Mehmood, Qaiser; Sana E Zainab, Syeda; Saleem, Muhammad; Warren, Claude; Zehra, Durre; Decker, Stefan; Rebholz-Schuhmann, Dietrich

    2017-03-15

    Biomedical data, e.g. from knowledge bases and ontologies, is increasingly made available following open linked data principles, at best as RDF triple data. This is a necessary step towards unified access to biological data sets, but this still requires solutions to query multiple endpoints for their heterogeneous data to eventually retrieve all the meaningful information. Suggested solutions are based on query federation approaches, which require the submission of SPARQL queries to endpoints. Due to the size and complexity of available data, these solutions have to be optimised for efficient retrieval times and for users in life sciences research. Last but not least, over time, the reliability of data resources in terms of access and quality has to be monitored. Our solution (BioFed) federates data over 130 SPARQL endpoints in life sciences and tailors query submission according to the provenance information. BioFed has been evaluated against the state-of-the-art solution FedX and forms an important benchmark for the life science domain. The efficient cataloguing approach of the federated query processing system 'BioFed', the triple-pattern-wise source selection and the semantic source normalisation form the core of our solution. It gathers and integrates data from newly identified public endpoints for federated access. Basic provenance information is linked to the retrieved data. Last but not least, BioFed makes use of the latest SPARQL standard (i.e., 1.1) to leverage the full benefits for query federation. The evaluation is based on 10 simple and 10 complex queries, which address data in 10 major and very popular data sources (e.g., Drugbank, Sider). BioFed is a solution for a single-point-of-access for a large number of SPARQL endpoints providing life science data. It facilitates efficient query generation for data access and provides basic provenance information in combination with the retrieved data. BioFed fully supports SPARQL 1.1 and gives access to the
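
    A small sketch of the SPARQL 1.1 federation feature that such a system builds on is given below, using the SPARQLWrapper Python package; the endpoint URLs, graph patterns and predicates are illustrative placeholders, not BioFed's actual sources or its source-selection logic.

        from SPARQLWrapper import SPARQLWrapper, JSON

        # SPARQL 1.1 federated query: part of the pattern is delegated to a
        # second endpoint via the SERVICE keyword.
        query = """
        PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
        SELECT ?drug ?label WHERE {
          ?drug rdfs:label ?label .
          SERVICE <http://example.org/sparql/side-effects> {
            ?drug ?p ?effect .
          }
        } LIMIT 10
        """

        endpoint = SPARQLWrapper("http://example.org/sparql/drugs")
        endpoint.setQuery(query)
        endpoint.setReturnFormat(JSON)
        results = endpoint.query().convert()
        for row in results["results"]["bindings"]:
            print(row["drug"]["value"], row["label"]["value"])

    A federation engine such as the one described above goes further than this single SERVICE clause: it decides, per triple pattern, which of the catalogued endpoints can contribute answers before the query is executed.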

  19. The FEMM Package: A Simple, Fast, and Accurate Open Source Electromagnetic Tool in Science and Engineering

    Directory of Open Access Journals (Sweden)

    K. B. Baltzis

    2008-01-01

    Full Text Available The finite element method (FEM is one of the most successful computational techniques for obtaining approximate solutions to the partial differential equations that arise in many scientific and engineering applications. Finite Element Method Magnetics (FEMM is a software package for solving electromagnetic problems using FEM. The program addresses 2D planar and 3D axisymmetric linear and nonlinear harmonic low frequency magnetic and magnetostatic problems and linear electrostatic problems. It is a simple, accurate, and low computational cost open source product, popular in science, engineering, and education. In this paper the main characteristics and functions of the package are presented. In order to demonstrate its use and exhibit the aid it offers in the study of electromagnetics a series of illustrative examples are given. The aim of the paper is to demonstrate the capability of FEMM to meet as a complementary tool the needs of science and technology especially when factors like the economic cost or the software complexity do not allow the use of commercial products.
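
    FEMM itself is driven through its graphical interface and scripting bindings, but the finite element idea it implements can be illustrated independently. The sketch below (a generic example, not the FEMM API) assembles and solves the 1D Poisson problem -u'' = 1 on [0, 1] with homogeneous boundary conditions using linear elements.

        import numpy as np

        n_el = 20                          # number of linear elements
        n_nodes = n_el + 1
        h = 1.0 / n_el

        K = np.zeros((n_nodes, n_nodes))   # global stiffness matrix
        F = np.zeros(n_nodes)              # global load vector

        for e in range(n_el):
            # Element stiffness and consistent load for linear shape functions.
            ke = (1.0 / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])
            fe = (h / 2.0) * np.array([1.0, 1.0])
            idx = [e, e + 1]
            K[np.ix_(idx, idx)] += ke
            F[idx] += fe

        # Dirichlet conditions u(0) = u(1) = 0: solve for the interior nodes only.
        interior = slice(1, n_nodes - 1)
        u = np.zeros(n_nodes)
        u[interior] = np.linalg.solve(K[interior, interior], F[interior])

        # The exact solution is u(x) = x(1 - x)/2, so the midpoint value is 0.125.
        print(u[n_nodes // 2])

    Packages such as FEMM wrap the same assemble-and-solve pattern for 2D planar and axisymmetric geometries, adding mesh generation, material libraries and post-processing on top.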

  20. Big data, open science and the brain: lessons learned from genomics.

    Science.gov (United States)

    Choudhury, Suparna; Fishman, Jennifer R; McGowan, Michelle L; Juengst, Eric T

    2014-01-01

    The BRAIN Initiative aims to break new ground in the scale and speed of data collection in neuroscience, requiring tools to handle data in the magnitude of yottabytes (10^24). The scale, investment and organization of it are being compared to the Human Genome Project (HGP), which has exemplified "big science" for biology. In line with the trend towards Big Data in genomic research, the promise of the BRAIN Initiative, as well as the European Human Brain Project, rests on the possibility to amass vast quantities of data to model the complex interactions between the brain and behavior and inform the diagnosis and prevention of neurological disorders and psychiatric disease. Advocates of this "data driven" paradigm in neuroscience argue that harnessing the large quantities of data generated across laboratories worldwide has numerous methodological, ethical and economic advantages, but it requires the neuroscience community to adopt a culture of data sharing and open access to benefit from them. In this article, we examine the rationale for data sharing among advocates and briefly exemplify these in terms of new "open neuroscience" projects. Then, drawing on the frequently invoked model of data sharing in genomics, we go on to demonstrate the complexities of data sharing, shedding light on the sociological and ethical challenges within the realms of institutions, researchers and participants, namely dilemmas around public/private interests in data, (lack of) motivation to share in the academic community, and potential loss of participant anonymity. Our paper serves to highlight some foreseeable tensions around data sharing relevant to the emergent "open neuroscience" movement.

  1. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    Science.gov (United States)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill; Feiereisen, William (Technical Monitor)

    2000-01-01

    The term "Grid" refers to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. The vision for NASN's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information based problem solving environments / frameworks that will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems. IPG development and deployment is addressing requirements obtained by analyzing a number of different application areas, in particular from the NASA Aero-Space Technology Enterprise. This analysis has focussed primarily on two types of users: The scientist / design engineer whose primary interest is problem solving (e.g., determining wing aerodynamic characteristics in many different operating environments), and whose primary interface to IPG will be through various sorts of problem solving frameworks. The second type of user if the tool designer: The computational scientists who convert physics and mathematics into code that can simulate the physical world. These are the two primary users of IPG, and they have rather different requirements. This paper describes the current state of IPG (the operational testbed), the set of capabilities being put into place for the operational prototype IPG, as well as some of the longer term R&D tasks.

  2. Towards a global participatory platform. Democratising open data, complexity science and collective intelligence

    Science.gov (United States)

    Buckingham Shum, S.; Aberer, K.; Schmidt, A.; Bishop, S.; Lukowicz, P.; Anderson, S.; Charalabidis, Y.; Domingue, J.; de Freitas, S.; Dunwell, I.; Edmonds, B.; Grey, F.; Haklay, M.; Jelasity, M.; Karpištšenko, A.; Kohlhammer, J.; Lewis, J.; Pitt, J.; Sumner, R.; Helbing, D.

    2012-11-01

    The FuturICT project seeks to use the power of big data, analytic models grounded in complexity science, and the collective intelligence they yield for societal benefit. Accordingly, this paper argues that these new tools should not remain the preserve of restricted government, scientific or corporate élites, but be opened up for societal engagement and critique. To democratise such assets as a public good, requires a sustainable ecosystem enabling different kinds of stakeholder in society, including but not limited to, citizens and advocacy groups, school and university students, policy analysts, scientists, software developers, journalists and politicians. Our working name for envisioning a sociotechnical infrastructure capable of engaging such a wide constituency is the Global Participatory Platform (GPP). We consider what it means to develop a GPP at the different levels of data, models and deliberation, motivating a framework for different stakeholders to find their ecological niches at different levels within the system, serving the functions of (i) sensing the environment in order to pool data, (ii) mining the resulting data for patterns in order to model the past/present/future, and (iii) sharing and contesting possible interpretations of what those models might mean, and in a policy context, possible decisions. A research objective is also to apply the concepts and tools of complexity science and social science to the project's own work. We therefore conceive the global participatory platform as a resilient, epistemic ecosystem, whose design will make it capable of self-organization and adaptation to a dynamic environment, and whose structure and contributions are themselves networks of stakeholders, challenges, issues, ideas and arguments whose structure and dynamics can be modelled and analysed.

  3. Participation of Environmental Science Students in an Open Discussion "Riga - European Green Capital"

    Science.gov (United States)

    Dace, Elina; Berzina, Alise; Ozolina, Liga; Lorence, Ieva

    2010-01-01

    Starting from the year 2010, each year one European city is selected as the European Green Capital of the year. The award is granted to a city that has a consistent record of achieving high environmental standards, and is committed to ongoing and ambitious goals for further environmental improvement and sustainable development, as well as can act as a role model to inspire other cities and promote best practices to other European cities. Riga participated in the competition once, but did not fulfill the conditions, therefore an open discussion "Riga - European Green Capital" was organized by a nongovernmental organization "Association of Environmental Science Students". The aim of the discussion was to develop suggestions for the Riga city council on how to win the title "European Green Capital". Students of technical and engineering sciences were involved in the discussion to give their vision on what is needed for the city to comply with all the criteria of the competition. Thus, another aim of the discussion was to promote collaboration between students and the Riga city council in terms of environmental thinking. As a result of the discussion, a nine-page letter was prepared with recommendations to the Riga city mayor on how to develop the city in a sustainable manner and outlining benefits which could arise if the city of Riga got the title. However, the most important outcome of the discussion are the skills which students gained from the experience of presenting their ideas and discussing them with specialists of the specific field. This should help in further studies and work, as well as in individual professional development. The discussions were also a starting point for further collaboration between the Riga city council and students from the Association of Latvian Environmental Science Students.

  4. OpenSearch (ECHO-ESIP) & REST API for Earth Science Data Access

    Science.gov (United States)

    Mitchell, A.; Cechini, M.; Pilone, D.

    2010-12-01

    This presentation will provide a brief technical overview of OpenSearch, the Earth Science Information Partners (ESIP) Federated Search framework, and the REST architecture; discuss NASA’s Earth Observing System (EOS) ClearingHOuse’s (ECHO) implementation lessons learned; and demonstrate the simplified usage of these technologies. SOAP, as a framework for web service communication has numerous advantages for Enterprise applications and Java/C# type programming languages. As a technical solution, SOAP has been a reliable framework on top of which many applications have been successfully developed and deployed. However, as interest grows for quick development cycles and more intriguing “mashups,” the SOAP API loses its appeal. Lightweight and simple are the vogue characteristics that are sought after. Enter the REST API architecture and OpenSearch format. Both of these items provide a new path for application development addressing some of the issues unresolved by SOAP. ECHO has made available all of its discovery, order submission, and data management services through a publicly accessible SOAP API. This interface is utilized by a variety of ECHO client and data partners to provide valuable capabilities to end users. As ECHO interacted with current and potential partners looking to develop Earth Science tools utilizing ECHO, it became apparent that the development overhead required to interact with the SOAP API was a growing barrier to entry. ECHO acknowledged the technical issues that were being uncovered by its partner community and chose to provide two new interfaces for interacting with the ECHO metadata catalog. The first interface is built upon the OpenSearch format and ESIP Federated Search framework. Leveraging these two items, a client (ECHO-ESIP) was developed with a focus on simplified searching and results presentation. The second interface is built upon the Representational State Transfer (REST) architecture. Leveraging the REST architecture, a
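
    A hedged sketch of why REST-style interfaces lower the barrier to entry: a search becomes a single HTTP GET with query parameters, as below. The URL and parameter names are hypothetical placeholders rather than the actual ECHO OpenSearch contract; real clients should first fetch the service's OpenSearch description document to discover the supported parameters.

        import requests

        url = "https://example.org/opensearch/granules.atom"   # hypothetical endpoint
        params = {
            "keyword": "sea surface temperature",
            "startTime": "2010-01-01T00:00:00Z",
            "endTime": "2010-01-31T23:59:59Z",
            "numberOfResults": 10,
        }

        response = requests.get(url, params=params, timeout=30)
        response.raise_for_status()
        # The response is typically an Atom feed with one <entry> per matching granule.
        print(response.headers.get("Content-Type"))
        print(response.text[:300])

    Compared with a SOAP client, there is no WSDL tooling, no envelope construction and no generated stub code, which is precisely the overhead the abstract identifies as a barrier for lightweight mashups.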

  5. New Tasks for Academic Libraries: The Example of the Open Science Lab

    Directory of Open Access Journals (Sweden)

    Lambert Heller

    2015-10-01

    Full Text Available Given the rise of many new digital tools and methods for supporting scientific work, the last five years have seen increasingly frequent discussion amongst German academic librarians about innovation management. How can relevant trends and challenges be recognized in time and taken up adequately with the limited resources of a public sector institution, up to the point of changing the whole library strategy? The paper presents the model of the Open Science Lab, which was set up at the German National Library of Science and Technology (TIB) Hannover in 2013. Under the direction of the author, the lab group keeps track of trends and takes selected ones up in order to try out new digital tools and methods in close collaboration with researchers, to cultivate a new information practice and to derive from it innovations for the library's range of services. This is illustrated and discussed using the two focus areas of collaborative writing and linked-data-based research information systems (FIS).

  6. The BioGRID interaction database: 2015 update.

    Science.gov (United States)

    Chatr-Aryamontri, Andrew; Breitkreutz, Bobby-Joe; Oughtred, Rose; Boucher, Lorrie; Heinicke, Sven; Chen, Daici; Stark, Chris; Breitkreutz, Ashton; Kolas, Nadine; O'Donnell, Lara; Reguly, Teresa; Nixon, Julie; Ramage, Lindsay; Winter, Andrew; Sellam, Adnane; Chang, Christie; Hirschman, Jodi; Theesfeld, Chandra; Rust, Jennifer; Livstone, Michael S; Dolinski, Kara; Tyers, Mike

    2015-01-01

    The Biological General Repository for Interaction Datasets (BioGRID: http://thebiogrid.org) is an open access database that houses genetic and protein interactions curated from the primary biomedical literature for all major model organism species and humans. As of September 2014, the BioGRID contains 749,912 interactions as drawn from 43,149 publications that represent 30 model organisms. This interaction count represents a 50% increase compared to our previous 2013 BioGRID update. BioGRID data are freely distributed through partner model organism databases and meta-databases and are directly downloadable in a variety of formats. In addition to general curation of the published literature for the major model species, BioGRID undertakes themed curation projects in areas of particular relevance for biomedical sciences, such as the ubiquitin-proteasome system and various human disease-associated interaction networks. BioGRID curation is coordinated through an Interaction Management System (IMS) that facilitates the compilation of interaction records through structured evidence codes, phenotype ontologies, and gene annotation. The BioGRID architecture has been improved in order to support a broader range of interaction and post-translational modification types, to allow the representation of more complex multi-gene/protein interactions, to account for cellular phenotypes through structured ontologies, to expedite curation through semi-automated text-mining approaches, and to enhance curation quality control. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  7. Opening up Openness to Experience: A Four-Factor Model and Relations to Creative Achievement in the Arts and Sciences

    Science.gov (United States)

    Kaufman, Scott Barry

    2013-01-01

    Openness to experience is the broadest personality domain of the Big Five, including a mix of traits relating to intellectual curiosity, intellectual interests, perceived intelligence, imagination, creativity, artistic and aesthetic interests, emotional and fantasy richness, and unconventionality. Likewise, creative achievement is a broad…

  8. Opening up Openness to Experience: A Four-Factor Model and Relations to Creative Achievement in the Arts and Sciences

    Science.gov (United States)

    Kaufman, Scott Barry

    2013-01-01

    Openness to experience is the broadest personality domain of the Big Five, including a mix of traits relating to intellectual curiosity, intellectual interests, perceived intelligence, imagination, creativity, artistic and aesthetic interests, emotional and fantasy richness, and unconventionality. Likewise, creative achievement is a broad…

  9. Edaq530: A Transparent, Open-End and Open-Source Measurement Solution in Natural Science Education

    Science.gov (United States)

    Kopasz, Katalin; Makra, Peter; Gingl, Zoltan

    2011-01-01

    We present Edaq530, a low-cost, compact and easy-to-use digital measurement solution consisting of a thumb-sized USB-to-sensor interface and measurement software. The solution is fully open-source, our aim being to provide a viable alternative to professional solutions. Our main focus in designing Edaq530 has been versatility and transparency. In…

  10. OPEN INNOVATION PROJECT: THE SYSTEM OF ONLINE INDICATORS IN SCIENCE, TECHNOLOGY AND INNOVATION OF AMAZONAS (SiON)

    Directory of Open Access Journals (Sweden)

    Moises Andrade Coelho

    2016-05-01

    Full Text Available This study aims to evaluate the implementation of an open innovation project in a public institution in the state of Amazonas. The theoretical and empirical background deals with science, technology and innovation indicators and with open innovation. The study is characterized as qualitative and descriptive research, with the case study as the methodological procedure. The universe was delimited to a public institution in the area of science, technology and innovation (ST&I). In the case study, an assessment approach was used as a tool to evaluate the implementation of open innovation projects. The results present the several stages of the open innovation project analyzed. The study demonstrates the implications of adopting an open innovation project for the strengthening of external networks and the maturing of the internal environment. The relevance of the study lies in the evaluation of an open innovation project in a public institution in order to foster the transition from traditional innovation processes to open innovation processes.

  11. HP advances Grid Strategy for the adaptive enterprise

    CERN Multimedia

    2003-01-01

    "HP today announced plans to further enable its enterprise infrastructure technologies for grid computing. By leveraging open grid standards, HP plans to help customers simplify the use and management of distributed IT resources. The initiative will integrate industry grid standards, including the Globus Toolkit and Open Grid Services Architecture (OGSA), across HP's enterprise product lines" (1 page).

  12. Long-term Science Data Curation Using a Digital Object Model and Open-Source Frameworks

    Science.gov (United States)

    Pan, J.; Lenhardt, W.; Wilson, B. E.; Palanisamy, G.; Cook, R. B.

    2010-12-01

    Scientific digital content, including Earth Science observations and model output, has become more heterogeneous in format and more distributed across the Internet. In addition, data and metadata are becoming necessarily linked internally and externally on the Web. As a result, such content has become more difficult for providers to manage and preserve and for users to locate, understand, and consume. Specifically, it is increasingly harder to deliver relevant metadata and data processing lineage information along with the actual content consistently. Readme files, data quality information, production provenance, and other descriptive metadata are often separated in the storage level as well as in the data search and retrieval interfaces available to a user. Critical archival metadata, such as auditing trails and integrity checks, are often even more difficult for users to access, if they exist at all. We investigate the use of several open-source software frameworks to address these challenges. We use Fedora Commons Framework and its digital object abstraction as the repository, Drupal CMS as the user-interface, and the Islandora module as the connector from Drupal to Fedora Repository. With the digital object model, metadata of data description and data provenance can be associated with data content in a formal manner, so are external references and other arbitrary auxiliary information. Changes are formally audited on an object, and digital contents are versioned and have checksums automatically computed. Further, relationships among objects are formally expressed with RDF triples. Data replication, recovery, metadata export are supported with standard protocols, such as OAI-PMH. We provide a tentative comparative analysis of the chosen software stack with the Open Archival Information System (OAIS) reference model, along with our initial results with the existing terrestrial ecology data collections at NASA’s ORNL Distributed Active Archive Center for
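
    The abstract mentions OAI-PMH as the standard protocol for exposing such repository content. As a minimal illustration, the sketch below harvests Dublin Core records from a repository and prints identifiers and titles; the base URL is hypothetical, while the verb and metadataPrefix parameters are defined by the OAI-PMH specification.

        import requests
        import xml.etree.ElementTree as ET

        BASE_URL = "https://example.org/oai"     # hypothetical OAI-PMH endpoint
        params = {"verb": "ListRecords", "metadataPrefix": "oai_dc"}

        response = requests.get(BASE_URL, params=params, timeout=60)
        response.raise_for_status()

        root = ET.fromstring(response.content)
        OAI = "{http://www.openarchives.org/OAI/2.0/}"
        DC = "{http://purl.org/dc/elements/1.1/}"
        for record in root.iter(OAI + "record"):
            identifier = record.find(f"{OAI}header/{OAI}identifier")
            title = record.find(f".//{DC}title")
            print(identifier.text if identifier is not None else "?",
                  "-", title.text if title is not None else "(no title)")

    A full harvester would also follow the resumptionToken returned for large result sets; Fedora-based repositories typically expose this interface alongside their native object APIs.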

  13. Open source software and low cost sensors for teaching UAV science

    Science.gov (United States)

    Kefauver, S. C.; Sanchez-Bragado, R.; El-Haddad, G.; Araus, J. L.

    2016-12-01

    Drones, also known as UASs (unmanned aerial systems), UAVs (unmanned aerial vehicles) or RPAS (remotely piloted aircraft systems), are both useful advanced scientific platforms and recreational toys that are appealing to younger generations. As such, they can make for excellent education tools as well as low-cost scientific research project alternatives. However, the process of going from taking pretty pictures to remote sensing science can be daunting if one is presented with only expensive software and sensor options. There are a number of open-source tools and low-cost platform and sensor options available that can provide excellent scientific research results, and, by often requiring more user involvement than commercial software and sensors, provide even greater educational benefits. Scale-invariant feature transform (SIFT) algorithm implementations include the Microsoft Image Composite Editor (ICE), which can create quality 2D image mosaics with some motion and terrain adjustments, and VisualSFM (Structure from Motion), which can provide full image mosaicking with movement and orthorectification capacities. RGB image quantification using alternate color space transforms, such as the BreedPix indices, can be calculated via plugins in the open-source software Fiji (http://fiji.sc/Fiji; http://github.com/george-haddad/CIMMYT). Recent analyses of aerial images from UAVs over different vegetation types and environments have shown that RGB metrics can outperform more costly commercial sensors. Specifically, hue-based pixel counts, the Triangle Greenness Index (TGI), and the Normalized Green Red Difference Index (NGRDI) consistently outperformed NDVI in estimating abiotic and biotic stress impacts on crop health. Also, simple kits are available for NDVI camera conversions. Furthermore, suggestions for multivariate analyses of the different RGB indices in the "R program for statistical computing", such as classification and regression trees, can allow for a more approachable
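
    The RGB indices named above have simple closed forms, so they are easy to compute with open-source tools. The sketch below evaluates NGRDI, a simplified broadband TGI and a hue-based green-pixel fraction on a synthetic image; the TGI coefficients and the hue thresholds are assumptions to be checked against the original references, and a real workflow would read the mosaic with a library such as Pillow or rasterio instead of generating random data.

        import colorsys
        import numpy as np

        rng = np.random.default_rng(1)
        rgb = rng.random((100, 100, 3))            # stand-in for a UAV mosaic, values in 0-1
        R, G, B = rgb[..., 0], rgb[..., 1], rgb[..., 2]

        # Normalized Green Red Difference Index: (G - R) / (G + R).
        ngrdi = (G - R) / (G + R + 1e-9)

        # Triangle Greenness Index, simplified broadband form (assumed coefficients).
        tgi = G - 0.39 * R - 0.61 * B

        # Hue-based pixel count in the spirit of the BreedPix indices:
        # fraction of pixels whose hue falls in an assumed "green" band.
        hsv = np.apply_along_axis(lambda p: colorsys.rgb_to_hsv(*p), 2, rgb)
        hue_deg = hsv[..., 0] * 360.0
        green_fraction = np.mean((hue_deg > 60.0) & (hue_deg < 180.0))

        print(f"mean NGRDI {ngrdi.mean():.3f}, mean TGI {tgi.mean():.3f}, "
              f"green fraction {green_fraction:.3f}")

    NDVI, by contrast, needs a near-infrared band, which is why RGB-only indices like these are attractive for low-cost, unmodified cameras.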

  14. An open source approach to enable the reproducibility of scientific workflows in the ocean sciences

    Science.gov (United States)

    Di Stefano, M.; Fox, P. A.; West, P.; Hare, J. A.; Maffei, A. R.

    2013-12-01

    Every scientist should be able to rerun data analyses conducted by his or her team and regenerate the figures in a paper. However, all too often the correct version of a script goes missing, or the original raw data is filtered by hand and the filtering process is undocumented, or there is a lack of collaboration and communication among scientists working in a team. Here we present 3 different use cases in ocean sciences in which end-to-end workflows are tracked. The main tool that is deployed to address these use cases is based on a web application (IPython Notebook) that provides the ability to work on very diverse and heterogeneous data and information sources, providing an effective way to share and track changes to the source code used to generate data products and associated metadata, as well as to track the overall workflow provenance to allow versioned reproducibility of a data product. Use cases selected for this work are: 1) A partial reproduction of the Ecosystem Status Report (ESR) for the Northeast U.S. Continental Shelf Large Marine Ecosystem. Our goal with this use case is to enable not just the traceability but also the reproducibility of this biannual report, keeping track of all the processes behind the generation and validation of time-series and spatial data and information products. An end-to-end workflow with code versioning is developed so that indicators in the report may be traced back to the source datasets. 2) Real-time generation of web pages to be able to visualize one of the environmental indicators from the Ecosystem Advisory for the Northeast Shelf Large Marine Ecosystem web site. 3) Data and visualization integration for ocean climate forecasting. In this use case, we focus on a workflow to describe how to provide access to online data sources in the NetCDF format and other model data, and make use of multicore processing to generate video animation from time series of gridded data. For each use case we show how complete workflows
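
    A minimal sketch of the kind of step such notebooks capture is shown below: opening a gridded NetCDF file, reading a variable and reducing it to a time series. The file and variable names are hypothetical, and the netCDF4 package is assumed to be available.

        import numpy as np
        from netCDF4 import Dataset

        with Dataset("sst_monthly.nc") as nc:          # illustrative file name
            sst = nc.variables["sst"][:]               # (time, lat, lon) masked array
            time = nc.variables["time"][:]

        # Unweighted spatial mean per time step, ignoring masked or invalid cells.
        series = np.ma.masked_invalid(sst).reshape(sst.shape[0], -1).mean(axis=1)
        print(len(time), "time steps; first value:", float(series[0]))

    Because the notebook records this code next to its output, a figure in a report such as the ESR can be traced back to the exact file, variable and reduction that produced it.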

  15. Evaluating mobile centric readiness of students: A case of computer science students in open-distance learning

    CSIR Research Space (South Africa)

    Chipangura, B

    2015-07-01

    Full Text Available This study examined the mobile centric readiness of Computer Science students at an Open and Distance Learning (ODL) university in South Africa. Quantitative data was captured through a survey and a total of 129 students responded to the survey...

  16. Software Uncertainty in Integrated Environmental Modelling: the role of Semantics and Open Science

    Science.gov (United States)

    de Rigo, Daniele

    2013-04-01

    growing debate on open science and scientific knowledge freedom [2,56-59]. In particular, the role of free software has been underlined within the paradigm of reproducible research [50,58-60]. In the spectrum of reproducibility, the free availability of the source code is emphasized [58] as the first step from non-reproducible research (only based on classic peer-reviewed publications) toward reproducibility. Applying this paradigm to WSTMe, an alternative strategy to black-boxes would suggest exposing not only final outputs but also key intermediate layers of data and information along with the corresponding free software D-TM modules. A concise, semantically-enhanced modularization [14,15] may help not only to see the code (as a very basic prerequisite for semantic transparency) but also to understand - and correct - it [61]. Semantically-enhanced, concise modularization is e.g. supported by semantic array programming (SemAP) [14,15] and its extension to geospatial problems [8,10]. Some WSTMe may surely be classified in the subset of software systems which "are growing well past the ability of a small group of people to completely understand the content", while "data from these systems are often used for critical decision making" [52]. In this context, the further uncertainty arising from the unpredicted "(not to say unpredictable)" [53] behaviour of software error propagation in WSTMe should be explicitly considered as software uncertainty [62,63]. The data and information flow of a black-box D-TM is often a (hidden) composition of D-TM modules: semantics and design diversity. Silent faults [64] are a critical class of software errors altering computation output without evident symptoms - such as premature interruption of the computation (exceptions, error messages, ...), obviously unrealistic results or computation patterns (e.g. noticeably shorter/longer or endless computations). As it has been underlined, "many scientific results are corrupted, perhaps fatally so, by
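
    A minimal sketch of the semantic-check idea (inspired by, but not taken from, the semantic array programming tools cited above) is given below: a small D-TM module states its assumptions on array inputs and outputs as explicit, machine-checkable constraints, so that silent faults upstream are more likely to surface as loud failures.

        import numpy as np

        def check(condition, constraint):
            """Fail loudly when a declared semantic constraint is violated."""
            if not condition:
                raise ValueError(f"semantic check failed: {constraint}")

        def relative_cover(area_by_class, total_area):
            """Toy D-TM module: share of total area covered by each land-cover class."""
            a = np.asarray(area_by_class, dtype=float)
            check(np.all(np.isfinite(a)) and np.all(a >= 0), "::non-negative:: area_by_class")
            check(total_area > 0, "::positive:: total_area")
            check(a.sum() <= total_area * (1 + 1e-9), "class areas must not exceed total_area")
            out = a / total_area
            check(np.all((out >= 0) & (out <= 1)), "::proportion:: output")  # post-condition
            return out

        print(relative_cover([30.0, 50.0, 20.0], 100.0))   # [0.3 0.5 0.2]

    Exposing such checks, together with the intermediate data layers they guard, is one concrete way to turn a black-box D-TM composition into something reviewable.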

  17. Monitoring of Grid scientific workflows

    NARCIS (Netherlands)

    Balis, B.; Bubak, M.; Łabno, B.

    2008-01-01

    Scientific workflows are a means of conducting in silico experiments in modern computing infrastructures for e-Science, often built on top of Grids. Monitoring of Grid scientific workflows is essential not only for performance analysis but also to collect provenance data and gather feedback useful

  18. Monitoring of Grid scientific workflows

    NARCIS (Netherlands)

    Balis, B.; Bubak, M.; Łabno, B.

    2008-01-01

    Scientific workflows are a means of conducting in silico experiments in modern computing infrastructures for e-Science, often built on top of Grids. Monitoring of Grid scientific workflows is essential not only for performance analysis but also to collect provenance data and gather feedback useful i

  19. USA National Phenology Network gridded products documentation

    Science.gov (United States)

    Crimmins, Theresa M.; Marsh, R. Lee; Switzer, Jeff R.; Crimmins, Michael A.; Gerst, Katharine L.; Rosemartin, Alyssa H.; Weltzin, Jake F.

    2017-02-23

    The goals of the USA National Phenology Network (USA-NPN, www.usanpn.org) are to advance science, inform decisions, and communicate and connect with the public regarding phenology and species’ responses to environmental variation and climate change. The USA-NPN seeks to facilitate informed ecosystem stewardship and management by providing phenological information freely and openly. One way the USA-NPN is endeavoring to accomplish these goals is by providing data and data products in a wide range of formats, including gridded real-time, short-term forecasted, and historical maps of phenological events, patterns and trends. This document describes the suite of gridded phenologically relevant data products produced and provided by the USA National Phenology Network, which can be accessed at www.usanpn.org/data/phenology_maps and also through web services at geoserver.usanpn.org/geoserver/wms?request=GetCapabilities.
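
    The web-service endpoint quoted above follows the standard OGC WMS interface, so generic clients can discover the available phenology layers programmatically. The sketch below requests the capabilities document and lists layer names; it assumes the server speaks WMS 1.3.0 (namespace http://www.opengis.net/wms) and that HTTPS access is enabled.

        import requests
        import xml.etree.ElementTree as ET

        url = "https://geoserver.usanpn.org/geoserver/wms"
        response = requests.get(url, params={"request": "GetCapabilities"}, timeout=60)
        response.raise_for_status()

        root = ET.fromstring(response.content)
        ns = {"wms": "http://www.opengis.net/wms"}
        for name in root.findall(".//wms:Layer/wms:Name", ns):
            print(name.text)

    The same endpoint can then serve GetMap requests for individual gridded products, which is how the real-time and historical maps mentioned above are typically embedded in other applications.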

  20. How partnership accelerates Open Science: High Energy Physics and INSPIRE, a case study of a complex repository ecosystem

    CERN Document Server

    AUTHOR|(CDS)2079501; Hecker, Bernard Louis; Holtkamp, Annette; Mele, Salvatore; O'Connell, Heath; Sachs, Kirsten; Simko, Tibor; Schwander, Thorsten

    2013-01-01

    Public calls, agency mandates and scientist demand for Open Science are by now a reality with different nuances across diverse research communities. A complex “ecosystem” of services and tools, mostly community-driven, will underpin this revolution in science. Repositories stand to accelerate this process, as “openness” evolves beyond text, in lockstep with scholarly communication. We present a case study of a global discipline, High-Energy Physics (HEP), where most of these transitions have already taken place in a “social laboratory” of multiple global information services interlinked in a complex, but successful, ecosystem at the service of scientists. We discuss our first-hand experience, at a technical and organizational level, of leveraging partnership across repositories and with the user community in support of Open Science, along threads relevant to the OR2013 community.

  1. Fibonacci Grids

    Science.gov (United States)

    Swinbank, Richard; Purser, James

    2006-01-01

    Recent years have seen a resurgence of interest in a variety of non-standard computational grids for global numerical prediction. The motivation has been to reduce problems associated with the converging meridians and the polar singularities of conventional regular latitude-longitude grids. A further impetus has come from the adoption of massively parallel computers, for which it is necessary to distribute work equitably across the processors; this is more practicable for some non-standard grids. Desirable attributes of a grid for high-order spatial finite differencing are: (i) geometrical regularity; (ii) a homogeneous and approximately isotropic spatial resolution; (iii) a low proportion of the grid points where the numerical procedures require special customization (such as near coordinate singularities or grid edges). One family of grid arrangements which, to our knowledge, has never before been applied to numerical weather prediction, but which appears to offer several technical advantages, are what we shall refer to as "Fibonacci grids". They can be thought of as mathematically ideal generalizations of the patterns occurring naturally in the spiral arrangements of seeds and fruit found in sunflower heads and pineapples (to give two of the many botanical examples). These grids possess virtually uniform and highly isotropic resolution, with an equal area for each grid point. There are only two compact singular regions on a sphere that require customized numerics. We demonstrate the practicality of these grids in shallow water simulations, and discuss the prospects for efficiently using these frameworks in three-dimensional semi-implicit and semi-Lagrangian weather prediction or climate models.
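
    One common way to realise such a point set (a golden-angle spiral lattice, closely related to, though not identical with, the Fibonacci grids the paper defines via pairs of Fibonacci numbers) is sketched below: N points with nearly uniform, isotropic density and an equal associated area per point.

        import numpy as np

        def golden_angle_sphere(n):
            """Latitude/longitude (degrees) of n near-uniform points on the sphere."""
            i = np.arange(n)
            golden_angle = np.pi * (3.0 - np.sqrt(5.0))   # about 2.39996 radians
            z = 1.0 - 2.0 * (i + 0.5) / n                 # equal-area bands in z
            lat = np.degrees(np.arcsin(z))
            lon = np.degrees((i * golden_angle) % (2.0 * np.pi))
            return lat, lon

        lat, lon = golden_angle_sphere(1000)
        print(lat[:3], lon[:3])

    The appeal for numerical weather prediction is exactly the set of properties listed above: no converging meridians, no polar clustering, and nearly homogeneous resolution with only two small singular regions needing special treatment.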

  2. Virtual microscopy in medical research: Open European Nephrology Science Center (OpEN.SC)

    Science.gov (United States)

    Schrader, Thomas; Beil, Michael; Schmidt, Danilo; Dietel, Manfred; Lindemann, Gabriela

    2007-03-01

    The amount and heterogeneity of data in biomedical research, notably in transnational research, requires new methods for the collection, presentation and analysis of information. Important data from laboratory experiments as well as patient trials are available as images. Thus, the integration and processing of image data represent a crucial component of information systems in biomedical research. The Charité Medical School in Berlin has established a new information service center for kidney diseases and transplantation (Open European Nephrology Science Centre - OpEN.SC) together with the German Research Agency (DFG). The aims of this project are (i) to improve the availability of raw data, (ii) to establish an infrastructure for clinical trials, (iii) to monitor the occurrence of rare disease patterns and (iv) to establish a quality assurance system. Major diagnostic procedures in medicine are based on the processing and analysis of image data. In diagnostic pathology, the availability of automated slide scanners provide the opportunity to digitize entire microscopic slides. The processing, presentation and analysis of these image data are called virtual microscopy. The integration of this new technology into the OpEN.SC system and the link to other heterogeneous data of individual patients represent a major technological challenge. Thus, new ways in communication between clinical and scientific partners have to be established and will be promoted by the project. The technological basis of the repository are web services for a scalable and adaptable system. HL7 and DICOM are considered the main medical standards of communication.

  3. Lead grids

    CERN Multimedia

    1974-01-01

    One of the 150 lead grids used in the multiwire proportional chamber g-ray detector. The 0.75 mm diameter holes are spaced 1 mm centre to centre. The grids were made by chemical cutting techniques in the Godet Workshop of the SB Physics.

  4. Investigating What Undergraduate Students Know About Science: Results from Complementary Strategies to Code Open-Ended Responses

    Science.gov (United States)

    Tijerino, K.; Buxner, S.; Impey, C.; CATS

    2013-04-01

    This paper presents new findings from an ongoing study of undergraduate student science literacy. Using data drawn from a 22-year project and over 11,000 student responses, we present how students' word usage in open-ended responses relates to what it means to study something scientifically. Analysis of students' responses shows that they easily use words commonly associated with science, such as hypothesis, study, method, test, and experiment; but do these responses use scientific words knowledgeably? As with many multifaceted disciplines, demonstration of comprehension varies. This paper presents three different ways that student responses have been coded to investigate their understanding of science: 1) differentiating the quality of a response with a coding scheme; 2) using word counts as an indicator of overall response strength; and 3) coding responses for the quality of the students' answers. Building on previous research, comparison of science literacy and open-ended responses demonstrates that knowledge of science facts and vocabulary does not indicate a comprehension of the concepts behind these facts and vocabulary. This study employs quantitative and qualitative methods to systematically determine the frequency and meaning of responses to standardized questions, and illustrates how students are able to demonstrate a knowledge of vocabulary. However, this knowledge is not indicative of conceptual understanding and poses important questions about how we assess students' understandings of science.

  5. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    Science.gov (United States)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill

    2000-01-01

    We use the term "Grid" to refer to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. This infrastructure includes: (1) Tools for constructing collaborative, application oriented Problem Solving Environments / Frameworks (the primary user interfaces for Grids); (2) Programming environments, tools, and services providing various approaches for building applications that use aggregated computing and storage resources, and federated data sources; (3) Comprehensive and consistent set of location independent tools and services for accessing and managing dynamic collections of widely distributed resources: heterogeneous computing systems, storage systems, real-time data sources and instruments, human collaborators, and communications systems; (4) Operational infrastructure including management tools for distributed systems and distributed resources, user services, accounting and auditing, strong and location independent user authentication and authorization, and overall system security services The vision for NASA's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information based problem solving environments / frameworks. Such Grids will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems. Examples of these problems include: (1) Coupled, multidisciplinary simulations too large for single systems (e.g., multi-component NPSS turbomachine simulation); (2) Use of widely distributed, federated data archives (e.g., simultaneous access to metrological, topological, aircraft performance, and flight path scheduling databases supporting a National Air Space Simulation systems}; (3

  7. Keeping the door open: romantic science and the experience of self

    OpenAIRE

    Halliwell, Martin

    1996-01-01

    The thesis positions three modern thinkers working in different areas of the human sciences - William James, Ludwig Binswanger and Oliver Sacks - within a framework of romantic science. Romantic science is a term which is developed explicitly in the work of Sacks and also illuminates the central concerns of James and Binswanger. As such, romantic science provides a useful framework in which to discuss conceptual changes in the medical humanities (a branch of the human sciences directed to pati...

  8. Edaq530: a transparent, open-end and open-source measurement solution in natural science education

    CERN Document Server

    Kopasz, Katalin; Gingl, Zoltán

    2010-01-01

    We present Edaq530, a compact and easy-to-use digital measurement solution consisting of a thumb-sized USB-to-sensor interface and measurement software. The solution is fully open source, our aim being to provide a viable alternative to professional solutions. Our main focus in designing Edaq530 has been versatility and transparency. In this article, we shall introduce the capabilities of Edaq530, complement it by showing a few sample experiments, and discuss the feedback we have received in the course of a teacher training workshop in which the participants received personal copies of Edaq530 and later made reports on how they could utilise Edaq530 in their teaching.

  9. Edaq530: a transparent, open-end and open-source measurement solution in natural science education

    Energy Technology Data Exchange (ETDEWEB)

    Kopasz, Katalin; Makra, Peter; Gingl, Zoltan, E-mail: phil@titan.physx.u-szeged.hu [Department of Experimental Physics, University of Szeged, Dom ter 9, Szeged, H6720 (Hungary)

    2011-03-15

    We present Edaq530, a low-cost, compact and easy-to-use digital measurement solution consisting of a thumb-sized USB-to-sensor interface and measurement software. The solution is fully open-source, our aim being to provide a viable alternative to professional solutions. Our main focus in designing Edaq530 has been versatility and transparency. In this paper, we shall introduce the capabilities of Edaq530, complement it by showing a few sample experiments, and discuss the feedback we have received in the course of a teacher training workshop in which the participants received personal copies of Edaq530 and later made reports on how they could utilize Edaq530 in their teaching.

  10. 76 FR 12711 - Smart Grid Advisory Committee

    Science.gov (United States)

    2011-03-08

    ... National Institute of Standards and Technology Smart Grid Advisory Committee AGENCY: National Institute of Standards and Technology, Department of Commerce. ACTION: Notice of open meeting. SUMMARY: The Smart Grid... will be posted on the Smart Grid Web site at http://www.nist.gov/smartgrid . DATES: The SGAC will hold...

  11. 76 FR 70412 - Smart Grid Advisory Committee

    Science.gov (United States)

    2011-11-14

    ... National Institute of Standards and Technology Smart Grid Advisory Committee AGENCY: National Institute of Standards and Technology, Department of Commerce. ACTION: Notice of open meeting. SUMMARY: The Smart Grid... agenda may change to accommodate Committee business. The final agenda will be posted on the Smart Grid...

  12. 76 FR 46279 - Smart Grid Advisory Committee

    Science.gov (United States)

    2011-08-02

    ... National Institute of Standards and Technology Smart Grid Advisory Committee AGENCY: Department of Commerce, National Institute of Standards and Technology ACTION: Notice of open meeting. SUMMARY: The Smart Grid... should be sent to Office of the National Coordinator for Smart Grid Interoperability, National Institute...

  13. 77 FR 38768 - Smart Grid Advisory Committee

    Science.gov (United States)

    2012-06-29

    ... National Institute of Standards and Technology Smart Grid Advisory Committee AGENCY: National Institute of Standards and Technology, Department of Commerce. ACTION: Notice of open meeting. SUMMARY: The Smart Grid... the Smart Grid Interoperability Panel transition plan, review the status of the research subcommittee...

  14. 75 FR 55306 - Smart Grid Advisory Committee

    Science.gov (United States)

    2010-09-10

    ... National Institute of Standards and Technology Smart Grid Advisory Committee AGENCY: National Institute of Standards and Technology, Department of Commerce. ACTION: Notice of open meeting. SUMMARY: The Smart Grid... provide an update on NIST's Smart Grid program. The agenda may change to accommodate Committee business...

  15. An overview of the use of Open Source in the NASA Langley Atmospheric Science Data Center Archive Next Generation system

    Science.gov (United States)

    Dye, R. A.; Perez, J.; Piatko, P. J.; Coogan, S. P.; Parker, L.

    2012-12-01

    The Atmospheric Science Data Center (ASDC) at NASA Langley Research Center is responsible for the archive and distribution of Earth science data in the areas of radiation budget, clouds, aerosols, and tropospheric chemistry. Over the past several years the ASDC has developed and implemented the Archive Next Generation (ANGe) system, a state-of-the-art data ingest, archival, and distribution system to serve the atmospheric sciences data provider and user communities. ANGe employs Open Source technologies including the JBoss Application Server, a PostGIS-enabled PostgreSQL database system to store geospatial metadata, modules from the GeoTools Open Source Java GIS Toolkit including the Java Topology Suite (JTS) and GeoAPI libraries, and other libraries such as the Spring framework. ANGe was developed using a suite of several Open Source tools comprised of Eclipse, Ant, Subversion and Jenkins. ANGe is also deployed into an operational environment that leverages Open Source technologies from the Linux Operating system to tools such as Ganglia for monitoring. This presentation provides an overview of ANGe with a focus on the Open Source technologies employed in the implementation and deployment of the system. The ASDC is part of Langley's Science Directorate. The Data Center was established in 1991 to support NASA's Earth Observing System and the U.S. Global Change Research Program. It is unique among NASA data centers in the size of its archive, cutting edge computing technology, and full range of data services. For more information regarding ASDC data holdings, documentation, tools and services, visit http://eosweb.larc.nasa.gov.
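
    As a rough illustration of the kind of geospatial metadata query that a PostGIS-enabled PostgreSQL archive such as ANGe can serve, the Python sketch below (using the psycopg2 driver) selects granules whose footprints intersect a bounding box. The connection parameters, the granules table, and its column names are hypothetical placeholders, not ANGe's actual schema.

        import psycopg2

        # Find archived granules whose footprint intersects a bounding box.
        # Table and column names are illustrative placeholders, not the ANGe schema.
        conn = psycopg2.connect(host="localhost", dbname="archive", user="reader")
        with conn, conn.cursor() as cur:
            cur.execute(
                """
                SELECT granule_id, start_time
                FROM granules
                WHERE ST_Intersects(footprint,
                                    ST_MakeEnvelope(%s, %s, %s, %s, 4326))
                ORDER BY start_time
                """,
                (-95.0, 25.0, -80.0, 35.0),  # lon/lat bounding box (xmin, ymin, xmax, ymax)
            )
            for granule_id, start_time in cur.fetchall():
                print(granule_id, start_time)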

  16. Hematology - Open TG-GATEs | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Hematology - Open TG-GATEs | LSDB Archive

  17. Body weight - Open TG-GATEs | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Body weight - Open TG-GATEs | LSDB Archive

  18. Pathological items - Open TG-GATEs | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Pathological items - Open TG-GATEs | LSDB Archive

  19. Individual list - Open TG-GATEs | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Individual list - Open TG-GATEs | LSDB Archive

  20. gCube Grid services

    CERN Document Server

    Andrade, Pedro

    2008-01-01

    gCube is a service-based framework for eScience applications requiring collaboratory, on-demand, and intensive information processing. It provides these communities with Virtual Research Environments (VREs) to support their activities. gCube is built on top of standard technologies for computational Grids, namely the gLite middleware. The software was produced by the DILIGENT project and will continue to be supported and further developed by the D4Science project. gCube reflects within its name a three-sided interpretation of the Grid vision of resource sharing: sharing of computational resources, sharing of structured data, and sharing of application services. As such, gCube embodies the defining characteristics of computational Grids, data Grids, and virtual data Grids. Precisely, it builds on gLite middleware for managing distributed computations and unstructured data, includes dedicated services for managing data and metadata, provides services for distributed information retrieval, allows the orchestration...

  1. Index of /data/open-tggates/20110225 [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Index of /data/open-tggates/20110225: Parent Directory; Human/ (18-Mar-2011 10:36); README.html (05-Mar-2013 14:39, 16K); Rat/ (18-Mar-2011 10:41); open_tggates_...attribu..> (16-Sep-2011 09:58, 171K); open_tggates_main.zip (18-Mar-2011 13:57, 7.4K)

  2. Download - Open TG-GATEs | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Download - Open TG-GATEs | LSDB Archive: data #4-10 provided in a single table; includes 14 CEL file attachments.

  3. Southampton uni's computer whizzes develop "mini" grid

    CERN Multimedia

    Sherriff, Lucy

    2006-01-01

    "In a bid to help its students explore the potential of grid computing, the University of Southampton's Computer Science department has developed what it calls a "lightweight grid". The system has been designed to allow students to experiment with grid technology without the complexity of inherent security concerns of the real thing. (1 page)

  4. Cell sample - Open TG-GATEs | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Cell sample - Open TG-GATEs | LSDB Archive. Data file: open_tggates_cell.zip; File URL: ftp://ftp.biosciencedbc.jp/archive/open-tggates...

  5. Biochemistry - Open TG-GATEs | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Biochemistry - Open TG-GATEs | LSDB Archive. Data file: ...y.zip; File URL: ftp://ftp.biosciencedbc.jp/archive/open-tggates/LATEST/open_tggat...

  6. The Martian Goes To College: Open Inquiry with Science Fiction in the Classroom.

    Science.gov (United States)

    Beatty, L.; Patterson, J. D.

    2015-12-01

    Storytelling is an ancient art; one that can get lost in the reams of data available in a typical geology or astronomy classroom. But storytelling draws us to a magical place. Our students, with prior experience in either a geology or astronomy course, were invited to explore Mars in a special topics course at Johnson County Community College through reading The Martian by Andy Weir. As they traveled with astronaut Mark Watney, the students used Google Mars, Java Mission-planning and Analysis for Remote Sensing (JMARS), and learning modules from the Mars for Earthlings web site to investigate the terrain and the processes at work in the past and present on Mars. Our goal was to apply their understanding of processes on Earth in order to explain and predict what they observed on Mars courtesy of the remote sensing opportunities available from Viking, Pathfinder, the Mars Exploration Rovers, and Maven missions; sort of an inter-planetary uniformitarianism. Astronaut Mark Watney's fictional journey from Acidalia Planitia to Schiaparelli Crater was analyzed using learning modules in Mars for Earthlings and exercises that we developed based on Google Mars, JMARS, Rotating Sky Explorer, and Science Friday podcasts. Each student also completed an individual project that either focused on a particular region that Astronaut Mark Watney traveled through or a problem that he faced. Through this open-inquiry learning style, they determined some processes that shaped Mars such as crater impacts, volcanism, fluid flow, mass movement, and groundwater sapping and also investigated the efficacy of solar energy as a power source based on location and the likelihood of regolith potential as a mineral matter source for soil.

  7. Tenure-Track Science Faculty and the 'Open Access Citation Effect'

    Directory of Open Access Journals (Sweden)

    R. Christopher Doty

    2013-02-01

    Full Text Available INTRODUCTION The observation that open access (OA) articles receive more citations than subscription-based articles is known as the OA citation effect (OACE). Implicit in many OACE studies is the belief that authors are heavily invested in the number of citations their articles receive. This study seeks to determine what influence the OACE has on the decision-making process of tenure-track science faculty when they consider where to submit a manuscript for publication. METHODS Fifteen tenure-track faculty members in the Departments of Biology and Chemistry at the University of North Carolina at Chapel Hill participated in semi-structured interviews employing a variation of the critical incident technique. RESULTS Seven of the fifteen faculty members said they would consider making a future article freely available based on the OACE. Due to dramatically different expectations with respect to the size of the OACE, however, only one of them is likely to seriously consider the OACE when deciding where to submit their next manuscript for publication. DISCUSSION Journal reputation and audience, and the quality of the editorial and review process are the most important factors in deciding where to submit a manuscript for publication. Once a subset of journals has satisfied these criteria, financial and access issues compete with the OACE in making a final decision. CONCLUSION In order to increase the number of OA materials, librarians should continue to emphasize depositing pre- and post-prints in disciplinary and institutional repositories and retaining the author rights prior to publication in order to make it possible to do so.

  8. ReSS: Resource Selection Service for National and Campus Grid Infrastructure

    Science.gov (United States)

    Mhashilkar, Parag; Garzoglio, Gabriele; Levshina, Tanya; Timm, Steve

    2010-04-01

    The Open Science Grid (OSG) offers access to around one hundred compute elements (CEs) and storage elements (SEs) via standard Grid interfaces. The Resource Selection Service (ReSS) is a push-based workload management system that is integrated with the OSG information systems and resources. ReSS integrates standard Grid tools such as Condor (as a brokering service) and the gLite CEMon (for gathering and publishing resource information in GLUE Schema format). ReSS is used in OSG by Virtual Organizations (VOs) such as Dark Energy Survey (DES), DZero and Engagement VO. ReSS is also used as a Resource Selection Service for Campus Grids, such as FermiGrid. VOs use ReSS to automate the resource selection in their workload management system to run jobs over the grid. In the past year, the system has been enhanced to enable publication and selection of storage resources and of any special software or software libraries (like MPI libraries) installed at computing resources. In this paper, we discuss the Resource Selection Service, its typical usage on the two scales of a National Cyber Infrastructure Grid, such as OSG, and of a campus Grid, such as FermiGrid.
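
    As a loose illustration of the resource selection that ReSS automates, the Python sketch below matches a job's requirements against advertised resource attributes and ranks the acceptable sites. The attribute names (free_cpus, memory_mb, supports_mpi) are invented placeholders; a real deployment would use Condor ClassAds and GLUE-schema attributes published by CEMon rather than plain dictionaries.

        # Illustrative matchmaking of a job's requirements against advertised resources.
        # Attribute names are hypothetical stand-ins for GLUE-schema attributes.
        resources = [
            {"site": "SiteA-CE", "free_cpus": 120, "memory_mb": 2048, "supports_mpi": True},
            {"site": "SiteB-CE", "free_cpus": 4, "memory_mb": 8192, "supports_mpi": False},
        ]
        job = {"min_cpus": 8, "min_memory_mb": 1024, "needs_mpi": True}

        def matches(job, res):
            """Return True if the resource satisfies every job requirement."""
            return (res["free_cpus"] >= job["min_cpus"]
                    and res["memory_mb"] >= job["min_memory_mb"]
                    and (not job["needs_mpi"] or res["supports_mpi"]))

        # Rank acceptable sites by free CPUs, mimicking a simple Condor Rank expression.
        candidates = sorted((r for r in resources if matches(job, r)),
                            key=lambda r: r["free_cpus"], reverse=True)
        print(candidates[0]["site"] if candidates else "no matching resource")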

  9. Interoperability of GADU in using heterogeneous grid resources for bioinformatics applications.

    Science.gov (United States)

    Sulakhe, Dinanath; Rodriguez, Alex; Wilde, Michael; Foster, Ian; Maltsev, Natalia

    2008-03-01

    Bioinformatics tools used for efficient and computationally intensive analysis of genetic sequences require large-scale computational resources to accommodate the growing data. Grid computational resources such as the Open Science Grid and TeraGrid have proved useful for scientific discovery. The genome analysis and database update system (GADU) is a high-throughput computational system developed to automate the steps involved in accessing the Grid resources for running bioinformatics applications. This paper describes the requirements for building an automated scalable system such as GADU that can run jobs on different Grids. The paper describes the resource-independent configuration of GADU using the Pegasus-based virtual data system that makes high-throughput computational tools interoperable on heterogeneous Grid resources. The paper also highlights the features implemented to make GADU a gateway to computationally intensive bioinformatics applications on the Grid. The paper will not go into the details of the problems involved or the lessons learned in using individual Grid resources as these have already been published in our paper on the genome analysis research environment (GNARE) and will focus primarily on the architecture that makes GADU resource independent and interoperable across heterogeneous Grid resources.
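
    The fan-out that high-throughput systems of this kind automate can be illustrated, very loosely, by chunking an input sequence set into independent per-chunk jobs, as in the Python sketch below. The file name and chunk size are arbitrary examples; the real GADU/Pegasus machinery adds resource selection, data staging, and provenance tracking on top of such a split.

        # Split a FASTA file into fixed-size chunks so that each chunk can be
        # analysed as an independent grid job. File name and chunk size are
        # arbitrary examples, not GADU defaults.

        def fasta_records(path):
            """Yield (header, sequence) pairs from a FASTA file."""
            header, seq = None, []
            with open(path) as handle:
                for line in handle:
                    line = line.rstrip()
                    if line.startswith(">"):
                        if header is not None:
                            yield header, "".join(seq)
                        header, seq = line, []
                    else:
                        seq.append(line)
                if header is not None:
                    yield header, "".join(seq)

        def write_chunks(path, records_per_chunk=500):
            """Write chunk_NNNN.fasta files and return the number of jobs prepared."""
            chunk, jobs = [], []
            for i, record in enumerate(fasta_records(path), start=1):
                chunk.append(record)
                if i % records_per_chunk == 0:
                    jobs.append(chunk)
                    chunk = []
            if chunk:
                jobs.append(chunk)
            for n, job in enumerate(jobs):
                with open(f"chunk_{n:04d}.fasta", "w") as out:
                    for header, seq in job:
                        out.write(f"{header}\n{seq}\n")
            return len(jobs)

        print(write_chunks("sequences.fasta"), "independent jobs prepared")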

  10. Accelerating target discovery using pre-competitive open science-patients need faster innovation more than anyone else.

    Science.gov (United States)

    Low, Eric; Bountra, Chas; Lee, Wen Hwa

    2016-01-01

    We are experiencing a new era enabled by unencumbered access to high quality data through the emergence of open science initiatives in the historically challenging area of early stage drug discovery. At the same time, many patient-centric organisations are taking matters into their own hands by participating in, enabling and funding research. Here we present the rationale behind the innovative partnership between the Structural Genomics Consortium (SGC)-an open, pre-competitive pre-clinical research consortium and the research-focused patient organisation Myeloma UK to create a new, comprehensive platform to accelerate the discovery and development of new treatments for multiple myeloma.

  11. Open NASA Earth Exchange (OpenNEX): Strategies for enabling cross organization collaboration in the earth sciences

    Science.gov (United States)

    Michaelis, A.; Ganguly, S.; Nemani, R. R.; Votava, P.; Wang, W.; Lee, T. J.; Dungan, J. L.

    2014-12-01

    Sharing community-valued codes, intermediary datasets and results from individual efforts with others who are not in a directly funded collaboration can be a challenge. Cross-organization collaboration is often impeded due to infrastructure security constraints, rigid financial controls, bureaucracy, and workforce nationalities, etc., which can force groups to work in a segmented fashion and/or through awkward and suboptimal web services. We show how a focused community may come together, share modeling and analysis codes, computing configurations, scientific results, knowledge and expertise on a public cloud platform; diverse groups of researchers working together at "arm's length". Through the OpenNEX experimental workshop, users can view short technical "how-to" videos and explore encapsulated working environments. Workshop participants can easily instantiate Amazon Machine Images (AMIs) or launch full cluster and data processing configurations within minutes. Enabling users to instantiate computing environments from configuration templates on large public cloud infrastructures, such as Amazon Web Services, may provide a mechanism for groups to easily use each other's work and collaborate indirectly. Moreover, using the public cloud for this workshop allowed a single group to host a large read-only data archive, making datasets of interest to the community widely available on the public cloud, enabling other groups to directly connect to the data and reduce the costs of the collaborative work by freeing other individual groups from redundantly retrieving, integrating or financing the storage of the datasets of interest.
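
    A minimal sketch, using the boto3 library, of how a shared machine-image template might be instantiated on Amazon Web Services in the spirit described above; the AMI ID, instance type, and key-pair name are placeholders rather than values published by OpenNEX, and valid AWS credentials are assumed.

        import boto3

        # Launch one instance from a (hypothetical) community AMI that packages a
        # preconfigured analysis environment. All identifiers are placeholders.
        ec2 = boto3.resource("ec2", region_name="us-west-2")
        instances = ec2.create_instances(
            ImageId="ami-0123456789abcdef0",   # placeholder AMI ID
            InstanceType="m5.xlarge",          # placeholder instance type
            KeyName="my-keypair",              # placeholder SSH key pair
            MinCount=1,
            MaxCount=1,
        )
        instances[0].wait_until_running()
        print("launched", instances[0].id)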

  12. 76 FR 4645 - Fusion Energy Sciences Advisory Committee; Notice of Open Meeting

    Science.gov (United States)

    2011-01-26

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Fusion... Science. SUMMARY: This notice announces a meeting of the Fusion Energy Sciences Advisory Committee. The..., Office of Fusion Energy Sciences; U.S. Department of Energy; 1000 Independence Avenue, SW.;...

  13. Grid Interoperation with ARC Middleware for the CMS Experiment

    CERN Document Server

    Edelmann, Erik; Frey, Jaime; Gronager, Michael; Happonen, Kalle; Johansson, Daniel; Kleist, Josva; Klem, Jukka; Koivumaki, Jesper; Linden, Tomas; Pirinen, Antti; Qing, Di

    2010-01-01

    The Compact Muon Solenoid (CMS) is one of the general purpose experiments at the CERN Large Hadron Collider (LHC). CMS computing relies on different grid infrastructures to provide computational and storage resources. The major grid middleware stacks used for CMS computing are gLite, Open Science Grid (OSG) and ARC (Advanced Resource Connector). Helsinki Institute of Physics (HIP) hosts one of the Tier-2 centers for CMS computing. CMS Tier-2 centers operate software systems for data transfers (PhEDEx), Monte Carlo production (ProdAgent) and data analysis (CRAB). In order to provide the Tier-2 services for CMS, HIP uses tools and components from both ARC and gLite grid middleware stacks. Interoperation between grid systems is a challenging problem and HIP uses two different solutions to provide the needed services. The first solution is based on gLite-ARC grid level interoperability. This allows ARC resources to be used in CMS without modifying the CMS application software. The second solution is based on developi...

  14. Grid Interoperation with ARC middleware for the CMS experiment

    Energy Technology Data Exchange (ETDEWEB)

    Edelmann, Erik; Groenager, Michael; Johansson, Daniel; Kleist, Josva [Nordic DataGrid Facility, Kastruplundgade 22, 1., DK-2770 Kastrup (Denmark); Field, Laurence; Qing, Di [CERN, CH-1211 Geneve 23 (Switzerland); Frey, Jaime [University of Wisconsin-Madison, 1210 W. Dayton St., Madison, WI (United States); Happonen, Kalle; Klem, Jukka; Koivumaeki, Jesper; Linden, Tomas; Pirinen, Antti, E-mail: Jukka.Klem@cern.c [Helsinki Institute of Physics, PO Box 64, FIN-00014 University of Helsinki (Finland)

    2010-04-01

    The Compact Muon Solenoid (CMS) is one of the general purpose experiments at the CERN Large Hadron Collider (LHC). CMS computing relies on different grid infrastructures to provide computational and storage resources. The major grid middleware stacks used for CMS computing are gLite, Open Science Grid (OSG) and ARC (Advanced Resource Connector). Helsinki Institute of Physics (HIP) hosts one of the Tier-2 centers for CMS computing. CMS Tier-2 centers operate software systems for data transfers (PhEDEx), Monte Carlo production (ProdAgent) and data analysis (CRAB). In order to provide the Tier-2 services for CMS, HIP uses tools and components from both ARC and gLite grid middleware stacks. Interoperation between grid systems is a challenging problem and HIP uses two different solutions to provide the needed services. The first solution is based on gLite-ARC grid level interoperability. This allows ARC resources to be used in CMS without modifying the CMS application software. The second solution is based on developing specific ARC plugins in CMS software.

  15. Food consumption - Open TG-GATEs | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Open TG-GATEs Food consumption. Data name: Food consumption. Description of data contents: the list of food consumption measurement results acquired from rats used in the in vivo tests. Data file: open_tggates_food_consumption.zip (108 KB), ftp://ftp.biosciencedbc.jp/archive/open-tggates/LATEST/open_tggates_food_consumption.zip. Simple search URL: http://togodb.biosciencedbc.jp/togodb/view/open_tggates_food_consumption#en. Data acquisition method: the amount of daily food intake of the first day is calculated as the amount of food taken during on

  16. Open Access in the Natural and Social Sciences: The Correspondence of Innovative Moves to Enhance Access, Inclusion and Impact in Scholarly Communication

    Science.gov (United States)

    Armbruster, Chris

    2008-01-01

    Online, open access is the superior model for scholarly communication. A variety of scientific communities in physics, the life sciences and economics have gone furthest in innovating their scholarly communication through open access, enhancing accessibility for scientists, students and the interested public. Open access enjoys a comparative…

  18. Grid oscillators

    Science.gov (United States)

    Popovic, Zorana B.; Kim, Moonil; Rutledge, David B.

    1988-01-01

    Loading a two-dimensional grid with active devices offers a means of combining the power of solid-state oscillators in the microwave and millimeter-wave range. The grid structure allows a large number of negative resistance devices to be combined. This approach is attractive because the active devices do not require an external locking signal, and the combining is done in free space. In addition, the loaded grid is a planar structure amenable to monolithic integration. Measurements on a 25-MESFET grid at 9.7 GHz show power-combining and frequency-locking without an external locking signal, with an ERP of 37 W. Experimental far-field patterns agree with theoretical results obtained using reciprocity.

  19. Data privacy for the smart grid

    CERN Document Server

    Herold, Rebecca

    2015-01-01

    The Smart Grid and Privacy; What Is the Smart Grid?; Changes from Traditional Energy Delivery; Smart Grid Possibilities; Business Model Transformations; Emerging Privacy Risks; The Need for Privacy Policies; Privacy Laws, Regulations, and Standards; Privacy-Enhancing Technologies; New Privacy Challenges; IoT; Big Data; What Is the Smart Grid? Market and Regulatory Overview; Traditional Electricity Business Sector; The Electricity Open Market; Classifications of Utilities; Rate-Making Processes; Electricity Consumer

  20. Effectiveness of instruction in rubric use in improving fourth-grade students' science open-response outcomes

    Science.gov (United States)

    Bohlin, Sandra L.

    This study focused on the role of rubric instruction in assisting students to answer open-response science questions. The purpose was to determine if rubric instruction could help students recognize levels of performance, thereby improving their open-response outcomes. Performance tasks and open-response questions regarding real-world problems are necessary to assess the skills of application of knowledge. Rubrics are appropriate for scoring open-response questions because they can assess how students solve problems, the accuracy of solutions, and also provide feedback to students about characteristics of different qualities of work. Rubrics have been used in studies involving assessment, but the effects of rubric use on student learning have not been directly investigated. The theoretical foundations and research related to the use of rubrics suggest that rubrics assist in helping students to recognize more or less adequate responses and thus provide a self-adjustment strategy to improve students' own performance. Previous research has shown that students are able to follow a model to learn strategies for performance, that cognitive strategies can be taught, and that self-regulation enhances academic learning. The effectiveness of six weeks of rubric instruction with practice and feedback was compared to practice only with no feedback, and with no treatment. Chi-square tests were used to compare high, medium, and low score categories from students' pre- and posttests. The first research question inquired as to the effects of rubric instruction on students' ability to identify various levels of response from science open-response answers. Students who received rubric instruction were more able to identify rubric levels on the posttest without the presence of the rubric because they were familiar with it from treatment while the other two groups were not. They did not improve their ability from pre- to posttest, however. The practice group's ability to identify response

  1. An OpenEarth Framework (OEF) for Integrating and Visualizing Earth Science Data

    Science.gov (United States)

    Moreland, J. L.; Nadeau, D. R.; Baru, C.; Crosby, C. J.

    2009-12-01

    The integration of data is essential to make transformative progress in understanding the complex processes operating at the Earth's surface and within its interior. While our current ability to collect massive amounts of data, develop structural models, and generate high-resolution dynamics models is well developed, our ability to quantitatively integrate these data and models into holistic interpretations of Earth systems is poorly developed. We lack the basic tools to realize a first-order goal in Earth science of developing integrated 4D models of Earth structure and processes using a complete range of available constraints, at a time when the research agenda of major efforts such as EarthScope demand such a capability. Among the challenges to 3D data integration are data that may be in different coordinate spaces, units, value ranges, file formats, and data structures. While several file format standards exist, they are infrequently or incorrectly used. Metadata is often missing, misleading, or relegated to README text files alongside the data. This leaves much of the work of integrating data bogged down in simple data management tasks. The OpenEarth Framework (OEF) being developed by GEON addresses these data management difficulties. The software incorporates file format parsers, data interpretation heuristics, user interfaces to prompt for missing information, and visualization techniques to merge data into a common visual model. The OEF's data access libraries parse formal and de facto standard file formats and map their data into a common data model. The software handles file format quirks, storage details, caching, local and remote file access, and web service protocol handling. Heuristics are used to determine coordinate spaces, units, and other key data features. Where multiple data structure, naming, and file organization conventions exist, those heuristics check for each convention's use to find a high confidence interpretation of the data. When

  2. Auscope: Australian Earth Science Information Infrastructure using Free and Open Source Software

    Science.gov (United States)

    Woodcock, R.; Cox, S. J.; Fraser, R.; Wyborn, L. A.

    2013-12-01

    Since 2005 the Australian Government has supported a series of initiatives providing researchers with access to major research facilities and information networks necessary for world-class research. Starting with the National Collaborative Research Infrastructure Strategy (NCRIS) the Australian earth science community established an integrated national geoscience infrastructure system called AuScope. AuScope is now in operation, providing a number of components to assist in understanding the structure and evolution of the Australian continent. These include the acquisition of subsurface imaging , earth composition and age analysis, a virtual drill core library, geological process simulation, and a high resolution geospatial reference framework. To draw together information from across the earth science community in academia, industry and government, AuScope includes a nationally distributed information infrastructure. Free and Open Source Software (FOSS) has been a significant enabler in building the AuScope community and providing a range of interoperable services for accessing data and scientific software. A number of FOSS components have been created, adopted or upgraded to create a coherent, OGC compliant Spatial Information Services Stack (SISS). SISS is now deployed at all Australian Geological Surveys, many Universities and the CSIRO. Comprising a set of OGC catalogue and data services, and augmented with new vocabulary and identifier services, the SISS provides a comprehensive package for organisations to contribute their data to the AuScope network. This packaging and a variety of software testing and documentation activities enabled greater trust and notably reduced barriers to adoption. FOSS selection was important, not only for technical capability and robustness, but also for appropriate licensing and community models to ensure sustainability of the infrastructure in the long term. Government agencies were sensitive to these issues and Au

  3. Coalescent: an open-science framework for importance sampling in coalescent theory.

    Science.gov (United States)

    Tewari, Susanta; Spouge, John L

    2015-01-01

    Background. In coalescent theory, computer programs often use importance sampling to calculate likelihoods and other statistical quantities. An importance sampling scheme can exploit human intuition to improve statistical efficiency of computations, but unfortunately, in the absence of general computer frameworks on importance sampling, researchers often struggle to translate new sampling schemes computationally or benchmark against different schemes, in a manner that is reliable and maintainable. Moreover, most studies use computer programs lacking a convenient user interface or the flexibility to meet the current demands of open science. In particular, current computer frameworks can only evaluate the efficiency of a single importance sampling scheme or compare the efficiencies of different schemes in an ad hoc manner. Results. We have designed a general framework (http://coalescent.sourceforge.net; language: Java; License: GPLv3) for importance sampling that computes likelihoods under the standard neutral coalescent model of a single, well-mixed population of constant size over time following infinite sites model of mutation. The framework models the necessary core concepts, comes integrated with several data sets of varying size, implements the standard competing proposals, and integrates tightly with our previous framework for calculating exact probabilities. For a given dataset, it computes the likelihood and provides the maximum likelihood estimate of the mutation parameter. Well-known benchmarks in the coalescent literature validate the accuracy of the framework. The framework provides an intuitive user interface with minimal clutter. For performance, the framework switches automatically to modern multicore hardware, if available. It runs on three major platforms (Windows, Mac and Linux). Extensive tests and coverage make the framework reliable and maintainable. Conclusions. In coalescent theory, many studies of computational efficiency consider only
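
    The importance-sampling idea the framework builds on can be sketched generically: the expectation of a quantity under a target density p is estimated as the average of weighted contributions f(x) p(x)/q(x) over samples x drawn from a proposal q. The toy Python example below uses simple exponential densities purely for illustration; in coalescent software the samples would be genealogies and the weights would be ratios of the coalescent prior times the mutation likelihood over the proposal.

        import math
        import random

        # Toy importance sampling: estimate E_p[f(X)] with samples from a proposal q.
        # p is a rate-1 exponential and q a rate-0.5 exponential, chosen only for
        # illustration; the exact answer for f(x) = x is 1.0.

        def p_density(x):
            return math.exp(-x)                # target density

        def q_density(x, rate=0.5):
            return rate * math.exp(-rate * x)  # proposal density

        def importance_estimate(f, n=100_000):
            total = 0.0
            for _ in range(n):
                x = random.expovariate(0.5)                  # draw from the proposal q
                total += f(x) * p_density(x) / q_density(x)  # weighted contribution
            return total / n

        print(importance_estimate(lambda x: x))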

  4. Grid and Entrepreneurship Workshop

    CERN Multimedia

    2006-01-01

    The CERN openlab is organising a special workshop about Grid opportunities for entrepreneurship. This one-day event will provide an overview of what is involved in spin-off technology, with a special reference to the context of computing and data Grids. Lectures by experienced entrepreneurs will introduce the key concepts of entrepreneurship and review, in particular, the industrial potential of EGEE (the EU co-funded Enabling Grids for E-sciencE project, led by CERN). Case studies will be given by CEOs of European start-ups already active in the Grid and computing cluster area, and regional experts will provide an overview of efforts in several European regions to stimulate entrepreneurship. This workshop is designed to encourage students and researchers involved or interested in Grid technology to consider the entrepreneurial opportunities that this technology may create in the coming years. This workshop is organized as part of the CERN openlab student programme, which is co-sponsored by CERN, HP, ...

  5. Grid attacks avian flu

    CERN Multimedia

    2006-01-01

    During April, a collaboration of Asian and European laboratories analysed 300,000 possible drug components against the avian flu virus H5N1 using the EGEE Grid infrastructure. [Figure captions: schematic presentation of the avian flu virus; the distribution of the EGEE sites in the world on which the avian flu scan was performed.] The goal was to find potential compounds that can inhibit the activities of an enzyme on the surface of the influenza virus, the so-called neuraminidase, subtype N1. Using the Grid to identify the most promising leads for biological tests could speed up the development process for drugs against the influenza virus. Co-ordinated by CERN and funded by the European Commission, the EGEE project (Enabling Grids for E-sciencE) aims to set up a worldwide grid infrastructure for science. The challenge of the in silico drug discovery application is to identify those molecules which can dock on the active sites of the virus in order to inhibit its action. To study the impact of small scale mutations on drug r...

  6. Organ weight - Open TG-GATEs | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Organ weight - Open TG-GATEs | LSDB Archive. File name: open_tggates_organ_weight.zip; File URL: ftp://ftp.biosciencedbc.jp/archive/...

  7. CEL file attributes - Open TG-GATEs | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available CEL file attributes - Open TG-GATEs | LSDB Archive

  8. Shining examples of grid applications

    CERN Multimedia

    Hammerle, Hannelore

    2006-01-01

    Users in more than 150 virtual organisations from fields as diverse as biomedicine, earth sciences and high-energy physics are now using the distributed computing infrastructure of the Enabling Grids for E-sciencE (EGEE) project, which shows the wide adoption and versatility of this new technology (1 page)

  9. The Grid2003 Production Grid Principles and Practice

    CERN Document Server

    Foster, I; Gose, S; Maltsev, N; May, E; Rodríguez, A; Sulakhe, D; Vaniachine, A; Shank, J; Youssef, S; Adams, D; Baker, R; Deng, W; Smith, J; Yu, D; Legrand, I; Singh, S; Steenberg, C; Xia, Y; Afaq, A; Berman, E; Annis, J; Bauerdick, L A T; Ernst, M; Fisk, I; Giacchetti, L; Graham, G; Heavey, A; Kaiser, J; Kuropatkin, N; Pordes, R; Sekhri, V; Weigand, J; Wu, Y; Baker, K; Sorrillo, L; Huth, J; Allen, M; Grundhoefer, L; Hicks, J; Luehring, F C; Peck, S; Quick, R; Simms, S; Fekete, G; Van den Berg, J; Cho, K; Kwon, K; Son, D; Park, H; Canon, S; Jackson, K; Konerding, D E; Lee, J; Olson, D; Sakrejda, I; Tierney, B; Green, M; Miller, R; Letts, J; Martin, T; Bury, D; Dumitrescu, C; Engh, D; Gardner, R; Mambelli, M; Smirnov, Y; Voeckler, J; Wilde, M; Zhao, Y; Zhao, X; Avery, P; Cavanaugh, R J; Kim, B; Prescott, C; Rodríguez, J; Zahn, A; McKee, S; Jordan, C; Prewett, J; Thomas, T; Severini, H; Clifford, B; Deelman, E; Flon, L; Kesselman, C; Mehta, G; Olomu, N; Vahi, K; De, K; McGuigan, P; Sosebee, M; Bradley, D; Couvares, P; De Smet, A; Kireyev, C; Paulson, E; Roy, A; Koranda, S; Moe, B; Brown, B; Sheldon, P

    2004-01-01

    The Grid2003 Project has deployed a multi-virtual organization, application-driven grid laboratory ("Grid3") that has sustained for several months the production-level services required by physics experiments of the Large Hadron Collider at CERN (ATLAS and CMS), the Sloan Digital Sky Survey project, the gravitational wave search experiment LIGO, the BTeV experiment at Fermilab, as well as applications in molecular structure analysis and genome analysis, and computer science research projects in such areas as job and data scheduling. The deployed infrastructure has been operating since November 2003 with 27 sites, a peak of 2800 processors, workloads from 10 different applications exceeding 1300 simultaneous jobs, and data transfers among sites of greater than 2 TB/day. We describe the principles that have guided the development of this unique infrastructure and the practical experiences that have resulted from its creation and use. We discuss application requirements for grid services deployment and configur...

  10. 15 MW HArdware-in-the-loop Grid Simulation Project

    Energy Technology Data Exchange (ETDEWEB)

    Rigas, Nikolaos [Clemson Univ., SC (United States); Fox, John Curtiss [Clemson Univ., SC (United States); Collins, Randy [Clemson Univ., SC (United States); Tuten, James [Clemson Univ., SC (United States); Salem, Thomas [Clemson Univ., SC (United States); McKinney, Mark [Clemson Univ., SC (United States); Hadidi, Ramtin [Clemson Univ., SC (United States); Gislason, Benjamin [Clemson Univ., SC (United States); Boessneck, Eric [Clemson Univ., SC (United States); Leonard, Jesse [Clemson Univ., SC (United States)

    2014-10-31

    The 15MW Hardware-in-the-loop (HIL) Grid Simulator project was to (1) design, (2) construct and (3) commission a state-of-the-art grid integration testing facility for testing of multi-megawatt devices through a ‘shared facility’ model open to all innovators to promote the rapid introduction of new technology in the energy market to lower the cost of energy delivered. The 15 MW HIL Grid Simulator project now serves as the cornerstone of the Duke Energy Electric Grid Research, Innovation and Development (eGRID) Center. This project leveraged the 24 kV utility interconnection and electrical infrastructure of the US DOE EERE funded WTDTF project at the Clemson University Restoration Institute in North Charleston, SC. Additionally, the project has spurred interest from other technology sectors, including large PV inverter and energy storage testing and several leading edge research proposals dealing with smart grid technologies, grid modernization and grid cyber security. The key components of the project are the power amplifier units capable of providing up to 20MW of defined power to the research grid. The project has also developed a one of a kind solution to performing fault ride-through testing by combining a reactive divider network and a large power converter into a hybrid method. This unique hybrid method of performing fault ride-through analysis will allow for the research team at the eGRID Center to investigate the complex differences between the alternative methods of performing fault ride-through evaluations and will ultimately further the science behind this testing. With the final goal of being able to perform HIL experiments and demonstration projects, the eGRID team undertook a significant challenge with respect to developing a control system that is capable of communicating with several different pieces of equipment with different communication protocols in real-time. The eGRID team developed a custom fiber optical network that is based upon FPGA

  11. The effects of blogs versus dialogue journals on open-response writing scores and attitudes of grade eight science students

    Science.gov (United States)

    Erickson, Diane K.

    Today's students have grown up surrounded by technology. They use cell phones, word processors, and the Internet with ease, talking with peers in their community and around the world through e-mails, chatrooms, instant messaging, online discussions, and weblogs ("blogs"). In the midst of this technological explosion, adolescents face a growing need for strong literacy skills in all subject areas for achievement in school and on mandated state and national high stakes tests. The purpose of this study was to examine the use of blogs as a tool for improving open-response writing in the secondary science classroom in comparison to the use of handwritten dialogue journals. The study used a mixed-method approach, gathering both quantitative and qualitative data from 94 students in four eighth-grade science classes. Two classes participated in online class blogs where they posted ideas about science and responded to the ideas of other classmates. Two classes participated in handwritten dialogue journals, writing ideas about science and exchanging journals to respond to the ideas of classmates. The study explored these research questions: Does the use of blogs, as compared to the use of handwritten dialogue journals, improve the open-response writing scores of eighth grade science students? How do students describe their experience using blogs to study science as compared to students using handwritten dialogue journals? and How do motivation, self-efficacy, and community manifest themselves in students who use blogs as compared to students who use handwritten dialogue journals? The quantitative aspect of the study used data from pre- and post-tests and from a Likert-scale post-survey. The pre- and post-writing on open-response science questions were scored using the Massachusetts Comprehensive Assessment System (MCAS) open-response scoring rubric. The study found no statistically significant difference in the writing scores between the blog group and the dialogue journal

  12. Solar Fridges and Personal Power Grids: How Berkeley Lab is Fighting Global Poverty (LBNL Science at the Theater)

    Energy Technology Data Exchange (ETDEWEB)

    Buluswar, Shashi [Director, LBNL Institute for Globally Transformative Technologies; Gadgil, Ashok

    2012-11-26

    At this November 26, 2012 Science at the Theater, scientists discussed the recently launched LBNL Institute for Globally Transformative Technologies (LIGTT) at Berkeley Lab. LIGTT is an ambitious mandate to discover and develop breakthrough technologies for combating global poverty. It was created with the belief that solutions will require more advanced R&D and a deep understanding of market needs in the developing world. Berkeley Lab's Ashok Gadgil, Shashi Buluswar and seven other LIGTT scientists discussed what it takes to develop technologies that will impact millions of people. These include: 1) Fuel efficient stoves for clean cooking: Our scientists are improving the Berkeley Darfur Stove, a high efficiency stove used by over 20,000 households in Darfur; 2) The ultra-low energy refrigerator: A lightweight, low-energy refrigerator that can be mounted on a bike so crops can survive the trip from the farm to the market; 3) The solar OB suitcase: A low-cost package of the five most critical biomedical devices for maternal and neonatal clinics; 4) UV Waterworks: A device for quickly, safely and inexpensively disinfecting water of harmful microorganisms.

  13. Smart grid security innovative solutions for a modernized grid

    CERN Document Server

    Skopik, Florian

    2015-01-01

    The Smart Grid security ecosystem is complex and multi-disciplinary, and relatively under-researched compared to the traditional information and network security disciplines. While the Smart Grid has provided increased efficiencies in monitoring power usage, directing power supplies to serve peak power needs and improving efficiency of power delivery, the Smart Grid has also opened the way for information security breaches and other types of security breaches. Potential threats range from meter manipulation to directed, high-impact attacks on critical infrastructure that could bring down regi

  14. Grid reliability

    Science.gov (United States)

    Saiz, P.; Andreeva, J.; Cirstoiu, C.; Gaidioz, B.; Herrala, J.; Maguire, E. J.; Maier, G.; Rocha, R.

    2008-07-01

    Thanks to the Grid, users have access to computing resources distributed all over the world. The Grid hides the complexity and the differences of its heterogeneous components. In such a distributed system, it is clearly very important that errors are detected as soon as possible, and that the procedure to solve them is well established. We focused on two of its main elements: the workload and the data management systems. We developed an application to investigate the efficiency of the different centres. Furthermore, our system can be used to categorize the most common error messages, and control their time evolution.
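
    A toy sketch of the error-message categorisation described above, grouping failed-job messages by pattern and counting them; the log lines and patterns are invented for illustration and do not come from the application discussed in this record.

        import re
        from collections import Counter

        # Invented sample messages standing in for real workload-management logs.
        log_lines = [
            "Job 101 failed: stage-in error: could not copy input file to worker node",
            "Job 102 failed: SRM_FAILURE: destination file exists",
            "Job 107 failed: stage-in error: could not copy input file to worker node",
            "Job 110 failed: proxy expired",
        ]

        # Map a human-readable category to a pattern that recognises it.
        categories = {
            "stage-in failure": re.compile(r"stage-in error"),
            "storage (SRM) failure": re.compile(r"SRM_FAILURE"),
            "expired proxy": re.compile(r"proxy expired"),
        }

        counts = Counter()
        for line in log_lines:
            label = next((name for name, rx in categories.items() if rx.search(line)),
                         "uncategorised")
            counts[label] += 1

        for label, n in counts.most_common():
            print(f"{n:3d}  {label}")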

  15. Un'occasione per capire cos'è il Grid

    CERN Multimedia

    2006-01-01

    After the WWW, the giant computing infrastructure known as the Grid was also created at CERN in Geneva; to help people understand the Grid, EGEE (Enabling Grids for E-sciencE), financed by the European Commission and CERN, organized a fourth Grid Industry Day event in Catania.

  16. Design of web platform for science and engineering in the model of open market

    Science.gov (United States)

    Demichev, A. P.; Kryukov, A. P.

    2016-09-01

    This paper presents the design and operating algorithms of a web platform for convenient, secure, and effective remote interaction, on open-market principles, between users and providers of scientific application software and databases.

  17. Download - Open TG-GATEs Pathological Image Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Download - Open TG-GATEs Pathological Image Database: open_tggates_pathological_image.zip (370 KB); digital pathological images in SVS format, available for download via FTP.

  18. Update History of This Database - Open TG-GATEs | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Update History of This Database - Open TG-GATEs | LSDB Archive. Date / update contents: 2012/07/04 A part of the hematology and b...s

  19. Anatomy of BioJS, an open source community for the life sciences.

    Science.gov (United States)

    Yachdav, Guy; Goldberg, Tatyana; Wilzbach, Sebastian; Dao, David; Shih, Iris; Choudhary, Saket; Crouch, Steve; Franz, Max; García, Alexander; García, Leyla J; Grüning, Björn A; Inupakutika, Devasena; Sillitoe, Ian; Thanki, Anil S; Vieira, Bruno; Villaveces, José M; Schneider, Maria V; Lewis, Suzanna; Pettifer, Steve; Rost, Burkhard; Corpas, Manuel

    2015-07-08

    BioJS is an open source software project that develops visualization tools for different types of biological data. Here we report on the factors that influenced the growth of the BioJS user and developer community, and outline our strategy for building on this growth. The lessons we have learned on BioJS may also be relevant to other open source software projects.

  20. 77 FR 15996 - Science Advisory Board (SAB); Notice of Open Meeting

    Science.gov (United States)

    2012-03-19

    ... programs are of the highest quality and provide optimal support to resource management. Time and Date: The... Development Portfolio Review Task Force; (2) External Review of the Cooperative Institute for Climate Science...

  1. Database Description - Open TG-GATEs | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available National Institute of Biomedical Innovation, National Institute of Health Sciences, and 15 pharmaceutical companies. Contact address: Toxicogenomics Informatics Project, National Institute of Biomedical Innovation. The Toxicogenomics Project (TGP) is a government-private collaborative project started by the National Institute of Biomedical Innovation, the National Institute of Health Sciences, and the participating pharmaceutical companies. Database maintenance site: Toxicogenomics Informatics Project, National Institute of Biomedical Innovation.

  2. Open Science and Open Access

    OpenAIRE

    Bernal, Isabel; Oficina Técnica de Digital.CSIC

    2017-01-01

    Presentation at the Kick-off meeting of H2020-funded Metabolic Dysfunctions associated with Pharmacological Treatment of Schizophrenia project (Treatment) held in Madrid at the CSIC Institute de Investigaciones Biomédicas Alberto Sols on June 15-16, 2017.

  3. Ecosystem Based Business Model of Smart Grid

    DEFF Research Database (Denmark)

    Lundgaard, Morten Raahauge; Ma, Zheng; Jørgensen, Bo Nørregaard

    2015-01-01

    This paper investigates the ecosystem-based business model in a smart grid infrastructure and the potential for value capture in a highly complex macro-infrastructure such as the smart grid. This paper proposes an alternative perspective for studying the smart grid business ecosystem to support...... the infrastructural challenges, such as the interoperability of business components for the smart grid. So far little research has explored the business ecosystem in the smart grid concept. Studying the smart grid with the theory of business ecosystems may open opportunities to understand market catalysts. This study...... contributes an understanding of business ecosystems applicable to the smart grid. Smart grid infrastructure is an intricate business ecosystem, which has several intentions regarding how to deliver the value proposition and what it should be. The findings help to identify and capture value from markets....

  4. Climate Solutions Presentations on Science On a Sphere (SOS) and SOS Explorer achieve acceptance of Climate Science among Policymakers as well as the Public: US National Academy of Sciences Symposium/Open House Example

    Science.gov (United States)

    Sievering, H.

    2015-12-01

    The outcomes of climate science are inherently rife with discussions of dire consequences for humans that leave many listeners feeling helpless and hopeless. We have found that a focus on clean energy solutions, without reference to dirty energy, substantially reduces (may even eliminate) the negativity associated with sea level rise, extreme weather and other climate change presentations. US audiences respond well to discussion of California's clean energy transformation with solar, wind, geothermal and water power together now approaching 25% of total energy supply for the world's sixth largest economy. For both policymakers and the general public, a "positive climate change" presentation does not generally suffice on its own. Clear visual display of climate science information is essential. We have found the National Oceanic and Atmospheric Administration's Science On a Sphere (SOS) science education tool to be exceptional in this regard. Further, broad dissemination is possible given the SOS network consists of over 120 sites in 23 countries. The new SOS Explorer system, an advanced science education tool, can readily utilize the over 500 available SOS data sets. We have recently developed an arctic amplification and mid-latitude climate change impacts program for the upcoming US National Academy of Sciences' Arctic Matters Symposium/Open House. This SOS and SOS Explorer education program will be described with emphasis on the climate solutions incorporated into this module targeted at US policymakers and the invited open-house public.

  5. OVERGRID: A Unified Overset Grid Generation Graphical Interface

    Science.gov (United States)

    Chan, William M.; Akien, Edwin W. (Technical Monitor)

    1999-01-01

    This paper presents a unified graphical interface and gridding strategy for performing overset grid generation. The interface called OVERGRID has been specifically designed to follow an efficient overset gridding strategy, and contains general grid manipulation capabilities as well as modules that are specifically suited for overset grids. General grid utilities include functions for grid redistribution, smoothing, concatenation, extraction, extrapolation, projection, and many others. Modules specially tailored for overset grids include a seam curve extractor, hyperbolic and algebraic surface grid generators, a hyperbolic volume grid generator, and a Cartesian box grid generator. Grid visualization is achieved using OpenGL, while widgets are constructed with Tcl/Tk. The software is portable between various platforms from UNIX workstations to personal computers.

  6. Management Mode of the Opening of the Food Science and Engineering Laboratory (食品科学与工程实验室开放的管理模式)

    Institute of Scientific and Technical Information of China (English)

    丁培峰

    2011-01-01

    Based on the necessity of opening the food science and engineering laboratory, and through practice, a set of management modes suitable for an open food science and engineering laboratory was summarized.

  7. Flexible and Inflexible Energy Engagements – a Study of the Danish Smart Grid Strategy

    DEFF Research Database (Denmark)

    Schick, Lea; Gad, Christopher

    2015-01-01

    According to many visions for smart grids, consumers will come to play a more ‘active’ role in the energy systems of tomorrow. In this paper, we examine how the future ‘flexible electricity consumer’ is imagined in the Danish National Smart Grid Strategy. Our analysis of reports produced...... by the national Smart Grid Network shows that this vision relies on a techno-centric and rather ‘inflexible’ consumer figuration. However, rather than adopting a conventional social science approach in order to criticize this narrow imaginary, we show that potentials for critique and alternatives can be found...... internally in the Smart Grid Network. Paying attention to different stories, we thus aim to characterize particular forms of ‘infra-critique’ and ‘infra-reflexivity’ emerging from within the field. This mode of reflexivity, we argue, opens up to more flexible and reflexive conceptions of the ‘flexible...

  8. A Middleware road towards Web (Grid) Services

    CERN Document Server

    Ahmed, Zeeshan

    2010-01-01

    Middleware technologies is a very broad field, comprising a substantial body of completed research as well as ongoing research that both confirms earlier results and seeks new solutions by theoretical and experimental (practical) means. This document has been produced by Zeeshan Ahmed (Student: Connectivity Software Technologies, Blekinge Institute of Technology). It describes the research already done in the field of middleware technologies, including Web Services, Grid Computing, Grid Services and the Open Grid Service Infrastructure & Architecture. The document concludes with an overview of Web (Grid) Services, chains of Web (Grid) Services and the associated security issues.

  9. Open-inquiry driven overcoming of epistemological difficulties in engineering undergraduates: A case study in the context of thermal science

    Directory of Open Access Journals (Sweden)

    Nicola Pizzolato

    2014-02-01

    Full Text Available This paper addresses the efficacy of an open-inquiry approach that allows students to build on traditionally received knowledge. A sample of thirty engineering undergraduates, having already attended traditional university physics instruction, was selected for this study. The students were involved in a six-week long learning experience of open-inquiry research activities within the highly motivating context of developing a thermodynamically efficient space base on Mars. They designed and carried out their own scientific investigations, which involved gathering information, collecting and analyzing data, providing explanations, and sharing results. A questionnaire containing fifteen open-ended real-world problems in thermal science was administered to the students both prior to and after all activities, with the aim of investigating the nature of their difficulties in problem solving. Students’ answers were classified into three epistemological profiles and a pre-/post-instruction comparison was carried out using methods of statistical implicative analysis. The students obtained significant benefits from their open-inquiry experiences, in terms of the strengthening of their practical and reasoning abilities, by proficiently applying the learned concepts to face and solve real-world problem situations.

  10. Open Educational Resources from Performance Task using Video Analysis and Modeling - Tracker and K12 science education framework

    CERN Document Server

    Wee, Loo Kang

    2014-01-01

    This invited paper discusses why the Physics performance task taken by grade 9 students in Singapore is worth participating in, for two reasons: 1) the video analysis and modeling resources are open access, licensed under Creative Commons Attribution, to advance open educational resources in the world, and 2) it allows students to work like physicists, adopting the K12 science education framework. Personal reflections are offered on how physics education can be made more meaningful, in particular through Practice 1: Ask Questions, Practice 2: Use Models, and Practice 5: Mathematical and Computational Thinking, using video modeling supported by evidence-based data from video analysis. This paper hopes to spur fellow colleagues to look into open education initiatives such as our Singapore Tracker community open educational resources curated on http://weelookang.blogspot.sg/p/physics-applets-virtual-lab.html as well as digital libraries at http://iwant2study.org/lookangejss/, directly accessible through Tracker 4.86, the EJSS reader app on Android and iOS, and EJS 5....

  11. Kids Enjoy Grids

    CERN Multimedia

    2007-01-01

    I want to come back and work here when I'm older,' was the spontaneous reaction of one of the children invited to CERN by the Enabling Grids for E-sciencE project for a 'Grids for Kids' day at the end of January. The EGEE project is led by CERN, and the EGEE gender action team organized the day to introduce children to grid technology at an early age. The school group included both boys and girls, aged 9 to 11. All of the presenters were women. 'In general, before this visit, the children thought that scientists always wore white coats and were usually male, with wild Einstein-like hair,' said Jackie Beaver, the class's teacher at the Institut International de Lancy, a school near Geneva. 'They were surprised and pleased to see that women became scientists, and that scientists were quite 'normal'.' The half-day event included presentations about why Grids are needed, a visit of the computer centre, some online games, and plenty of time for questions. In the end, everyone agreed that it was a big success a...

  12. The Rise of Open Access in the Creative, Educational and Science Commons

    Science.gov (United States)

    Kiel-Chisholm, Scott; Fitzgerald, Brian

    2006-01-01

    Management of intellectual property and, in particular, copyright is one of the most challenging issues in an increasingly digital world. The rise of the open access (OA) movement provides a new model for managing intellectual property in educational and research environments. OA aims to promote greater and more efficient access to educational and…

  13. Teaching Botanical Identification to Adults: Experiences of the UK Participatory Science Project "Open Air Laboratories"

    Science.gov (United States)

    Stagg, Bethan C.; Donkin, Maria

    2013-01-01

    Taxonomic education and botany are increasingly neglected in schools and universities, leading to a "missed generation" of adults that cannot identify organisms, especially plants. This study pilots three methods for teaching identification of native plant species to forty-three adults engaged in the participatory science project…

  14. Opening the Big Black Box: European study reveals visitors' impressions of science laboratories

    CERN Multimedia

    2004-01-01

    "On 29 - 30 March the findings of 'Inside the Big Black Box'- a Europe-wide science and society project - will be revealed during a two-day seminar hosted by CERN*. The principle aim of Inside the Big Black Box (IN3B) is to determine whether a working scientific laboratory can capture the curiosity of the general public through visits" (1 page)

  15. HIRENASD NLR grid

    Data.gov (United States)

    National Aeronautics and Space Administration — Structured multiblock grid of HIRENASD wing with medium grid density, about 10 million grid points, 9.5 million cells. Starting from coarse AIAA AEPW structured grid,...

  16. Surveying the citizen science landscape: an exploration of the design, delivery and impact of citizen science through the lens of the Open Air Laboratories (OPAL) programme.

    Science.gov (United States)

    Davies, Linda; Fradera, Roger; Riesch, Hauke; Lakeman-Fraser, Poppy

    2016-07-22

    This paper provides a short introduction to the topic of citizen science (CS) identifying the shift from the knowledge deficit model to more inclusive, participatory science. It acknowledges the benefits of new technology and the opportunities it brings for mass participation and data manipulation. It focuses on the increase in interest in CS in recent years and draws on experience gained from the Open Air Laboratories (OPAL) programme launched in England in 2007. The drivers and objectives for OPAL are presented together with background information on the partnership, methods and scales. The approaches used by researchers ranged from direct public participation in mass data collection through field surveys to research with minimal public engagement. The supporting services focused on education, particularly to support participants new to science, a media strategy and data services. Examples from OPAL are used to illustrate the different approaches to the design and delivery of CS that have emerged over recent years and the breadth of opportunities for public participation the current landscape provides. Qualitative and quantitative data from OPAL are used as evidence of the impact of CS. While OPAL was conceived ahead of the more recent formalisation of approaches to the design, delivery and analysis of CS projects and their impact, it nevertheless provides a range of examples against which to assess the various benefits and challenges emerging in this fast developing field.

  17. Grid reliability

    CERN Document Server

    Saiz, P; Rocha, R; Andreeva, J

    2007-01-01

    We are offering a system to track the efficiency of different components of the GRID. We can study the performance of both the WMS and the data transfers. At the moment, we have set up different parts of the system for ALICE, ATLAS, CMS and LHCb. None of the components that we have developed are VO-specific; therefore, it would be very easy to deploy them for any other VO. Our main goal is basically to improve the reliability of the GRID. The main idea is to discover as soon as possible the different problems that have happened, and inform those responsible. Since we study the jobs and transfers issued by real users, we see the same problems that users see. As a matter of fact, we see even more problems than the end user does, since we are also interested in following up the errors that GRID components can overcome by themselves (like for instance, in case of a job failure, resubmitting the job to a different site). This kind of information is very useful to site and VO administrators. They can find out the efficien...

  18. ASCR Science Network Requirements

    Energy Technology Data Exchange (ETDEWEB)

    Dart, Eli; Tierney, Brian

    2009-08-24

    The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the US Department of Energy Office of Science, the single largest supporter of basic research in the physical sciences in the United States. In support of the Office of Science programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 20 years. In April 2009 ESnet and the Office of Advanced Scientific Computing Research (ASCR), of the DOE Office of Science, organized a workshop to characterize the networking requirements of the programs funded by ASCR. The ASCR facilities anticipate significant increases in wide area bandwidth utilization, driven largely by the increased capabilities of computational resources and the wide scope of collaboration that is a hallmark of modern science. Many scientists move data sets between facilities for analysis, and in some cases (for example the Earth System Grid and the Open Science Grid), data distribution is an essential component of the use of ASCR facilities by scientists. Due to the projected growth in wide area data transfer needs, the ASCR supercomputer centers all expect to deploy and use 100 Gigabit per second networking technology for wide area connectivity as soon as that deployment is financially feasible. In addition to the network connectivity that ESnet provides, the ESnet Collaboration Services (ECS) are critical to several science communities. ESnet identity and trust services, such as the DOEGrids certificate authority, are widely used both by the supercomputer centers and by collaborations such as Open Science Grid (OSG) and the Earth System Grid (ESG). Ease of use is a key determinant of the scientific utility of network-based services. Therefore, a key enabling aspect for scientists beneficial use of high

  19. Plenario: An Open Data Discovery and Exploration Platform for Urban Science

    Energy Technology Data Exchange (ETDEWEB)

    Catlett, Charlie; Malik, Tanu; Goldstein, Brett J.; Giuffrida, Jonathan; Shao, Yetong; Panella, Alessandro; Eder, Derek; van Zanten, Eric; Mitchum, Robert; Thaler, Severin; Foster, Ian

    2014-12-01

    The past decade has seen the widespread release of open data concerning city services, conditions, and activities by government bodies and public institutions of all sizes. Hundreds of open data portals now host thousands of datasets of many different types. These new data sources represent enormous potential for improved understanding of urban dynamics and processes—and, ultimately, for more livable, efficient, and prosperous communities. However, those who seek to realize this potential quickly discover that discovering and applying those data relevant to any particular question can be extraordinarily difficult, due to decentralized storage, heterogeneous formats, and poor documentation. In this context, we introduce Plenario, a platform designed to automate time-consuming tasks associated with the discovery, exploration, and application of open city data—and, in so doing, reduce barriers to data use for researchers, policymakers, service providers, journalists, and members of the general public. Key innovations include a geospatial data warehouse that allows data from many sources to be registered into a common spatial and temporal frame; simple and intuitive interfaces that permit rapid discovery and exploration of data subsets pertaining to a particular area and time, regardless of type and source; easy export of such data subsets for further analysis; a user-configurable data ingest framework for automated importing and periodic updating of new datasets into the data warehouse; cloud hosting for elastic scaling and rapid creation of new Plenario instances; and an open source implementation to enable community contributions. We describe here the architecture and implementation of the Plenario platform, discuss lessons learned from its use by several communities, and outline plans for future work.
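
    As a rough illustration of the kind of programmatic access such a platform exposes, the Python sketch below queries a hypothetical Plenario-style REST endpoint for records bounded in space and time. The base URL, endpoint path, parameter names, and dataset name are assumptions made for illustration, not the documented Plenario API.

      import requests

      # Hypothetical Plenario-style query: one dataset, bounded in space and time.
      BASE = "https://example-plenario-instance.org/v1/api"  # assumed host and path
      geom = '{"type": "Polygon", "coordinates": [[[-87.7, 41.8], [-87.6, 41.8], [-87.6, 41.9], [-87.7, 41.8]]]}'
      params = {
          "dataset_name": "crimes_2001_to_present",  # assumed dataset identifier
          "obs_date__ge": "2014-01-01",              # assumed filter parameter names
          "obs_date__le": "2014-12-31",
          "location_geom__within": geom,
      }

      resp = requests.get(f"{BASE}/detail/", params=params, timeout=30)
      resp.raise_for_status()
      records = resp.json().get("objects", [])       # assumed response layout
      print(f"retrieved {len(records)} records")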

  20. SemMat: Federated Semantic Services Platform for Open materials Science and Engineering

    Science.gov (United States)

    2017-01-01

    An open source framework was developed to enable crowd-sourcing and curation of controlled vocabularies; the framework was then applied to... Annotation tools have been developed, and any interested organization can use the software package to develop their vocabulary or build upon the current system. Performing organization: Wright State University, 3640 Colonel Glenn Hwy, Dayton OH 45435-0001.

  1. Opening up to Big Data: Computer-Assisted Analysis of Textual Data in Social Sciences

    OpenAIRE

    Wiedemann, Gregor

    2013-01-01

    Two developments in computational text analysis may change the way qualitative data analysis in social sciences is performed: 1. the availability of digital text worth investigating is growing rapidly, and 2. the improvement of algorithmic information extraction approaches, also called text mining, allows for further bridging the gap between qualitative and quantitative text analysis. The key factor hereby is the inclusion of context into computational linguistic models which extends convent...

  2. Sustainable Data Evolution Technology for Power Grid Optimization

    Energy Technology Data Exchange (ETDEWEB)

    2017-10-09

    The SDET Tool is used to create open-access power grid data sets and facilitate updates of these data sets by the community. Pacific Northwest National Laboratory (PNNL) and its power industry and software vendor partners are developing an innovative sustainable data evolution technology (SDET) to create open-access power grid datasets and facilitate updates to these datasets by the power grid community. The objective is to make this a sustained effort within and beyond the ARPA-E GRID DATA program so that the datasets can evolve over time and meet the current and future needs for power grid optimization and potentially other applications in power grid operation and planning.

  3. Big data, open science and the brain: lessons learned from genomics

    Directory of Open Access Journals (Sweden)

    Suparna eChoudhury

    2014-05-01

    Full Text Available The BRAIN Initiative aims to break new ground in the scale and speed of data collection in neuroscience, requiring tools to handle data in the magnitude of yottabytes (10^24). The scale, investment and organization of it are being compared to the Human Genome Project (HGP), which has exemplified ‘big science’ for biology. In line with the trend towards Big Data in genomic research, the promise of the BRAIN Initiative, as well as the European Human Brain Project, rests on the possibility to amass vast quantities of data to model the complex interactions between the brain and behaviour and inform the diagnosis and prevention of neurological disorders and psychiatric disease. Advocates of this ‘data driven’ paradigm in neuroscience argue that harnessing the large quantities of data generated across laboratories worldwide has numerous methodological, ethical and economic advantages, but it requires the neuroscience community to adopt a culture of data sharing and open access to benefit from them. In this article, we examine the rationale for data sharing among advocates and briefly exemplify these in terms of new ‘open neuroscience’ projects. Then, drawing on the frequently invoked model of data sharing in genomics, we go on to demonstrate the complexities of data sharing, shedding light on the sociological and ethical challenges within the realms of institutions, researchers and participants, namely dilemmas around public/private interests in data, (lack of) motivation to share in the academic community, and potential loss of participant anonymity. Our paper serves to highlight some foreseeable tensions around data sharing relevant to the emergent ‘open neuroscience’ movement.

  4. Publishing Life Science Data as Linked Open Data: the Case Study of miRBase

    CERN Document Server

    Dalamagas, Theodore; Papastefanatos, George; Stavrakas, Yannis; Hatzigeorgiou, Artemis G

    2012-01-01

    This paper presents our Linked Open Data (LOD) infrastructures for genomic and experimental data related to microRNA biomolecules. Legacy data from two well-known microRNA databases with experimental data and observations, as well as change and version information about microRNA entities, are fused and exported as LOD. Our LOD server assists biologists to explore biological entities and their evolution, and provides a SPARQL endpoint for applications and services to query historical miRNA data and track changes, their causes and effects.
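
    Since the abstract mentions a SPARQL endpoint for querying historical miRNA data, a minimal Python sketch of such a query is given below using the SPARQLWrapper library. The endpoint URL and the graph vocabulary assumed in the query are illustrative only; they are not taken from the paper.

      from SPARQLWrapper import SPARQLWrapper, JSON

      # Hypothetical endpoint URL; the project's actual SPARQL endpoint is not reproduced here.
      endpoint = SPARQLWrapper("https://example.org/mirna/sparql")
      endpoint.setReturnFormat(JSON)

      # Generic query: fetch a few triples about resources whose label mentions a given miRNA.
      endpoint.setQuery("""
          SELECT ?s ?p ?o WHERE {
            ?s ?p ?o .
            ?s <http://www.w3.org/2000/01/rdf-schema#label> ?label .
            FILTER regex(str(?label), "hsa-mir-21", "i")
          } LIMIT 10
      """)

      for row in endpoint.query().convert()["results"]["bindings"]:
          print(row["s"]["value"], row["p"]["value"], row["o"]["value"])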

  5. The EDRN knowledge environment: an open source, scalable informatics platform for biological sciences research

    Science.gov (United States)

    Crichton, Daniel; Mahabal, Ashish; Anton, Kristen; Cinquini, Luca; Colbert, Maureen; Djorgovski, S. George; Kincaid, Heather; Kelly, Sean; Liu, David

    2017-05-01

    We describe here the Early Detection Research Network (EDRN) for Cancer's knowledge environment. It is an open source platform built by NASA's Jet Propulsion Laboratory with contributions from the California Institute of Technology and the Geisel School of Medicine at Dartmouth. It uses tools like Apache OODT, Plone, and Solr, and borrows heavily from JPL's Planetary Data System's ontological infrastructure. It has accumulated data on hundreds of thousands of biospecimens and serves over 1300 registered users across the National Cancer Institute (NCI). The scalable computing infrastructure is built such that we are able to reach out to other agencies, provide homogeneous access, and provide seamless analytics support and bioinformatics tools through community engagement.
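
    Since the platform is described as using Solr for search, the short Python sketch below shows how a Solr-backed index of this kind can be queried over HTTP. The host, core name, and field names are hypothetical; only the select handler and its standard parameters follow Solr conventions.

      import requests

      # Hypothetical Solr host and core; EDRN's actual deployment details are not given in the abstract.
      SOLR_SELECT = "https://example-edrn-portal.org/solr/biomarkers/select"

      params = {
          "q": 'organ:"Lung" AND phase:2',  # assumed field names, for illustration only
          "rows": 20,
          "wt": "json",                     # standard Solr parameter: return JSON
      }

      resp = requests.get(SOLR_SELECT, params=params, timeout=30)
      resp.raise_for_status()
      docs = resp.json()["response"]["docs"]
      print(f"{len(docs)} matching records")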

  6. An open annotation ontology for science on web 3.0.

    Science.gov (United States)

    Ciccarese, Paolo; Ocana, Marco; Garcia Castro, Leyla Jael; Das, Sudeshna; Clark, Tim

    2011-05-17

    There is currently a gap between the rich and expressive collection of published biomedical ontologies, and the natural language expression of biomedical papers consumed on a daily basis by scientific researchers. The purpose of this paper is to provide an open, shareable structure for dynamic integration of biomedical domain ontologies with the scientific document, in the form of an Annotation Ontology (AO), thus closing this gap and enabling application of formal biomedical ontologies directly to the literature as it emerges. Initial requirements for AO were elicited by analysis of integration needs between biomedical web communities, and of needs for representing and integrating results of biomedical text mining. Analysis of strengths and weaknesses of previous efforts in this area was also performed. A series of increasingly refined annotation tools was then developed along with a metadata model in OWL, and deployed to users at a major pharmaceutical company and a major academic center to gather feedback and additional requirements for the ontology. Further requirements and critiques of the model were also elicited through discussions with many colleagues and incorporated into the work. This paper presents Annotation Ontology (AO), an open ontology in OWL-DL for annotating scientific documents on the web. AO supports both human and algorithmic content annotation. It enables "stand-off" or independent metadata anchored to specific positions in a web document by any one of several methods. In AO, the document may be annotated but is not required to be under update control of the annotator. AO contains a provenance model to support versioning, and a set model for specifying groups and containers of annotation. AO is freely available under open source license at http://purl.org/ao/, and extensive documentation including screencasts is available on AO's Google Code page: http://code.google.com/p/annotation-ontology/ . The Annotation Ontology meets critical requirements for
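
    To make the idea of "stand-off" annotation more concrete, the Python/rdflib sketch below builds a tiny annotation graph in the spirit of AO. The AO namespace URI comes from the abstract, but the specific class and property names used here (Annotation, annotatesResource, body, context) are assumptions for illustration and should be checked against the ontology documentation.

      from rdflib import Graph, Literal, Namespace, URIRef
      from rdflib.namespace import RDF

      AO = Namespace("http://purl.org/ao/")  # namespace from the abstract; term names below are assumed

      g = Graph()
      ann = URIRef("http://example.org/annotations/1")      # hypothetical annotation URI
      doc = URIRef("http://example.org/papers/pmid-12345")  # hypothetical annotated document

      g.add((ann, RDF.type, AO.Annotation))
      g.add((ann, AO.annotatesResource, doc))               # assumed linking property
      g.add((ann, AO.body, Literal("Mentions BRCA1 in the context of DNA repair")))
      g.add((ann, AO.context, Literal("character offsets 1042-1103")))  # stand-off anchor, schematically

      print(g.serialize(format="turtle"))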

  7. Gene expression data (CEL files) - Open TG-GATEs | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Data name: Gene expression data (CEL files). Description: ...Tab Separated Value (TSV) format is included. CEL is one of the file formats that expresses gene expression data (raw data) generated from Affymetrix GeneChip®. Data files: gene expression data from rat samples (12.0GB total) and gene expression data from human samples (File URL: ftp://ftp.biosciencedbc.jp/arc).

  8. Open Access, Intellectual Property, and How Biotechnology Becomes a New Software Science

    CERN Document Server

    Murtagh, Fionn

    2009-01-01

    Innovation is slowing greatly in the pharmaceutical sector. It is considered here how part of the problem is due to overly limiting intellectual property relations in the sector. On the other hand, computing and software in particular are characterized by great richness of intellectual property frameworks. Could the intellectual property ecosystem of computing come to the aid of the biosciences and life sciences? We look at how the answer might well be yes, by looking at (i) the extent to which a drug mirrors a software program, and (ii) what is to be gleaned from trends in research publishing in the life and biosciences.

  9. An open data repository and a data processing software toolset of an equivalent Nordic grid model matched to historical electricity market data

    Directory of Open Access Journals (Sweden)

    Luigi Vanfretti

    2017-04-01

    This Nordic 44 equivalent model was also used in the iTesla project (iTesla [3]) to carry out simulations within a dynamic security assessment toolset (iTesla, 2016 [4]), and has been further enhanced during the ITEA3 OpenCPS project (iTEA3 [5]). The raw, processed data and output models utilized within the iTesla platform (iTesla, 2016 [4]) are also available in the repository. The CIM and Modelica snapshots of the “Nordic 44” model for the year 2015 are available in a Zenodo repository.

  10. Establishment of the Slovenian Universities' Repositories and of the National Open Science Portal

    Directory of Open Access Journals (Sweden)

    Milan Ojsteršek

    2014-12-01

    Full Text Available The paper presents the legal, organisational and technical perspectives regarding the implementation of the Slovenian national open access infrastructure for electronic theses and dissertations as well as for research publications. The infrastructure consists of four institutional repositories and a national portal that aggregates content from the university repositories and other Slovenian archives in order to provide a common search engine, recommendation of similar publications, and similar text detection. We have developed the software which is integrated with the universities' information and authentication systems and with COBISS.SI. During the project the necessary legal background was defined and processes for mandatory submission of electronic theses and dissertations as well as of research publications were designed. The processes for data exchange between the institutional repositories and the national portal, and the processes for similar text detection and recommendation system were established. Bilingual web and mobile applications, a recommendation system and the interface suitable for persons with disabilities are provided to the users from around the world. The repositories are an effective promotion tool for universities and their researchers. It is expected that they will improve the recognition of Slovenian universities in the world. The complex national open access infrastructure with similar text detection support and integration with other systems will enable the storage of almost eighty percent of peer-reviewed scientific papers, annually published by Slovenian researchers. The majority of electronic theses and dissertations yearly produced at the Slovenian higher education institutions will also be accessible.

  11. Bridging the Gap Between Earth Science Open Data Producers and Consumers Using a Standards based approach

    Science.gov (United States)

    Stephan, E.; Sivaraman, C.

    2016-12-01

    The Web brought together science communities creating collaborative opportunities that were previously unimaginable. This was due to the novel ways technology enabled users to share information that would otherwise not be available. This means that data and software that previously could not be discovered without direct contact with data or software creators can now be downloaded with the click of a mouse button, and the same products can now outlive the lifespan of their research projects. While in many ways these technological advancements provide benefit to collaborating scientists, a critical producer-consumer knowledge gap is created when collaborating scientists rely solely on web sites, web browsers, or similar technology to exchange services, software, and data. Without some best practices and common approaches from Web publishers, collaborating scientific consumers have no inherent way to trust the results or other products being shared, producers have no way to convey their scientific credibility, and publishers risk obscurity where data is hidden in the deep Web. By leveraging recommendations from the W3C Data Activity, scientific communities can adopt best practices for data publication enabling consumers to explore, reuse, reproduce, and contribute their knowledge about the data. This talk will discuss the application of W3C Data on the Web Best Practices in support of published earth science data and feature the Data Usage Vocabulary.
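
    A small, hedged example of the kind of machine-readable description such best practices encourage is sketched below in Python: a DCAT-style JSON-LD record for a published dataset. The dataset, URLs, and values are hypothetical; only the DCAT and Dublin Core term names are standard vocabulary.

      import json

      # Minimal DCAT-style JSON-LD dataset description; all values are illustrative.
      dataset = {
          "@context": {"dcat": "http://www.w3.org/ns/dcat#", "dct": "http://purl.org/dc/terms/"},
          "@type": "dcat:Dataset",
          "dct:title": "Hourly surface meteorology, example site",
          "dct:description": "Quality-controlled hourly observations, 2010-2015.",
          "dct:license": "https://creativecommons.org/licenses/by/4.0/",
          "dcat:distribution": [{
              "@type": "dcat:Distribution",
              "dcat:downloadURL": "https://example.org/data/met_hourly.csv",
              "dcat:mediaType": "text/csv",
          }],
      }

      print(json.dumps(dataset, indent=2))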

  12. CILogon-HA. Higher Assurance Federated Identities for DOE Science

    Energy Technology Data Exchange (ETDEWEB)

    Basney, James [Univ. of Illinois, Urbana-Champaign, IL (United States)

    2015-08-01

    The CILogon-HA project extended the existing open source CILogon service (initially developed with funding from the National Science Foundation) to provide credentials at multiple levels of assurance to users of DOE facilities for collaborative science. CILogon translates mechanism and policy across higher education and grid trust federations, bridging from the InCommon identity federation (which federates university and DOE lab identities) to the Interoperable Global Trust Federation (which defines standards across the Worldwide LHC Computing Grid, the Open Science Grid, and other cyberinfrastructure). The CILogon-HA project expanded the CILogon service to support over 160 identity providers (including 6 DOE facilities) and 3 internationally accredited certification authorities. To provide continuity of operations upon the end of the CILogon-HA project period, project staff transitioned the CILogon service to operation by XSEDE.

  13. Roadmap for the ARC Grid Middleware

    DEFF Research Database (Denmark)

    Kleist, Josva; Eerola, Paula; Ekelöf, Tord

    2006-01-01

    The Advanced Resource Connector (ARC) or the NorduGrid middleware is an open source software solution enabling production quality computational and data Grids, with special emphasis on scalability, stability, reliability and performance. Since its first release in May 2002, the middleware is depl...

  14. An Open Hardware seismic data recorder - a solid basis for citizen science

    Science.gov (United States)

    Mertl, Stefan

    2015-04-01

    "Ruwai" is a 24-Bit Open Hardware seismic data recorder. It is built up of four stackable printed circuit boards fitting the Arduino Mega 2560 microcontroller prototyping platform. An interface to the BeagleBone Black single-board computer enables extensive data storage, -processing and networking capabilities. The four printed circuit boards provide a uBlox Lea-6T GPS module and real-time clock (GPS Timing shield), an Texas Instruments ADS1274 24-Bit analog to digital converter (ADC main shield), an analog input section with a Texas Instruments PGA281 programmable gain amplifier and an analog anti-aliasing filter (ADC analog interface pga) and the power conditioning based on 9-36V DC input (power supply shield). The Arduino Mega 2560 is used for controlling the hardware components, timestamping sampled data using the GPS timing information and transmitting the data to the BeagleBone Black single-board computer. The BeagleBone Black provides local data storage, wireless mesh networking using the optimized link state routing daemon and differential GNSS positioning using the RTKLIB software. The complete hardware and software is published under free software - or open hardware licenses and only free software (e.g. KiCad) was used for the development to facilitate the reusability of the design and increases the sustainability of the project. "Ruwai" was developed within the framework of the "Community Environmental Observation Network (CEON)" (http://www.mertl-research.at/ceon/) which was supported by the Internet Foundation Austria (IPA) within the NetIdee 2013 call.

  15. Opening up to Big Data: Computer-Assisted Analysis of Textual Data in Social Sciences

    Directory of Open Access Journals (Sweden)

    Gregor Wiedemann

    2013-05-01

    Full Text Available Two developments in computational text analysis may change the way qualitative data analysis in social sciences is performed: 1. the availability of digital text worth investigating is growing rapidly, and 2. the improvement of algorithmic information extraction approaches, also called text mining, allows for further bridging the gap between qualitative and quantitative text analysis. The key factor hereby is the inclusion of context into computational linguistic models which extends conventional computational content analysis towards the extraction of meaning. To clarify methodological differences of various computer-assisted text analysis approaches, the article suggests a typology from the perspective of a qualitative researcher. This typology shows compatibilities between manual qualitative data analysis methods and computational, rather quantitative, approaches for large scale mixed method text analysis designs. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1302231
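
    As a generic illustration of the algorithmic information extraction the article discusses (not the article's own method), the Python sketch below runs a tiny topic-model analysis with scikit-learn over a toy corpus.

      from sklearn.decomposition import LatentDirichletAllocation
      from sklearn.feature_extraction.text import CountVectorizer

      # Toy corpus standing in for a large digital text collection.
      docs = [
          "parliament debated the new energy policy and carbon taxes",
          "the football team won the championship after a dramatic final",
          "renewable energy subsidies were discussed in the policy committee",
          "fans celebrated the championship victory across the city",
      ]

      vectorizer = CountVectorizer(stop_words="english")
      X = vectorizer.fit_transform(docs)

      lda = LatentDirichletAllocation(n_components=2, random_state=0)
      lda.fit(X)

      terms = vectorizer.get_feature_names_out()
      for k, weights in enumerate(lda.components_):
          top = [terms[i] for i in weights.argsort()[-5:][::-1]]
          print(f"topic {k}: {', '.join(top)}")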

  16. Computer Sciences Applied to Management at Open University of Catalonia: Development of Competences of Teamworks

    Science.gov (United States)

    Pisa, Carlos Cabañero; López, Enric Serradell

    Teamwork is considered one of the most important professional skills in today's business environment. More specifically, the collaborative work between professionals and information technology managers from various functional areas is a strategic key in competitive business. Several university-level programs are focusing on developing these skills. This article presents the case of the course Computer Science Applied to Management (hereafter CSAM), which has been designed with the objective of developing the ability to work cooperatively in interdisciplinary teams. Its design and development address the key elements of efficiency that appear in the literature, most notably the establishment of shared objectives and a feedback system, the management of team harmony, and the team's level of autonomy, independence, diversity and level of supervision. The final result is a course in which, through a working virtual platform, interdisciplinary teams solve a problem raised by a case study.

  17. Clinical research data sharing: what an open science world means for researchers involved in evidence synthesis.

    Science.gov (United States)

    Ross, Joseph S

    2016-09-20

    The International Committee of Medical Journal Editors (ICMJE) recently announced a bold step forward to require data generated by interventional clinical trials that are published in its member journals to be responsibly shared with external investigators. The movement toward a clinical research culture that supports data sharing has important implications for the design, conduct, and reporting of systematic reviews and meta-analyses. While data sharing is likely to enhance the science of evidence synthesis, facilitating the identification and inclusion of all relevant research, it will also pose key challenges, such as requiring broader search strategies and more thorough scrutiny of identified research. Furthermore, the adoption of data sharing initiatives by the clinical research community should challenge the community of researchers involved in evidence synthesis to follow suit, including the widespread adoption of systematic review registration, results reporting, and data sharing, to promote transparency and enhance the integrity of the research process.

  18. Grid Access Methods and Applications

    NARCIS (Netherlands)

    Wiekens, B.

    2005-01-01

    In experimental sciences the need for IT facilities that can solve ever larger and more complex problems grows. Among these problems are large simulations and calculating physical models. Grid Computing offers techniques that allow computer resources to be shared among various organizations. Organiz

  19. HashDist: Reproducible, Relocatable, Customizable, Cross-Platform Software Stacks for Open Hydrological Science

    Science.gov (United States)

    Ahmadia, A. J.; Kees, C. E.

    2014-12-01

    Developing scientific software is a continuous balance between not reinventing the wheel and getting fragile codes to interoperate with one another. Binary software distributions such as Anaconda provide a robust starting point for many scientific software packages, but this solution alone is insufficient for many scientific software developers. HashDist provides a critical component of the development workflow, enabling highly customizable, source-driven, and reproducible builds for scientific software stacks, available from both the IPython Notebook and the command line. To address these issues, the Coastal and Hydraulics Laboratory at the US Army Engineer Research and Development Center has funded the development of HashDist in collaboration with Simula Research Laboratories and the University of Texas at Austin. HashDist is motivated by a functional approach to package build management, and features intelligent caching of sources and builds, parametrized build specifications, and the ability to interoperate with system compilers and packages. HashDist enables the easy specification of "software stacks", which allow both the novice user to install a default environment and the advanced user to configure every aspect of their build in a modular fashion. As an advanced feature, HashDist builds can be made relocatable, allowing the easy redistribution of binaries on all three major operating systems as well as cloud, and supercomputing platforms. As a final benefit, all HashDist builds are reproducible, with a build hash specifying exactly how each component of the software stack was installed. This talk discusses the role of HashDist in the hydrological sciences, including its use by the Coastal and Hydraulics Laboratory in the development and deployment of the Proteus Toolkit as well as the Rapid Operational Access and Maneuver Support project. We demonstrate HashDist in action, and show how it can effectively support development, deployment, teaching, and
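
    The Python sketch below illustrates the general idea behind a "build hash" (a deterministic identifier derived from the full build specification, so identical specifications map to the same cached build). It is a conceptual illustration only, not HashDist's actual hashing scheme or specification format.

      import hashlib
      import json

      def build_hash(spec: dict) -> str:
          # Canonicalize the specification, then hash it; identical specs give identical hashes.
          canonical = json.dumps(spec, sort_keys=True, separators=(",", ":"))
          return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

      spec = {
          "name": "hdf5",
          "version": "1.8.13",
          "sources": ["https://example.org/hdf5-1.8.13.tar.gz"],  # hypothetical source URL
          "build": {"with-mpi": True, "compiler": "gcc-4.9"},
          "dependencies": ["zlib/4f2a9b01c6de3d1a"],               # dependency pinned by its own hash
      }

      print(build_hash(spec))  # same spec -> same hash -> same (cacheable) build artifact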

  20. Amplifying the Impact of Open Access: Wikipedia and the Diffusion of Science

    CERN Document Server

    Teplitskiy, Misha; Duede, Eamon

    2015-01-01

    With the rise of Wikipedia as a first-stop source for scientific knowledge, it is important to compare its representation of that knowledge to that of the academic literature. This article approaches such a comparison through academic references made within the world's 50 largest Wikipedias. Previous studies have raised concerns that Wikipedia editors may simply use the most easily accessible academic sources rather than sources of the highest academic status. We test this claim by identifying the 250 most heavily used journals in each of 26 research fields (4,721 journals, 19.4M articles in total) indexed by the Scopus database, and modeling whether topic, academic status, and accessibility make articles from these journals more or less likely to be referenced on Wikipedia. We find that, controlling for field and impact factor, the odds that an open access journal is referenced on the English Wikipedia are 47% higher compared to closed access journals. Moreover, in most of the world's Wikipedias a journal's hig...
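
    For readers unused to odds-based effect sizes, the small Python sketch below shows what "47% higher odds" means in terms of the underlying logistic-regression coefficient and an example baseline probability; the baseline value is purely illustrative.

      import math

      # Reported effect: open access journals have ~47% higher odds of being referenced,
      # i.e. an odds ratio of about 1.47; the logit coefficient is its natural logarithm.
      odds_ratio = 1.47
      beta = math.log(odds_ratio)
      print(f"logit coefficient ~= {beta:.3f}")

      # The change in probability depends on the baseline. If a comparable closed-access
      # article had a 10% chance of being referenced (illustrative value only):
      p_closed = 0.10
      odds_open = (p_closed / (1 - p_closed)) * odds_ratio
      p_open = odds_open / (1 + odds_open)
      print(f"implied open-access probability ~= {p_open:.3f}")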

  1. Intelligent geospatial data retrieval based on the geospatial grid portal

    Science.gov (United States)

    Yuan, Jie; Yue, Peng; Gong, Jianya

    2008-12-01

    The Open Geospatial Consortium (OGC) standard-compliant services define a set of standard interfaces for geospatial Web services to achieve interoperability in an open distributed computing environment. Grid technology is a distributed computing infrastructure to allow distributed resources sharing and coordinated problem solving. Based on the OGC standards for geospatial services and grid technology, we propose the geospatial grid portal to integrate and interoperate grid-enabled geospatial services. The implementation of the geospatial grid portal is based on a three-tier architecture which consists of a grid-enabled geospatial services tier, a grid service portal tier and an application tier. The OGC standard-compliant services are deployed in a grid environment, the so-called grid-enabled geospatial services. Grid service portals for each type of geospatial services, including WFS, WMS, WCS and CSW, provide a single point of Web entry to discover and access different types of geospatial information. A resource optimization mechanism is incorporated into these service portals to optimize the selection of grid nodes. At the top tier, i.e. the application tier, the client interacts with a semantic middleware for the grid CSW portal, thus allowing semantics-enabled search. The proposed approach can not only optimize the grid resource selection among multiple grid nodes, but also incorporate the power of Semantic Web technology into the geospatial grid portal to allow the precise discovery of geospatial data.
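
    To make the OGC service interfaces mentioned above concrete, the Python sketch below uses the OWSLib library to read a WFS capabilities document and fetch features from one layer. The service URL and layer name are hypothetical stand-ins for a grid-enabled geospatial service behind such a portal.

      from owslib.wfs import WebFeatureService

      # Hypothetical WFS endpoint; not one of the paper's actual grid-enabled services.
      wfs = WebFeatureService(url="https://example.org/geoserver/wfs", version="1.1.0")

      # List the feature types advertised in the capabilities document.
      for name, meta in wfs.contents.items():
          print(name, meta.boundingBoxWGS84)

      # Request features from one (assumed) layer inside a bounding box; the response is GML.
      response = wfs.getfeature(typename=["demo:landuse"], bbox=(114.0, 30.0, 115.0, 31.0))
      with open("landuse.gml", "wb") as f:
          f.write(response.read())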

  2. Comparative analysis of existing models for power-grid synchronization

    Science.gov (United States)

    Nishikawa, Takashi; Motter, Adilson E.

    2015-01-01

    The dynamics of power-grid networks is becoming an increasingly active area of research within the physics and network science communities. The results from such studies are typically insightful and illustrative, but are often based on simplifying assumptions that can be either difficult to assess or not fully justified for realistic applications. Here we perform a comprehensive comparative analysis of three leading models recently used to study synchronization dynamics in power-grid networks—a fundamental problem of practical significance given that frequency synchronization of all power generators in the same interconnection is a necessary condition for a power grid to operate. We show that each of these models can be derived from first principles within a common framework based on the classical model of a generator, thereby clarifying all assumptions involved. This framework allows us to view power grids as complex networks of coupled second-order phase oscillators with both forcing and damping terms. Using simple illustrative examples, test systems, and real power-grid datasets, we study the inherent frequencies of the oscillators as well as their coupling structure, comparing across the different models. We demonstrate, in particular, that if the network structure is not homogeneous, generators with identical parameters need to be modeled as non-identical oscillators in general. We also discuss an approach to estimate the required (dynamical) system parameters that are unavailable in typical power-grid datasets, their use for computing the constants of each of the three models, and an open-source MATLAB toolbox that we provide for these computations.
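
    For orientation, the classical generator model referred to above leads to equations of motion of second-order phase-oscillator form; a representative (not model-specific) way to write them in LaTeX is

      \frac{2H_i}{\omega_R}\,\ddot{\delta}_i + \frac{D_i}{\omega_R}\,\dot{\delta}_i
        = A_i - \sum_{j=1}^{n} K_{ij}\,\sin\!\left(\delta_i - \delta_j - \gamma_{ij}\right),

    where \delta_i is the phase (rotor angle) of oscillator i, H_i and D_i are its inertia and damping constants, \omega_R is the reference angular frequency, A_i is the net driving (forcing) power, and K_{ij}, \gamma_{ij} encode the strength and phase shift of the network coupling. Exact symbols and per-unit conventions differ among the three compared models, so this should be read only as a sketch of the common structure.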

  3. Earth-Base: A Free And Open Source, RESTful Earth Sciences Platform

    Science.gov (United States)

    Kishor, P.; Heim, N. A.; Peters, S. E.; McClennen, M.

    2012-12-01

    This presentation describes the motivation, concept, and architecture behind Earth-Base, a web-based, RESTful data-management, analysis and visualization platform for earth sciences data. Traditionally, web applications have been built directly accessing data from a database using a scripting language. While such applications are great at bringing results to a wide audience, they are limited in scope to the imagination and capabilities of the application developer. Earth-Base decouples the data store from the web application by introducing an intermediate "data application" tier. The data application's job is to query the data store using self-documented, RESTful URIs, and send the results back formatted as JavaScript Object Notation (JSON). Decoupling the data store from the application allows virtually limitless flexibility in developing applications, both web-based for human consumption or programmatic for machine consumption. It also allows outside developers to use the data in their own applications, potentially creating applications that the original data creator and app developer may not have even thought of. Standardized specifications for URI-based querying and JSON-formatted results make querying and developing applications easy. URI-based querying also allows utilizing distributed datasets easily. Companion mechanisms for querying data snapshots (aka time travel), usage tracking and license management, and verification of semantic equivalence of data are also described. The latter promotes the "What You Expect Is What You Get" (WYEIWYG) principle that can aid in data citation and verification.
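
    A minimal Python sketch of the intermediate "data application" tier described above is given below: a small web service that answers RESTful URI queries against a data store and returns JSON. The route, query parameter, and in-memory records are hypothetical; Earth-Base's real endpoints are not reproduced here.

      from flask import Flask, jsonify, request

      app = Flask(__name__)

      # Stand-in for a real data store queried by the data application tier.
      FOSSIL_OCCURRENCES = [
          {"taxon": "Tyrannosaurus", "interval": "Maastrichtian", "lat": 47.6, "lng": -106.7},
          {"taxon": "Triceratops", "interval": "Maastrichtian", "lat": 44.9, "lng": -104.2},
      ]

      @app.route("/api/v1/occurrences")
      def occurrences():
          # e.g. GET /api/v1/occurrences?interval=Maastrichtian
          interval = request.args.get("interval")
          rows = [r for r in FOSSIL_OCCURRENCES if interval is None or r["interval"] == interval]
          return jsonify({"count": len(rows), "records": rows})

      if __name__ == "__main__":
          app.run(port=5000)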

  4. Open source hardware solutions for low-cost, do-it-yourself environmental monitoring, citizen science, and STEM education

    Science.gov (United States)

    Hicks, S. D.; Aufdenkampe, A. K.; Horsburgh, J. S.; Arscott, D. B.; Muenz, T.; Bressler, D. W.

    2016-12-01

    The explosion in DIY open-source hardware and software has resulted in the development of affordable and accessible technologies, like drones and weather stations, that can greatly assist the general public in monitoring environmental health and its degradation. It is widely recognized that education and support of audiences in pursuit of STEM literacy and the application of emerging technologies is a challenge for the future of citizen science and for preparing high school graduates to be actively engaged in environmental stewardship. It is also clear that detecting environmental change/degradation over time and space will be greatly enhanced with expanded use of networked, remote monitoring technologies by watershed organizations and citizen scientists if data collection and reporting are properly carried out and curated. However, there are few focused efforts to link citizen scientists and school programs with these emerging tools. We have started a multi-year program to develop hardware and teaching materials for training students and citizen scientists about the use of open source hardware in environmental monitoring. Scientists and educators around the world have started building their own dataloggers and devices using a variety of boards based on open source electronics. This new hardware is now providing researchers with an inexpensive alternative to commercial data logging and transmission hardware. We will present a variety of hardware solutions using the Arduino-compatible EnviroDIY Mayfly board (http://envirodiy.org/mayfly) that can be used to build and deploy a rugged environmental monitoring station using a wide variety of sensors and options, giving the users a fully customizable device for making measurements almost anywhere. A database and visualization system is being developed that will allow the users to view and manage the data their devices are collecting. We will also present our plan for developing curricula and leading workshops to various

  5. Open evaluation (OE: A vision for entirely transparent post-publication peer review and rating for science

    Directory of Open Access Journals (Sweden)

    Nikolaus eKriegeskorte

    2012-10-01

    Full Text Available The two major functions of a scientific publishing system are to provide access to and evaluation of scientific papers. While open access (OA) is becoming a reality, open evaluation (OE), the other side of the coin, has received less attention. Evaluation steers the attention of the scientific community and thus the very course of science. It also influences the use of scientific findings in public policy. The current system of scientific publishing provides only journal prestige as an indication of the quality of new papers and relies on a non-transparent and noisy pre-publication peer review process, which delays publication by many months on average. Here I propose an OE system, in which papers are evaluated post-publication in an ongoing fashion by means of open peer review and rating. Through signed ratings and reviews, scientists steer the attention of their field and build their reputation. Reviewers are motivated to be objective, because low-quality or self-serving signed evaluations will negatively impact their reputation. A core feature of this proposal is a division of powers between the accumulation of evaluative evidence and the analysis of this evidence by paper evaluation functions (PEFs). PEFs can be freely defined by individuals or groups (e.g. scientific societies) and provide a plurality of perspectives on the scientific literature. Simple PEFs will use averages of ratings, weighting reviewers (e.g. by H-factor) and rating scales (e.g. by relevance to a decision process) in different ways. Complex PEFs will use advanced statistical techniques to infer the quality of a paper. Papers with initially promising ratings will be more deeply evaluated. The continual refinement of PEFs in response to attempts by individuals to influence evaluations in their own favor will make the system ungameable. OA and OE together have the power to revolutionize scientific publishing and usher in a new culture of transparency, constructive criticism, and
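
    The Python sketch below illustrates a "simple PEF" of the kind described above: a weighted average of signed ratings in which each reviewer's weight is derived from their H-factor. The ratings, weights, and weighting rule are illustrative assumptions, not a prescription from the paper.

      def simple_pef(ratings):
          """ratings: list of (rating_0_to_10, reviewer_h_factor) tuples."""
          total_weight = sum(h for _, h in ratings)
          if total_weight == 0:
              return None
          return sum(r * h for r, h in ratings) / total_weight

      # Hypothetical signed ratings for one paper: (rating, reviewer H-factor).
      signed_ratings = [(8, 25), (6, 40), (9, 12)]
      print(f"paper score: {simple_pef(signed_ratings):.2f}")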

  6. Power Grid:Connecting the world

    Institute of Scientific and Technical Information of China (English)

    Liu Liang; Zhu Li

    2012-01-01

    With the acceleration of global economic integration, the trend has been towards opening up markets. Large enterprises in countries around the world, the developed countries in particular, attach great importance to going abroad, with the aim to optimally allocate energy resources in a wider range. Actively responding to the "going out" strategy of the State, the two giant power grid enterprises in China, State Grid Corporation of China (SGCC) and China Southern Power Grid (CSG), have made plans for grid development in line with the State's energy strategy and global resources allocation.

  7. JGrass-NewAge hydrological system: an open-source platform for the replicability of science.

    Science.gov (United States)

    Bancheri, Marialaura; Serafin, Francesco; Formetta, Giuseppe; Rigon, Riccardo; David, Olaf

    2017-04-01

    JGrass-NewAge is an open source semi-distributed hydrological modelling system. It is based on the object modelling framework (OMS version 3), on the JGrasstools and on the Geotools. OMS3 allows the creation of independent software packages which can be connected at run-time in a working modelling solution. These components are available as library/dependency or as repository to fork in order to add further features. Different tools are adopted to ease the integration, interoperability and use of each package. Most of the components are Gradle integrated, since it represents the state of the art of build systems, especially for Java projects. Continuous integration is a further layer between local source code (client-side) and remote repository (server-side) and ensures the building and the testing of the source code at each commit. Finally, the use of Zenodo makes the code hosted in GitHub unique, citable and traceable, with a defined DOI. Following the previous standards, each part of the hydrological cycle is implemented in JGrass-NewAge as a component that can be selected, adopted, and connected to obtain a user "customized" hydrological model. A variety of modelling solutions are possible, allowing a complete hydrological analysis. Moreover, thanks to the JGrasstools and the Geotools, the visualization of the data and of the results using a selected GIS is possible. After the geomorphological analysis of the watershed, the spatial interpolation of the meteorological inputs can be performed using both deterministic (IDW) and geostatistic (Kriging) algorithms. For the radiation balance, the shortwave and longwave radiation can be estimated, which are, in turn, inputs for the simulation of the evapotranspiration, according to the Priestley-Taylor and Penman-Monteith formulas. Three degree-day models are implemented for snow melting and SWE. The runoff production can be simulated using two different components, "Adige" and "Embedded Reservoirs
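
    As a generic illustration of the deterministic (IDW) interpolation option mentioned above (not the JGrass-NewAge Java/OMS3 implementation), the Python sketch below interpolates a point value from nearby gauges by inverse-distance weighting.

      import math

      def idw(stations, target, power=2.0):
          """stations: list of ((x, y), value); target: (x, y); returns the interpolated value."""
          num, den = 0.0, 0.0
          for (x, y), value in stations:
              d = math.hypot(x - target[0], y - target[1])
              if d == 0.0:
                  return value  # target coincides with a station
              w = 1.0 / d ** power
              num += w * value
              den += w
          return num / den

      # Hypothetical rain gauges: ((x, y) coordinates, rainfall in mm).
      gauges = [((0.0, 0.0), 12.4), ((5.0, 1.0), 9.8), ((2.0, 6.0), 15.1)]
      print(f"interpolated value: {idw(gauges, (2.0, 2.0)):.2f} mm")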

  8. The MammoGrid Project Grids Architecture

    CERN Document Server

    McClatchey, R; Manset, D; Hauer, T; Estrella, F; Saiz, P; Rogulin, D; Clatchey, Richard Mc; Buncic, Predrag; Manset, David; Hauer, Tamas; Estrella, Florida; Saiz, Pablo; Rogulin, Dmitri

    2003-01-01

    The aim of the recently EU-funded MammoGrid project is, in the light of emerging Grid technology, to develop a European-wide database of mammograms that will be used to develop a set of important healthcare applications and investigate the potential of this Grid to support effective co-working between healthcare professionals throughout the EU. The MammoGrid consortium intends to use a Grid model to enable distributed computing that spans national borders. This Grid infrastructure will be used for deploying novel algorithms as software directly developed or enhanced within the project. Using the MammoGrid clinicians will be able to harness the use of massive amounts of medical image data to perform epidemiological studies, advanced image processing, radiographic education and ultimately, tele-diagnosis over communities of medical "virtual organisations". This is achieved through the use of Grid-compliant services [1] for managing (versions of) massively distributed files of mammograms, for handling the distri...

  9. Grid accounting service: state and future development

    Energy Technology Data Exchange (ETDEWEB)

    Levshina, T. [Fermilab; Sehgal, C. [Fermilab; Bockelman, B. [Nebraska U.; Weitzel, D. [Nebraska U.; Guru, A. [Nebraska U.

    2014-01-01

    During the last decade, large-scale federated distributed infrastructures have been continually developed and expanded. One of the crucial components of a cyber-infrastructure is an accounting service that collects data related to resource utilization and identity of users using resources. The accounting service is important for verifying pledged resource allocation per particular groups and users, providing reports for funding agencies and resource providers, and understanding hardware provisioning requirements. It can also be used for end-to-end troubleshooting as well as billing purposes. In this work we describe Gratia, a federated accounting service jointly developed at Fermilab and Holland Computing Center at University of Nebraska-Lincoln. The Open Science Grid, Fermilab, HCC, and several other institutions have used Gratia in production for several years. The current development activities include expanding Virtual Machines provisioning information, XSEDE allocation usage accounting, and Campus Grids resource utilization. We also identify the direction of future work: improvement and expansion of Cloud accounting, persistent and elastic storage space allocation, and the incorporation of WAN and LAN network metrics.

  10. Grid accounting service: state and future development

    Science.gov (United States)

    Levshina, T.; Sehgal, C.; Bockelman, B.; Weitzel, D.; Guru, A.

    2014-06-01

    During the last decade, large-scale federated distributed infrastructures have been continually developed and expanded. One of the crucial components of a cyber-infrastructure is an accounting service that collects data related to resource utilization and identity of users using resources. The accounting service is important for verifying pledged resource allocation per particular groups and users, providing reports for funding agencies and resource providers, and understanding hardware provisioning requirements. It can also be used for end-to-end troubleshooting as well as billing purposes. In this work we describe Gratia, a federated accounting service jointly developed at Fermilab and Holland Computing Center at University of Nebraska-Lincoln. The Open Science Grid, Fermilab, HCC, and several other institutions have used Gratia in production for several years. The current development activities include expanding Virtual Machines provisioning information, XSEDE allocation usage accounting, and Campus Grids resource utilization. We also identify the direction of future work: improvement and expansion of Cloud accounting, persistent and elastic storage space allocation, and the incorporation of WAN and LAN network metrics.
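
    The core of such an accounting service can be sketched as a simple aggregation of raw job records into per-VO totals for reporting; the field names below are invented for illustration and are not the Gratia schema:

      # Hypothetical sketch of a usage-accounting aggregation: roll up raw job
      # records (user, VO, wall hours) into per-VO totals. Not the Gratia schema.
      from collections import defaultdict

      def summarize_usage(records):
          totals = defaultdict(lambda: {"jobs": 0, "wall_hours": 0.0})
          for rec in records:
              vo = rec["vo"]
              totals[vo]["jobs"] += 1
              totals[vo]["wall_hours"] += rec["wall_hours"]
          return dict(totals)

      if __name__ == "__main__":
          sample = [
              {"user": "alice", "vo": "cms", "wall_hours": 4.5},
              {"user": "bob", "vo": "nova", "wall_hours": 12.0},
              {"user": "alice", "vo": "cms", "wall_hours": 2.25},
          ]
          for vo, stats in summarize_usage(sample).items():
              print(vo, stats)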

  11. 77 FR 71169 - Smart Grid Advisory Committee Meeting

    Science.gov (United States)

    2012-11-29

    ... National Institute of Standards and Technology Smart Grid Advisory Committee Meeting AGENCY: National... Smart Grid Advisory Committee (SGAC or Committee), will meet in open session on Tuesday, December 18... NIST Smart Grid Program Plan. The agenda may change to accommodate Committee business. The final agenda...

  12. Triple-layer smart grid business model

    DEFF Research Database (Denmark)

    Ma, Zheng; Lundgaard, Morten; Jørgensen, Bo Nørregaard

    2016-01-01

    Viewing the smart grid with the theory of business models may open opportunities in understanding and capturing values in new markets. This study tries to discover and map the smart grid ecosystem-based business model framework with two different environments (sub-Saharan Africa and Denmark), and identifies the parameters for the smart grid solutions to the emerging markets. This study develops a triple-layer business model including the organizational (Niche), environmental (Intermediate), and global (Dominators) factors. The result uncovers an interface of market factors and stakeholders in a generic smart grid constellation. The findings contribute to the transferability potential of the smart grid solutions between countries, and indicate the potential to export and import smart grid solutions based on the business modeling.

  13. From testbed to reality grid computing steps up a gear

    CERN Multimedia

    2004-01-01

    "UK plans for Grid computing changed gear this week. The pioneering European DataGrid (EDG) project came to a successful conclusion at the end of March, and on 1 April a new project, known as Enabling Grids for E-Science in Europe (EGEE), begins" (1 page)

  14. Vote for the GridCafé!

    CERN Multimedia

    2004-01-01

    CERN's GridCafé website (http://www.gridcafe.org) has been nominated for the 8th Annual Webby Awards, together with four other finalists in the Technical Achievement category. The Webby Awards have been hailed as the "online Oscars" by Time Magazine, and are the leading international honours for websites, so this nomination represents a significant achievement. The winner in this category last year was Google. The GridCafé website, which was launched at Telecom '03 and forms part of the Microcosm exhibit on computing, introduces Grid technology to the general public, and provides information on all major Grid projects around the world, focusing in particular on the pioneering Grid developments being carried out by CERN and its many international partners for the Large Hadron Collider project. Being nominated for a Webby Award represents a great opportunity to draw positive media attention to Grid technology, to CERN and to science in general. Last year's nominees were covered...

  15. Toward a Grid Work flow Formal Composition

    Energy Technology Data Exchange (ETDEWEB)

    Hlaoui, Y. B.; BenAyed, L. J.

    2007-07-01

    This paper presents a new approach to the composition of grid workflow models. The approach proposes an abstract syntax for UML Activity Diagrams (UML-AD) and a formal foundation for grid workflow composition in the form of a workflow algebra based on UML-AD. This composition fulfils the need for collaborative model development, particularly the specification of grid workflow models and the reduction of the complexity of their verification. This complexity has arisen with the increase in scale of grid workflow applications, such as science and e-business applications, since large amounts of computational resources are required and multiple parties may be involved in the development process and in the use of grid workflows. Furthermore, the proposed algebra allows the definition of workflow views, which are useful to limit access to predefined users in order to ensure the security of grid workflow applications. (Author)

  16. An Introduction to Grid Computing Using EGEE

    Science.gov (United States)

    Walsh, John; Coghlan, Brian; Childs, Stephen

    Grid is an evolving and maturing architecture based on several well-established services, including, amongst others, distributed computing, role and group management, distributed data management and Public Key Encryption systems. Currently the largest scientific grid infrastructure is Enabling Grids for E-sciencE (EGEE), comprising approximately 250 sites, ˜50,000 CPUs and tens of petabytes of storage. Moreover, EGEE covers a large variety of scientific disciplines, including astrophysics. The scope of this work is to provide the keen astrophysicist with an introductory overview of the motivations for using the Grid, and of the core production EGEE services and their supporting software and middleware (known as gLite). We present an overview of the available set of commands, tools and portals as used within these Grid communities. In addition, we present the current scheme for supporting MPI programs on these Grids.

  17. Introduction to grid computing

    CERN Document Server

    Magoules, Frederic; Tan, Kiat-An; Kumar, Abhinit

    2009-01-01

    A Thorough Overview of the Next Generation in Computing. Poised to follow in the footsteps of the Internet, grid computing is on the verge of becoming more robust and accessible to the public in the near future. Focusing on this novel, yet already powerful, technology, Introduction to Grid Computing explores state-of-the-art grid projects, core grid technologies, and applications of the grid. After comparing the grid with other distributed systems, the book covers two important aspects of a grid system: scheduling of jobs, and resource discovery and monitoring in the grid. It then discusses existing a

  18. Smart Grid Special; Smart Grid Special

    Energy Technology Data Exchange (ETDEWEB)

    Mokoginta, L. [Energiecooperatie ' Wij Krijgen Kippen' , Amsterdam (Netherlands); Messing, M. [Stichting Energietransitie Nederland, Boxtel (Netherlands); Slootweg, H. [Technische Universiteit Eindhoven TUE, Eindhoven (Netherlands); Van der Steen, L.; Brugman, L. [SquareWise, Amsterdam (Netherlands); Bles, M.; Blom, M. [CE Delft, Delft (Netherlands); Nachtegaal, H.; Hoekstra, R. [Bijl partners in public relations, Rotterdam (Netherlands); Van Zutphen, M. [CapGemini, Utrecht (Netherlands); Bakker, D. [PNO Consultants, Schiphol (Netherlands); Van Leeuwen, M. [Norton Rose, Amsterdam (Netherlands); Van Vlerken, J.; De Leeuw, M.; Wijnants, H.J.; Holwerda, B.; Bosch, N.

    2012-06-15

    A series of 17 articles is dedicated to various aspects of smart grids: expert opinions, the key role of smart grids in a sustainable energy transition, the role of the energy consumer and the grid operators, an energy transition project in the south of Amsterdam (Netherlands), the need for collaboration (e.g. through the Smart Energy Collective), the establishment of local energy cooperatives, the question whether smart grids are a hype or a necessity, costs and benefits of smart grids, deployment of intelligent smart grids in business areas (experimental areas), the opportunity of deploying Direct Current (DC) grids for an improved energy balance, the Smart Power City Apeldoorn project (SPCA), the experimental area of CloudPower on the isle of Texel, innovation contracts for smart grids, the increase of local, small-scale electricity production, and smart grid pilot projects in Europe.

  19. Grid Technologies for Virtual Laboratories in Engineering Education

    Directory of Open Access Journals (Sweden)

    Christian Schmid

    2008-02-01

    Full Text Available In this paper, Grid technologies are introduced to build e-Learning environments for engineering education. Service-oriented Grids open new fields of applications, the Learning Grids. The learning services concept based on a learning model, and its deployment through Grid technologies, are excellent means to integrate virtual laboratories into e-Learning environments for engineering education. The paper goes into the most important technical details, introduces the learning model used, and shows the authoring of Grid resources for virtual laboratories. Examples from a virtual laboratory demonstrate the advantages of a Grid.

  20. FIFE-Jobsub: a grid submission system for intensity frontier experiments at Fermilab

    Science.gov (United States)

    Box, Dennis

    2014-06-01

    The Fermilab Intensity Frontier Experiments use an integrated submission system known as FIFE-jobsub, part of the FIFE (Fabric for Frontier Experiments) initiative, to submit batch jobs to the Open Science Grid. FIFE-jobsub eases the burden on experimenters by integrating data transfer and site selection details in an easy to use and well-documented format. FIFE-jobsub automates tedious details of maintaining grid proxies for the lifetime of the grid job. Data transfer is handled using the Intensity Frontier Data Handling Client (IFDHC) [1] tool suite, which facilitates selecting the appropriate data transfer method from many possibilities while protecting shared resources from overload. Chaining of job dependencies into Directed Acyclic Graphs (Condor DAGS) is well supported and made easier through the use of input flags and parameters.
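
    The chaining of job dependencies into a directed acyclic graph can be sketched with the small helper below; it emits a minimal HTCondor DAG description and is a hypothetical illustration, not part of FIFE-jobsub itself (the job and submit-file names are invented):

      # Hypothetical helper that writes a minimal HTCondor DAG description chaining
      # a staging job, a reconstruction job and a merge job; not FIFE-jobsub code.
      def write_dag(path, jobs, edges):
          """jobs: {name: submit_file}; edges: [(parent, child), ...]"""
          lines = [f"JOB {name} {submit_file}" for name, submit_file in jobs.items()]
          lines += [f"PARENT {parent} CHILD {child}" for parent, child in edges]
          with open(path, "w") as fh:
              fh.write("\n".join(lines) + "\n")

      if __name__ == "__main__":
          write_dag(
              "workflow.dag",
              jobs={"stage_data": "stage.sub", "reco": "reco.sub", "merge": "merge.sub"},
              edges=[("stage_data", "reco"), ("reco", "merge")],
          )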

  1. Open3DQSAR

    DEFF Research Database (Denmark)

    Tosco, Paolo; Balle, Thomas

    2011-01-01

    Open3DQSAR is a freely available open-source program aimed at chemometric analysis of molecular interaction fields. MIFs can be imported from different sources (GRID, CoMFA/CoMSIA, quantum-mechanical electrostatic potential or electron density grids) or generated by Open3DQSAR itself. Much focus has been put on automation through the implementation of a scriptable interface, as well as on high computational performance achieved by algorithm parallelization. Flexibility and interoperability with existing molecular modeling software make Open3DQSAR a powerful tool in pharmacophore assessment...

  2. Utilizing Public Access Data and Open Source Statistical Programs to Teach Climate Science to Interdisciplinary Undergraduate Students

    Science.gov (United States)

    Collins, L.

    2014-12-01

    Students in the Environmental Studies major at the University of Southern California fulfill their curriculum requirements by taking a broad range of courses in the social and natural sciences. Climate change is often taught in 1-2 lectures in these courses with limited examination of this complex topic. Several upper division elective courses focus on the science, policy, and social impacts of climate change. In an upper division course focused on the scientific tools used to determine paleoclimate and predict future climate, I have developed a project where students download, manipulate, and analyze data from the National Climatic Data Center. Students are required to download 100 or more years of daily temperature records and use the statistical program R to analyze that data, calculating daily, monthly, and yearly temperature averages along with changes in the number of extreme hot or cold days (≥90˚F and ≤30˚F, respectively). In parallel, they examine population growth, city expansion, and changes in transportation looking for correlations between the social data and trends observed in the temperature data. Students examine trends over time to determine correlations to urban heat island effect. This project exposes students to "real" data, giving them the tools necessary to critically analyze scientific studies without being experts in the field. Utilizing the existing, public, online databases provides almost unlimited, free data. Open source statistical programs provide a cost-free platform for examining the data although some in-class time is required to help students navigate initial data importation and analysis. Results presented will highlight data compiled over three years of course projects.
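
    An analogous analysis can be sketched in Python/pandas (the course itself uses R); the column names "DATE", "TMAX" and "TMIN", in degrees Fahrenheit, are assumptions for illustration rather than a specific NCDC export schema:

      # Compute yearly mean maximum temperature and count extreme days from daily
      # records. In practice the frame would come from pd.read_csv() on a downloaded
      # station file; a tiny inline frame is used here so the sketch is self-contained.
      import pandas as pd

      daily = pd.DataFrame({
          "DATE": pd.to_datetime(["1950-01-15", "1950-07-04", "2010-01-15", "2010-07-04"]),
          "TMAX": [41.0, 95.0, 45.0, 101.0],
          "TMIN": [22.0, 68.0, 28.0, 74.0],
      })
      daily["year"] = daily["DATE"].dt.year

      yearly = daily.groupby("year").agg(
          mean_tmax=("TMAX", "mean"),
          hot_days=("TMAX", lambda t: int((t >= 90).sum())),   # days at or above 90 F
          cold_days=("TMIN", lambda t: int((t <= 30).sum())),  # days at or below 30 F
      )
      print(yearly)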

  3. International outreach for promoting open geoscience content in Finnish university libraries - libraries as the advocates of citizen science awareness on emerging open geospatial data repositories in Finnish society

    Science.gov (United States)

    Rousi, A. M.; Branch, B. D.; Kong, N.; Fosmire, M.

    2013-12-01

    In their Finnish National Spatial Strategy 2010-2015, Finland's Ministry of Agriculture and Forestry delineated, for example, that spatial data skills should support citizens' everyday activities and facilitate decision-making and the participation of citizens. Studies also predict that open data, particularly open spatial data, would create, when their potential is fully realized, a 15% increase in the turnover of Finnish private sector companies. Finnish libraries have a long tradition of serving at the heart of the Finnish information society. However, regarding the emerging possibilities of educating their users on open spatial data, very few initiatives have been made. The National Survey of Finland opened its data in 2012. Finnish technology university libraries, such as Aalto University Library, are open environments for all citizens, and seem suitable to be the first thriving entities in educating citizens on open geospatial data. There are, however, many obstacles to overcome, such as a lack of knowledge about policies, a lack of understanding of geospatial data services and insufficient know-how of GIS software among the personnel. This framework examines the benefits derived from an international collaboration between Purdue University Libraries and Aalto University Library to create local strategies for implementing open spatial data education initiatives in Aalto University Library's context. The results of this international collaboration are explicated for the benefit of the field as a whole.

  4. Social Support Network for the Elderly Attending the Open University Program for Senior Citizens at the School of Arts, Sciences and Humanities, University of Sao Paulo, Brazil

    Science.gov (United States)

    Domingues, Marisa Accioly; Ordonez, Tiago Nascimento; Lima-Silva, Thais Bento; Torres, Maria Juliana; de Barros, Thabata Cruz; Cachioni, Meire

    2013-01-01

    This study describes the social support network of older adults enrolled in the Open University for Senior Citizens at the School of Arts, Sciences and Humanities, University of Sao Paulo. A cross-sectional study was conducted with a sample of 117 elderly or older adults, mostly female (78%), married (53%), retired (82%), and aged on average…

  5. Data correction on July 4th, 2012 - Open TG-GATEs | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available: LSDB Archive record describing the data correction of July 4th, 2012 to the Open TG-GATEs database; the corrected biochemistry data file open_tggates_biochemistry.zip (666 KB) and CEL file attachments are available for simple search and download from the archive.

  6. The Scholarly Communication Speed of Library and Information Science Open Access Journals as Measured by First-Citation

    Directory of Open Access Journals (Sweden)

    Tai-Chi Yu

    2016-06-01

    Full Text Available Based on the time of first-citation appearance for journal articles, this study analyzed the citation speeds of Open Access (OA) journals within the Library and Information Science (LIS) field indexed in the Scopus and WoS databases. Articles published between 2010 and 2014 by a total of 8 Full-OA journals and 13 Hybrid-OA journals indexed by the 2010-2013 editions of JCR were collected and analyzed in June 2015. Results showed that 639 articles were cited in Scopus, with an average first-citation speed of 1.17 years. On the other hand, 434 articles were cited in WoS, with a slightly longer first-citation time of 1.37 years. Most of the articles studied were cited for the first time in the same year as, or the year after, their publication. Some articles were cited even before their official publication. Within the Hybrid-OA journals, articles published under the OA mechanism did have shorter first-citation times than non-OA ones. This study suggests that further studies could adopt the concept of altmetrics to investigate first-usage speeds through formal and informal communication channels. [Article content in Chinese]

  7. The TimeStudio Project: An open source scientific workflow system for the behavioral and brain sciences.

    Science.gov (United States)

    Nyström, Pär; Falck-Ytter, Terje; Gredebäck, Gustaf

    2016-06-01

    This article describes a new open source scientific workflow system, the TimeStudio Project, dedicated to the behavioral and brain sciences. The program is written in MATLAB and features a graphical user interface for the dynamic pipelining of computer algorithms developed as TimeStudio plugins. TimeStudio includes both a set of general plugins (for reading data files, modifying data structures, visualizing data structures, etc.) and a set of plugins specifically developed for the analysis of event-related eyetracking data as a proof of concept. It is possible to create custom plugins to integrate new or existing MATLAB code anywhere in a workflow, making TimeStudio a flexible workbench for organizing and performing a wide range of analyses. The system also features an integrated sharing and archiving tool for TimeStudio workflows, which can be used to share workflows both during the data analysis phase and after scientific publication. TimeStudio thus facilitates the reproduction and replication of scientific studies, increases the transparency of analyses, and reduces individual researchers' analysis workload. The project website ( http://timestudioproject.com ) contains the latest releases of TimeStudio, together with documentation and user forums.
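
    The dynamic pipelining idea can be illustrated with a language-agnostic sketch (TimeStudio itself is written in MATLAB; the plugin names here are invented). A workflow is just an ordered list of plugin names, which is also what makes it easy to share and re-run:

      # Illustrative plugin-pipeline pattern, not TimeStudio code.
      REGISTRY = {}

      def plugin(name):
          def register(func):
              REGISTRY[name] = func
              return func
          return register

      @plugin("read_data")
      def read_data(data):
          data["samples"] = [0.1, 0.4, 0.35, 0.9]
          return data

      @plugin("baseline_correct")
      def baseline_correct(data):
          base = min(data["samples"])
          data["samples"] = [s - base for s in data["samples"]]
          return data

      @plugin("summarize")
      def summarize(data):
          data["mean"] = sum(data["samples"]) / len(data["samples"])
          return data

      def run_workflow(step_names, data=None):
          data = data or {}
          for name in step_names:   # steps are resolved by name at run time
              data = REGISTRY[name](data)
          return data

      if __name__ == "__main__":
          workflow = ["read_data", "baseline_correct", "summarize"]  # shareable description
          print(run_workflow(workflow))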

  8. Update History of This Database - Open TG-GATEs Pathological Image Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available: Update history of the Open TG-GATEs Pathological Image Database (LSDB Archive). 2012/05/24: the English archive site of the Open TG-GATEs Pathological Image Database was opened. 2012/03/30: Open TG-GATEs Pathologica...

  9. Bayesian grid matching

    DEFF Research Database (Denmark)

    Hartelius, Karsten; Carstensen, Jens Michael

    2003-01-01

    A method for locating distorted grid structures in images is presented. The method is based on the theories of template matching and Bayesian image restoration. The grid is modeled as a deformable template. Prior knowledge of the grid is described through a Markov random field (MRF) model which ... nodes and the arc prior models variations in row and column spacing across the grid. Grid matching is done by placing an initial rough grid over the image and applying an ensemble annealing scheme to maximize the posterior distribution of the grid. The method can be applied to noisy images with missing...
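
    The posterior maximization described above fits a generic Bayesian template (our own notation, consistent with the abstract rather than quoted from the paper):

        p(g \mid I) \propto p(I \mid g)\, p(g), \qquad \hat{g} = \arg\max_{g} \, p(g \mid I),

    where g collects the grid node positions, p(g) is the MRF prior coupling neighbouring nodes, p(I \mid g) is the template-matching likelihood of the image I, and the annealing scheme maximizes the posterior by sampling from p(g \mid I)^{1/T_k} under a decreasing temperature schedule T_k.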

  10. Open Access am Standort D – erweiterte Perspektiven für die Wissenschaft / Open Access in Germany – new prospects for science and scholarship

    Directory of Open Access Journals (Sweden)

    Ilg-Hartbecke, Karin

    2009-06-01

    Full Text Available The increasing implementation of the Open Access idea has given rise to new scholarly information-supply and communication models in both the German and international research landscapes. For example, over half the research-oriented universities in Germany now operate their own institutional repository. In the long term, Open Access repositories represent an ideal basic infrastructure for handling scholarly publications since they offer worldwide accessibility, enhanced visibility and value-added services such as disciplinary and cross-disciplinary search options, usage statistics and citation analysis. This contribution provides an up-to-date overview and describes not only the current framework conditions and developments but also the barriers to development in the German repository landscape. It also deals with initiatives designed to introduce publishers, authors and general information providers to Open Access practices.

  11. The Role of Citizen Science in Risk Mitigation and Disaster Response: A Case Study of 2015 Nepalese Earthquake Using OpenStreetMap

    Science.gov (United States)

    Rieger, C.; Byrne, J. M.

    2015-12-01

    Citizen science includes networks of ordinary people acting as sensors, observing and recording information for science. OpenStreetMap is one such sensor network which empowers citizens to collaboratively produce a global picture from free geographic information. The success of this open source software is extended by the development of freely used open databases for the user community. Participating citizens do not require a high level of skill. Final results are processed by professionals following quality assurance protocols before map information is released. OpenStreetMap is not only the cheapest source of timely maps in many cases but also often the only source. This is particularly true in developing countries. Emergency responses to the recent earthquake in Nepal illustrates the value for rapidly updated geographical information. This includes emergency management, damage assessment, post-disaster response, and future risk mitigation. Local disaster conditions (landslides, road closings, bridge failures, etc.) were documented for local aid workers by citizen scientists working remotely. Satellites and drones provide digital imagery of the disaster zone and OpenStreetMap participants shared the data from locations around the globe. For the Nepal earthquake, OpenStreetMap provided a team of volunteers on the ground through their Humanitarian OpenStreetMap Team (HOT) which contribute data to the disaster response through smartphones and laptops. This, combined with global citizen science efforts, provided immediate geographically useful maps to assist aid workers, including the Red Cross and Canadian DART Team, and the Nepalese government. As of August 2014, almost 1.7 million users provided over 2.5 billion edits to the OpenStreetMap map database. Due to the increased usage of smartphones, GPS-enabled devices, and the growing participation in citizen science projects, data gathering is proving an effective way to contribute as a global citizen. This paper

  12. Mapping of grid faults and grid codes

    DEFF Research Database (Denmark)

    Iov, F.; Hansen, Anca Daniela; Sørensen, Poul Ejnar

    The present report is a part of the research project ''Grid fault and design basis for wind turbine'' supported by Energinet.dk through the grant PSO F&U 6319. The objective of this project is to investigate into the consequences of the new grid connection requirements for the fatigue and extreme ... for such investigations. The grid connection requirements for wind turbines have increased significantly during the last 5-10 years. Especially the requirements for wind turbines to stay connected to the grid during and after voltage sags imply potential challenges in the design of wind turbines. These requirements pose challenges for the design of both the electrical system and the mechanical structure of wind turbines. An overview over the frequency of grid faults and the grid connection requirements in different relevant countries is done in this report. The most relevant study cases for the quantification of the loads ...

  13. Mapping of grid faults and grid codes

    DEFF Research Database (Denmark)

    Iov, Florin; Hansen, A.D.; Sørensen, P.

    The present report is a part of the research project "Grid fault and design basis for wind turbine" supported by Energinet.dk through the grant PSO F&U 6319. The objective of this project is to investigate into the consequences of the new grid connection requirements for the fatigue and extreme ... for such investigations. The grid connection requirements for wind turbines have increased significantly during the last 5-10 years. Especially the requirements for wind turbines to stay connected to the grid during and after voltage sags imply potential challenges in the design of wind turbines. These requirements pose challenges for the design of both the electrical system and the mechanical structure of wind turbines. An overview over the frequency of grid faults and the grid connection requirements in different relevant countries is done in this report. The most relevant study cases for the quantification of the loads...

  14. 国际开放科学数据实证资源及利用研究%Empirical Resources and Utilization of International Open Science Data

    Institute of Scientific and Technical Information of China (English)

    熊易

    2016-01-01

    Scientific data information resources are an important research output and an indispensable part of scientific research. This article explores the empirical resources and utilization of international open science data from many aspects, including an overview of open science data and its significance, and the "UN Data", "OECD Library", "IMF database" and "Open Data for Africa" empirical resources and their use.

  15. Building a multi-scaled geospatial temporal ecology database from disparate data sources: fostering open science and data reuse.

    Science.gov (United States)

    Soranno, Patricia A; Bissell, Edward G; Cheruvelil, Kendra S; Christel, Samuel T; Collins, Sarah M; Fergus, C Emi; Filstrup, Christopher T; Lapierre, Jean-Francois; Lottig, Noah R; Oliver, Samantha K; Scott, Caren E; Smith, Nicole J; Stopyak, Scott; Yuan, Shuai; Bremigan, Mary Tate; Downing, John A; Gries, Corinna; Henry, Emily N; Skaff, Nick K; Stanley, Emily H; Stow, Craig A; Tan, Pang-Ning; Wagner, Tyler; Webster, Katherine E

    2015-01-01

    Although there are considerable site-based data for individual or groups of ecosystems, these datasets are widely scattered, have different data formats and conventions, and often have limited accessibility. At the broader scale, national datasets exist for a large number of geospatial features of land, water, and air that are needed to fully understand variation among these ecosystems. However, such datasets originate from different sources and have different spatial and temporal resolutions. By taking an open-science perspective and by combining site-based ecosystem datasets and national geospatial datasets, science gains the ability to ask important research questions related to grand environmental challenges that operate at broad scales. Documentation of such complicated database integration efforts, through peer-reviewed papers, is recommended to foster reproducibility and future use of the integrated database. Here, we describe the major steps, challenges, and considerations in building an integrated database of lake ecosystems, called LAGOS (LAke multi-scaled GeOSpatial and temporal database), that was developed at the sub-continental study extent of 17 US states (1,800,000 km(2)). LAGOS includes two modules: LAGOSGEO, with geospatial data on every lake with surface area larger than 4 ha in the study extent (~50,000 lakes), including climate, atmospheric deposition, land use/cover, hydrology, geology, and topography measured across a range of spatial and temporal extents; and LAGOSLIMNO, with lake water quality data compiled from ~100 individual datasets for a subset of lakes in the study extent (~10,000 lakes). Procedures for the integration of datasets included: creating a flexible database design; authoring and integrating metadata; documenting data provenance; quantifying spatial measures of geographic data; quality-controlling integrated and derived data; and extensively documenting the database. Our procedures make a large, complex, and integrated
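
    The two-module design can be illustrated by linking per-lake geospatial context to water quality observations through a common lake identifier; the table and column names below are invented for illustration and do not reflect the actual LAGOS schema:

      # Toy illustration of joining a geospatial module to a limnological module.
      import pandas as pd

      geo = pd.DataFrame({
          "lake_id": [1, 2, 3],
          "surface_area_ha": [12.5, 480.0, 55.2],
          "pct_agriculture": [0.40, 0.05, 0.72],   # land use/cover in the watershed
      })
      limno = pd.DataFrame({
          "lake_id": [1, 1, 3],
          "sample_year": [2003, 2010, 2008],
          "total_phosphorus_ugL": [35.0, 28.0, 61.0],
      })

      # A left join keeps every geospatial record even when no water quality data
      # exist, mirroring the fact that the limnological module covers only a subset
      # of the lakes in the geospatial module.
      merged = geo.merge(limno, on="lake_id", how="left")
      print(merged)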

  16. Building a multi-scaled geospatial temporal ecology database from disparate data sources: Fostering open science through data reuse

    Science.gov (United States)

    Soranno, Patricia A.; Bissell, E.G.; Cheruvelil, Kendra S.; Christel, Samuel T.; Collins, Sarah M.; Fergus, C. Emi; Filstrup, Christopher T.; Lapierre, Jean-Francois; Lotting, Noah R.; Oliver, Samantha K.; Scott, Caren E.; Smith, Nicole J.; Stopyak, Scott; Yuan, Shuai; Bremigan, Mary Tate; Downing, John A.; Gries, Corinna; Henry, Emily N.; Skaff, Nick K.; Stanley, Emily H.; Stow, Craig A.; Tan, Pang-Ning; Wagner, Tyler; Webster, Katherine E.

    2015-01-01

    Although there are considerable site-based data for individual or groups of ecosystems, these datasets are widely scattered, have different data formats and conventions, and often have limited accessibility. At the broader scale, national datasets exist for a large number of geospatial features of land, water, and air that are needed to fully understand variation among these ecosystems. However, such datasets originate from different sources and have different spatial and temporal resolutions. By taking an open-science perspective and by combining site-based ecosystem datasets and national geospatial datasets, science gains the ability to ask important research questions related to grand environmental challenges that operate at broad scales. Documentation of such complicated database integration efforts, through peer-reviewed papers, is recommended to foster reproducibility and future use of the integrated database. Here, we describe the major steps, challenges, and considerations in building an integrated database of lake ecosystems, called LAGOS (LAke multi-scaled GeOSpatial and temporal database), that was developed at the sub-continental study extent of 17 US states (1,800,000 km2). LAGOS includes two modules: LAGOSGEO, with geospatial data on every lake with surface area larger than 4 ha in the study extent (~50,000 lakes), including climate, atmospheric deposition, land use/cover, hydrology, geology, and topography measured across a range of spatial and temporal extents; and LAGOSLIMNO, with lake water quality data compiled from ~100 individual datasets for a subset of lakes in the study extent (~10,000 lakes). Procedures for the integration of datasets included: creating a flexible database design; authoring and integrating metadata; documenting data provenance; quantifying spatial measures of geographic data; quality-controlling integrated and derived data; and extensively documenting the database. Our procedures make a large, complex, and integrated

  17. Defending the Power Grid from Hackers

    Energy Technology Data Exchange (ETDEWEB)

    Eber, Kevin

    2016-10-01

    A new initiative underway at the National Renewable Energy Laboratory is intended to prevent hackers from gaining control of parts of the nation's power grid, potentially damaging electrical equipment and causing localized power outages. Our nation's power grid is evolving to be more responsive to changing power needs, more able to integrate renewable energy, more efficient, and more reliable. One key element of this evolution is adding communication and control devices to the power grid, closer to the end user, so that utilities have greater situational awareness of the grid and can respond quickly to disturbances. But these new devices and their communications requirements can also open up the power grid to potential cyber attacks.

  18. A pressure drop model for PWR grids

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Dong Seok; In, Wang Ki; Bang, Je Geon; Jung, Youn Ho; Chun, Tae Hyun [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1998-12-31

    A single-phase pressure drop model for PWR grids with and without a mixing device is proposed, based on a fluid-mechanistic approach. The total pressure loss is expressed additively in terms of form and frictional losses. General friction factor correlations and form drag coefficients available in the open literature are used in the model. As a result, the model shows better predictions than existing ones for non-mixing grids, and reasonable agreement with the available experimental data for mixing grids. It is therefore concluded that the proposed pressure drop model can provide a sufficiently good approximation for grid optimization and design calculations in advanced grid development. 7 refs., 3 figs., 3 tabs. (Author)
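
    The additive decomposition described above can be written in a generic single-phase form (a standard spacer-grid formulation consistent with the abstract; the symbols are ours and not necessarily the paper's exact correlations):

        \Delta P_{total} = \Delta P_{form} + \Delta P_{friction}, \qquad \Delta P_{form} = K_{grid}\,\tfrac{1}{2}\rho V^2, \qquad \Delta P_{friction} = f\,\frac{L}{D_h}\,\tfrac{1}{2}\rho V^2,

    where K_{grid} is the form (drag) loss coefficient of the grid, f a friction factor correlation, L the wetted length, D_h the hydraulic diameter, \rho the coolant density and V the bundle-average axial velocity.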

  19. Parallel grid population

    Science.gov (United States)

    Wald, Ingo; Ize, Santiago

    2015-07-28

    Parallel population of a grid with a plurality of objects using a plurality of processors. One example embodiment is a method for parallel population of a grid with a plurality of objects using a plurality of processors. The method includes a first act of dividing a grid into n distinct grid portions, where n is the number of processors available for populating the grid. The method also includes acts of dividing a plurality of objects into n distinct sets of objects, assigning a distinct set of objects to each processor such that each processor determines by which distinct grid portion(s) each object in its distinct set of objects is at least partially bounded, and assigning a distinct grid portion to each processor such that each processor populates its distinct grid portion with any objects that were previously determined to be at least partially bounded by its distinct grid portion.
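
    The two-phase scheme described above can be sketched as follows (illustrative only; a one-dimensional "grid" of column strips and interval "objects" stand in for the general case): phase 1 determines, per object set, which grid portions each object overlaps; phase 2 has each processor populate its own grid portion from those determinations.

      # Minimal sketch of two-phase parallel grid population with local worker processes.
      from concurrent.futures import ProcessPoolExecutor

      N = 4  # number of processors / grid portions; the grid is split into N strips of [0, 1)

      def portions_for_object(obj):
          # obj is an interval (lo, hi) on [0, 1); return every strip index it overlaps.
          lo, hi = obj
          first = min(int(lo * N), N - 1)
          last = min(int(hi * N), N - 1)
          return set(range(first, last + 1))

      def phase1(object_set):
          # For every object in this processor's set, record the portions bounding it.
          return [(obj, portions_for_object(obj)) for obj in object_set]

      def phase2(args):
          portion, tagged_objects = args
          # Populate this portion with every object previously tagged as overlapping it.
          return [obj for obj, portions in tagged_objects if portion in portions]

      if __name__ == "__main__":
          objects = [(0.05, 0.10), (0.20, 0.55), (0.60, 0.61), (0.70, 0.95), (0.48, 0.52)]
          object_sets = [objects[i::N] for i in range(N)]   # divide objects into N distinct sets
          with ProcessPoolExecutor(max_workers=N) as pool:
              tagged = [t for part in pool.map(phase1, object_sets) for t in part]
              populated = list(pool.map(phase2, [(p, tagged) for p in range(N)]))
          for p, objs in enumerate(populated):
              print(f"portion {p}: {objs}")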

  20. Just Roll with It? Rolling Volumes vs. Discrete Issues in Open Access Library and Information Science Journals

    Directory of Open Access Journals (Sweden)

    Jill Cirasella

    2013-08-01

    Full Text Available INTRODUCTION Articles in open access (OA journals can be published on a rolling basis, as they become ready, or in complete, discrete issues. This study examines the prevalence of and reasons for rolling volumes vs. discrete issues among scholarly OA library and information science (LIS journals based in the United States. METHODS A survey was distributed to journal editors, asking them about their publication model and their reasons for and satisfaction with that model. RESULTS Of the 21 responding journals, 12 publish in discrete issues, eight publish in rolling volumes, and one publishes in rolling volumes with an occasional special issue. Almost all editors, regardless of model, cited ease of workflow as a justification for their chosen publication model, suggesting that there is no single best workflow for all journals. However, while all rolling-volume editors reported being satisfied with their model, satisfaction was less universal among discrete-issue editors. DISCUSSION The unexpectedly high number of rolling-volume journals suggests that LIS journal editors are making forward-looking choices about publication models even though the topic has not been much addressed in the library literature. Further research is warranted; possibilities include expanding the study’s geographic scope, broadening the study to other disciplines, and investigating publication model trends across the entire scholarly OA universe. CONCLUSION Both because satisfaction is high among editors of rolling-volume journals and because readers and authors appreciate quick publication times, the rolling-volume model will likely become even more prevalent in coming years.

  1. Using a Massive Open Online Course (MOOC) for Earth Science Education: Who Did We Teach and What Did We Learn?

    Science.gov (United States)

    Gold, Anne; Gordon, Eric

    2016-04-01

    Over the last decade, Massive Open Online Courses (MOOCs) have rapidly gained traction as a way to provide virtually anyone with an internet connection free access to a broad variety of high-quality college-level courses. That means Earth science instructors can now teach courses that reach tens of thousands of students--an incredible opportunity, but one that also poses many novel challenges. In April 2015, we used the Coursera platform to run a MOOC entitled "Water in the Western United States," to deliver a survey course of broad interest and partly as a venue to make research efforts accessible to a wide audience. Leveraging a previous online course run on a smaller MOOC platform (Canvas), we created a course largely based on short expert video lectures tied together by various types of assessments. Over a dozen experts provided short lectures offering a survey course that touches on the social, legal, natural, and societal aspects of the topic. This style of MOOC, in which the content is not delivered by one expert but by many, helped us showcase the breadth of available expertise both at the University of Colorado and elsewhere. In this presentation we will discuss the challenges that arose from planning a MOOC with no information about the characteristics of the student body, teaching thousands of unidentified students, and understanding the nature of online learning in an increasingly mobile-dominated world. We will also discuss the opportunities a MOOC offers for changes in undergraduate education, sharing across campuses or even across levels, and promoting flipped classroom-style learning. Finally, we will describe the general characteristics of our MOOC student body and describe lessons learned from our experience while aiming to place the MOOC experience into a larger conversation about the future of education at multiple levels.

  2. Grid-based Visualization Framework

    Science.gov (United States)

    Thiebaux, M.; Tangmunarunkit, H.; Kesselman, C.

    2003-12-01

    Advances in science and engineering have put high demands on tools for high-performance large-scale visual data exploration and analysis. For example, earthquake scientists can now study earthquake phenomena from first principle physics-based simulations. These simulations can generate large amounts of data, possibly high spatial resolution, and long time series. Single-system visualization software running on commodity machines cannot scale up to the large amounts of data generated by these simulations. To address this problem, we propose a flexible and extensible Grid-based visualization framework for time-critical, interactively controlled visual browsing of spatially and temporally large datasets in a Grid environment. Our framework leverages Grid resources for scalable computation and data storage to maintain performance and interactivity with large visualization jobs. Our framework utilizes Globus Toolkit 2.4 components for security (i.e., GSI), resource allocation and management (i.e., DUROC, GRAM) and communication (i.e., Globus-IO) to couple commodity desktops with remote, scalable storage and computational resources in a Grid for interactive data exploration. There are two major components in this framework---Grid Data Transport (GDT) and the Grid Visualization Utility (GVU). GDT provides libraries for performing parallel data filtering and parallel data exchange among Grid resources. GDT allows arbitrary data filtering to be integrated into the system. It also facilitates multi-tiered pipeline topology construction of compute resources and displays. In addition to scientific visualization applications, GDT can be used to support other applications that require parallel processing and parallel transfer of partial ordered independent files, such as file-set transfer. On top of GDT, we have developed the Grid Visualization Utility (GVU), which is designed to assist visualization dataset management, including file formatting, data transport and automatic

  3. Smart grid security

    CERN Document Server

    Goel, Sanjay; Papakonstantinou, Vagelis; Kloza, Dariusz

    2015-01-01

    This book on smart grid security is meant for a broad audience from managers to technical experts. It highlights security challenges that are faced in the smart grid as we widely deploy it across the landscape. It starts with a brief overview of the smart grid and then discusses some of the reported attacks on the grid. It covers network threats, cyber physical threats, smart metering threats, as well as privacy issues in the smart grid. Along with the threats the book discusses the means to improve smart grid security and the standards that are emerging in the field. The second part of the b

  4. Wireless communications networks for the smart grid

    CERN Document Server

    Ho, Quang-Dung; Rajalingham, Gowdemy; Le-Ngoc, Tho

    2014-01-01

    This brief presents a comprehensive review of the network architecture and communication technologies of the smart grid communication network (SGCN). It then studies the strengths, weaknesses and applications of two promising wireless mesh routing protocols that could be used to implement the SGCN. Packet transmission reliability, latency and robustness of these two protocols are evaluated and compared by simulations in various practical SGCN scenarios. Finally, technical challenges and open research opportunities of the SGCN are addressed. Wireless Communications Networks for Smart Grid provi

  5. Proposal for grid computing for nuclear applications

    Energy Technology Data Exchange (ETDEWEB)

    Idris, Faridah Mohamad; Ismail, Saaidi; Haris, Mohd Fauzi B.; Sulaiman, Mohamad Safuan B.; Aslan, Mohd Dzul Aiman Bin.; Samsudin, Nursuliza Bt.; Ibrahim, Maizura Bt.; Ahmad, Megat Harun Al Rashid B. Megat; Yazid, Hafizal B.; Jamro, Rafhayudi B.; Azman, Azraf B.; Rahman, Anwar B. Abdul; Ibrahim, Mohd Rizal B. Mamat; Muhamad, Shalina Bt. Sheik; Hassan, Hasni [Malaysian Nuclear Agency, Bangi, 43000 Kajang, Selangor (Malaysia); Abdullah, Wan Ahmad Tajuddin Wan; Ibrahim, Zainol Abidin; Zolkapli, Zukhaimira; Anuar, Afiq Aizuddin; Norjoharuddeen, Nurfikri [Physics Department, University of Malaya, 56003 Kuala Lumpur (Malaysia); and others

    2014-02-12

    The use of computer clusters for the computational sciences, including computational physics, is vital as it provides the computing power to crunch big numbers at a faster rate. In compute-intensive applications that require high resolution, such as Monte Carlo simulation, the use of computer clusters in a grid form, which supplies computational power to any node within the grid that needs it, has now become a necessity. In this paper, we describe how clusters running a specific application can use resources within the grid to run the application and speed up the computing process.
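
    The reason grid and cluster resources help such workloads is that Monte Carlo sampling is embarrassingly parallel: chunks of samples can be farmed out to whatever nodes have spare capacity and summed afterwards. The sketch below illustrates the idea with local worker processes standing in for grid nodes (it is not the authors' code):

      # Illustrative parallel Monte Carlo: estimate pi by splitting the sampling
      # across workers and combining the partial counts.
      import random
      from concurrent.futures import ProcessPoolExecutor

      def count_hits(args):
          n_samples, seed = args
          rng = random.Random(seed)
          # Count points falling inside the unit quarter circle.
          return sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0 for _ in range(n_samples))

      if __name__ == "__main__":
          n_workers, n_per_worker = 8, 200_000
          with ProcessPoolExecutor(max_workers=n_workers) as pool:
              hits = sum(pool.map(count_hits, [(n_per_worker, seed) for seed in range(n_workers)]))
          pi_estimate = 4.0 * hits / (n_workers * n_per_worker)
          print(f"pi ~ {pi_estimate:.4f}")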

  6. Publication and Protection of Sensitive Site Information in a Grid Infrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Cholia, Shreyas; Cholia, Shreyas; Porter, R. Jefferson

    2008-03-31

    In order to create a successful grid infrastructure, sites and resource providers must be able to publish information about their underlying resources and services. This information makes it easier for users and virtual organizations to make intelligent decisions about resource selection and scheduling, and can be used by the grid infrastructure for accounting and troubleshooting services. However, such an outbound stream may include data deemed sensitive by a resource-providing site, exposing potential security vulnerabilities or private user information to the world at large, including malicious entities. This study analyzes the various vectors of information being published from sites to grid infrastructures. In particular, it examines the data being published to, and collected by the Open Science Grid, including resource selection, monitoring, accounting, troubleshooting, logging and site verification data. We analyze the risks and potential threat models posed by the publication and collection of such data. We also offer some recommendations and best practices for sites and grid infrastructures to manage and protect sensitive data.
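
    A minimal sketch of the kind of filtering such an outbound publication stream might apply before site data leave the local boundary is shown below; the field names and redaction policy are invented for illustration and do not describe OSG's actual mechanism:

      # Hypothetical sanitizer applied to a site-information record before publication.
      SENSITIVE_FIELDS = {"internal_hostnames", "local_usernames", "service_versions"}

      def sanitize(record):
          published = {}
          for key, value in record.items():
              if key in SENSITIVE_FIELDS:
                  continue                                   # drop fields the site deems sensitive
              if key == "contact_email":
                  published[key] = "grid-admin@example.org"  # replace personal contact with a role address
              else:
                  published[key] = value
          return published

      if __name__ == "__main__":
          site_record = {
              "site_name": "EXAMPLE_SITE",
              "cpu_cores": 4096,
              "internal_hostnames": ["wn001.cluster.local", "wn002.cluster.local"],
              "local_usernames": ["alice", "bob"],
              "contact_email": "alice@example.org",
          }
          print(sanitize(site_record))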

  7. Publication and Protection of Sensitive Site Information in a Grid Infrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Cholia, Shreyas; Cholia, Shreyas; Porter, R. Jefferson

    2008-03-31

    In order to create a successful grid infrastructure, sites and resource providers must be able to publish information about their underlying resources and services. This information makes it easier for users and virtual organizations to make intelligent decisions about resource selection and scheduling, and can be used by the grid infrastructure for accounting and troubleshooting services. However, such an outbound stream may include data deemed sensitive by a resource-providing site, exposing potential security vulnerabilities or private user information to the world at large, including malicious entities. This study analyzes the various vectors of information being published from sites to grid infrastructures. In particular, it examines the data being published to, and collected by the Open Science Grid, including resource selection, monitoring, accounting, troubleshooting, logging and site verification data. We analyze the risks and potential threat models posed by the publication and collection of such data. We also offer some recommendations and best practices for sites and grid infrastructures to manage and protect sensitive data.

  8. RSW Modified Inflow Grid

    Data.gov (United States)

    National Aeronautics and Space Administration — After discussions by the organizing committee, and some research using the RSW grids, a modification has been made on the RSW grids. The inflow boundary has now been...

  9. HIRENASD coarse structured grid

    Data.gov (United States)

    National Aeronautics and Space Administration — blockstructured hexahedral grid, 6.7 mio elements, 24 degree minimum grid angle, CGNS format version 2.4, double precision Binary, Plot3D file Please contact...

  10. Pathological Image Information - Open TG-GATEs Pathological Image Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available: Pathological Image Information - description of data contents: information regarding each high-resolution digital image in the Pathological Images (SVS format) section. Data file name: open_tggates_pathological_image.zip; file URL: ftp://ftp.biosciencedbc.jp/archive/open-tggates-pathological-images/LATEST/... (see also .../view/open_tggates_pathological_image#en). Fields include the organ examined (Liver or Kidney) and FILE_LOCATION, the FTP server path to the pathological images.

  11. Defining an open access resource strategy for research libraries:Part Ⅲ—The strategies and practices of National Science Library

    Institute of Scientific and Technical Information of China (English)

    Xiaolin; ZHANG; Xiwen; LIU; Lin; LI; Yan; ZENG; Li-Ping; KU

    2012-01-01

    Purpose: This paper describes the strategies and practices of the National Science Library (NSL), Chinese Academy of Sciences (CAS), in promoting open access (OA) and developing OA resources. Design/methodology/approach: Multi-facet frameworks are devised to guide the development of OA strategies and practices. Key OA initiatives are briefly described along the main aspects of the strategies as they contribute to the implementation of the OA strategies. Findings: NSL defined its role as the Chief OA Officer for CAS and a key OA promoter for China. Accordingly, NSL has engaged on multiple fronts in promoting OA, including the development of OA strategies for CAS, the establishment of itself as an OA knowledge & promotion center, the development of the CAS IR system, and support for OA publishing by CAS authors. Research limitations: OA is still evolving, and so are the strategies and practices, as many actions are experimental and explorative in nature. Open books, open data, and open educational resources are yet to be covered. Comparative studies are needed, as is evidence-based impact analysis. Practical implications: Institutions can adopt, adapt, or compare with the examples and lessons learned described here. Originality/value: The multi-faceted frameworks, working principles, and lessons learned are based on NSL's practices, which can be valuable to overall OA development.

  12. Using CyberShake Workflows to Manage Big Seismic Hazard Data on Large-Scale Open-Science HPC Resources

    Science.gov (United States)

    Callaghan, S.; Maechling, P. J.; Juve, G.; Vahi, K.; Deelman, E.; Jordan, T. H.

    2015-12-01

    migrating from file-based communication to MPI messaging, to greatly reduce the I/O demands and node-hour requirements of CyberShake. We will also present performance metrics from CyberShake Study 15.4, and discuss challenges that producers of Big Data on open-science HPC resources face moving forward.

  13. Smart grid in China

    DEFF Research Database (Denmark)

    Sommer, Simon; Ma, Zheng; Jørgensen, Bo Nørregaard

    2015-01-01

    China is planning to transform its traditional power grid in favour of a smart grid, since it allows a more economically efficient and a more environmentally friendly transmission and distribution of electricity. Thus, a nationwide smart grid is likely to save tremendous amounts of resources...

  14. Geospatial Applications on Different Parallel and Distributed Systems in enviroGRIDS Project

    Science.gov (United States)

    Rodila, D.; Bacu, V.; Gorgan, D.

    2012-04-01

    The execution of Earth Science applications and services on parallel and distributed systems has become a necessity, especially due to the large amounts of Geospatial data these applications require and the large geographical areas they cover. The parallelization of these applications addresses important performance issues and can range from task parallelism to data parallelism. Parallel and distributed architectures such as Grid, Cloud, Multicore, etc. seem to offer the necessary functionalities to solve important problems in the Earth Science domain: storing, distribution, management, processing and security of Geospatial data, execution of complex processing through task and data parallelism, etc. A main goal of the FP7-funded project enviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is the development of a Spatial Data Infrastructure targeting this catchment region, but also the development of standardized and specialized tools for storing, analyzing, processing and visualizing the Geospatial data concerning this area. To achieve these objectives, enviroGRIDS deals with the execution of different Earth Science applications, such as hydrological models, Geospatial Web services standardized by the Open Geospatial Consortium (OGC), and others, on parallel and distributed architectures to maximize the obtained performance. This presentation analyses the integration and execution of Geospatial applications on different parallel and distributed architectures and the possibility of choosing among these architectures, based on application characteristics and user requirements, through a specialized component. Versions of the proposed platform have been used in the enviroGRIDS project on different use cases, such as the execution of Geospatial Web services on both Web and Grid infrastructures [2] and the execution of SWAT hydrological models on both Grid and Multicore architectures [3]. The current

  15. Five "Mainstays" of Power Grid Innovation

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Giving full play to the autonomous innovation capability of enterprises' research and development centers is an important reason why companies such as Microsoft, Bell and General Electric have always held a leading position in world science and technology. At the Power Grid Science and Technology Working Conference 2006, Mr. Lu Qizhou, vice president of the State Grid Corporation (SG), pointed out that during the 11th Five-Year Plan period the SG will emphatically push forward the optimal integration of scientific and technological resources among its affiliated research institutes, and tap their backbone and leading roles in the technical innovation system.

  16. Framework for Grid Manufacturing

    Institute of Scientific and Technical Information of China (English)

    陈笠; 邓宏; 邓倩妮; 吴振宇

    2004-01-01

    With the development of networked manufacturing, it is increasingly urgent to solve problems caused by inherent limitations of network technology, such as heterogeneity, collaboration collision, and decentralized control. This paper presents a framework for grid manufacturing, which neatly combines grid technology with the infrastructure of advanced manufacturing technology. The paper studies grid-oriented knowledge description and acquisition, and constructs a distributed knowledge grid model. The paper also deals with the protocol of node description in collaborative design, and describes a distributed collaborative design model. The protocol and node technology lead to a collaborative production model for grid manufacturing. The framework for grid manufacturing offers an effective and feasible solution for the problems of networked manufacturing. Grid manufacturing will become an advanced distributed manufacturing model and promote the development of advanced manufacturing technologies.

  17. The R package 'icosa' for coarse resolution global triangular and penta-hexagonal gridding

    Science.gov (United States)

    Kocsis, Adam T.

    2017-04-01

    With the development of the internet and the computational power of personal computers, open source programming environments have become indispensable for science in the past decade. This includes the growth of the GIS capabilities of the free R environment, which was originally developed for statistical analyses. The flexibility of R has made it a preferred programming tool in a multitude of disciplines across the biological and geological sciences. Many of these subdisciplines operate with incidence (occurrence) data that in a large number of cases must be coarsened before further analyses can be conducted. This coarsening is mostly carried out by gridding the data to cells of a Gaussian grid of various resolutions, in order to increase the density of data in a single unit of analysis. Despite the ease of its application, this method has obvious shortcomings: well-known systematic biases in cell sizes and shapes are introduced that can interfere with the results of statistical procedures, especially if the number of incidence points influences the metrics in question. The 'icosa' package employs a common method to overcome this obstacle by implementing grids with roughly equal cell sizes and shapes that are based on tessellated icosahedra. These grid objects are essentially polyhedra with xyz Cartesian vertex data that are linked to tables of faces and edges. At its current developmental stage, the package uses a single method of tessellation, which balances grid cell size and shape distortions, but its structure allows the implementation of various other tessellation algorithms. The resolution of the grids is set by the number of breakpoints inserted into a segment forming an edge of the original icosahedron. Both the triangular grids and their inverted penta-hexagonal grids can be created with the package. The package also incorporates functions to look up coordinates in the grid very efficiently, as well as data containers to link data to the grid structure. The
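
    A minimal Python sketch of the geometry implied by this resolution scheme (not a reimplementation of the R package's API): assuming each edge of the original icosahedron receives the stated number of breakpoints, the resulting cell, edge and vertex counts follow directly.

      def icosa_grid_sizes(breakpoints: int) -> dict:
          """Cell, edge and vertex counts for an icosahedron whose edges each
          carry the given number of breakpoints (illustrative only)."""
          f = breakpoints + 1                # each original edge is split into f segments
          triangular_cells = 20 * f ** 2     # faces of the tessellated icosahedron
          edges = 30 * f ** 2
          vertices = 10 * f ** 2 + 2         # satisfies Euler's formula V - E + F = 2
          # The inverted (dual) grid has one penta- or hexagonal cell per vertex,
          # twelve of which are pentagons (the vertices of the original icosahedron).
          return {"triangular_cells": triangular_cells,
                  "edges": edges,
                  "vertices": vertices,
                  "penta_hexagonal_cells": vertices}

      if __name__ == "__main__":
          for b in (0, 1, 3, 9):
              print(b, icosa_grid_sizes(b))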

  18. Translating Biotechnology to Knowledge-Based Innovation, Peace, and Development? Deploy a Science Peace Corps—An Open Letter to World Leaders

    Science.gov (United States)

    Abou-Zeid, Alaa H.; Ağırbaşlı, Mehmet; Akintola, Simisola O.; Aynacıoğlu, Şükrü; Bayram, Mustafa; Bragazzi, Nicola Luigi; Dandara, Collet; Dereli, Türkay; Dove, Edward S.; Elbeyli, Levent; Endrenyi, Laszlo; Erciyas, Kamile; Faris, Jack; Ferguson, Lynnette R.; Göğüş, Fahrettin; Güngör, Kıvanç; Gürsoy, Mervi; Gürsoy, Ulvi K.; Karaömerlioğlu, M. Asım; Kickbusch, Ilona; Kılıç, Türker; Kılınç, Metin; Kocagöz, Tanıl; Lin, Biaoyang; LLerena, Adrián; Manolopoulos, Vangelis G.; Nair, Bipin; Özkan, Bülent; Pang, Tikki; Şardaş, Semra; Srivastava, Sanjeeva; Toraman, Cengiz; Üstün, Kemal; Warnich, Louise; Wonkam, Ambroise; Yakıcıer, Mustafa Cengiz; Yaşar, Ümit

    2014-01-01

    , “nearly all men can stand adversity, but if you want to test a man's character, give him power.” We therefore petition President Barack Obama, other world leaders, and international development agencies in positions of power around the globe, to consider deploying a Science Peace Corps to cultivate the essential (and presently missing) ties among life sciences, foreign policy, development, and peace agendas. A Science Peace Corps requires support by a credible and independent intergovernmental organization or development agency for funding, and arbitration in the course of volunteer work when the global versus local (glocal) value-based priorities and human rights intersect in synergy or conflict. In all, Science Peace Corps is an invitation to a new pathway for competence in 21st century science that is locally productive and globally competitive. It can open up scientific institutions to broader considerations and broader inputs, and thus cultivate vital translational science in a world sorely in need of solidarity and sustainable responses to the challenges of 21st century science and society. “Let me say in conclusion, this University is not maintained by its alumni, or by the state, merely to help its graduates have an economic advantage in the life struggle. There is certainly a greater purpose, and I'm sure you recognize it. Therefore, I do not apologize for asking for your support in this campaign.” President John F. Kennedy On the occasion of the Peace Corps Campaign, On the steps of the University of Michigan Union PMID:24955641

  19. Translating biotechnology to knowledge-based innovation, peace, and development? Deploy a Science Peace Corps--an open letter to world leaders.

    Science.gov (United States)

    Hekim, Nezih; Coşkun, Yavuz; Sınav, Ahmet; Abou-Zeid, Alaa H; Ağırbaşlı, Mehmet; Akintola, Simisola O; Aynacıoğlu, Şükrü; Bayram, Mustafa; Bragazzi, Nicola Luigi; Dandara, Collet; Dereli, Türkay; Dove, Edward S; Elbeyli, Levent; Endrenyi, Laszlo; Erciyas, Kamile; Faris, Jack; Ferguson, Lynnette R; Göğüş, Fahrettin; Güngör, Kıvanç; Gürsoy, Mervi; Gürsoy, Ulvi K; Karaömerlioğlu, M Asım; Kickbusch, Ilona; Kılıç, Türker; Kılınç, Metin; Kocagöz, Tanıl; Lin, Biaoyang; LLerena, Adrián; Manolopoulos, Vangelis G; Nair, Bipin; Özkan, Bülent; Pang, Tikki; Sardaş, Şemra; Srivastava, Sanjeeva; Toraman, Cengiz; Üstün, Kemal; Warnich, Louise; Wonkam, Ambroise; Yakıcıer, Mustafa Cengiz; Yaşar, Ümit; Özdemir, Vural

    2014-07-01

    all men can stand adversity, but if you want to test a man's character, give him power." We therefore petition President Barack Obama, other world leaders, and international development agencies in positions of power around the globe, to consider deploying a Science Peace Corps to cultivate the essential (and presently missing) ties among life sciences, foreign policy, development, and peace agendas. A Science Peace Corps requires support by a credible and independent intergovernmental organization or development agency for funding, and arbitration in the course of volunteer work when the global versus local (glocal) value-based priorities and human rights intersect in synergy or conflict. In all, Science Peace Corps is an invitation to a new pathway for competence in 21st century science that is locally productive and globally competitive. It can open up scientific institutions to broader considerations and broader inputs, and thus cultivate vital translational science in a world sorely in need of solidarity and sustainable responses to the challenges of 21st century science and society.

  20. National power grid simulation capability: need and issues

    Energy Technology Data Exchange (ETDEWEB)

    Petri, Mark C. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2009-06-02

    On December 9 and 10, 2008, the Department of Homeland Security (DHS) Science and Technology Directorate sponsored a national workshop at Argonne National Laboratory to explore the need for a comprehensive modeling and simulation capability for the national electric power grid system. The workshop brought together leading electric power grid experts from federal agencies, the national laboratories, and academia to discuss the current state of power grid science and engineering and to assess if important challenges are being met. The workshop helped delineate gaps between grid needs and current capabilities and identify issues that must be addressed if a solution is to be implemented. This report is a result of the workshop and highlights power grid modeling and simulation needs, the barriers that must be overcome to address them, and the benefits of a national power grid simulation capability.