WorldWideScience

Sample records for open science grid

  1. The open science grid

    International Nuclear Information System (INIS)

    Pordes, R.

    2004-01-01

    The U.S. LHC Tier-1 and Tier-2 laboratories and universities are developing production Grids to support LHC applications running across a worldwide Grid computing system. Together with partners in computer science, physics grid projects and active experiments, we will build a common national production grid infrastructure which is open in its architecture, implementation and use. The Open Science Grid (OSG) model builds upon the successful approach of last year's joint Grid2003 project. The Grid3 shared infrastructure has for over eight months provided significant computational resources and throughput to a range of applications, including ATLAS and CMS data challenges, SDSS, LIGO, and biology analyses, and computer science demonstrators and experiments. To move towards LHC-scale data management, access and analysis capabilities, we must increase the scale, services, and sustainability of the current infrastructure by an order of magnitude or more, achieving a significant upgrade in its functionality and technologies. The initial OSG partners will build upon a fully usable, sustainable and robust grid; they include the US LHC collaborations, DOE and NSF laboratories and universities, and the Trillium Grid projects. The approach is to federate with other application communities in the U.S. to build a shared infrastructure open to other sciences and capable of being modified and improved to respond to the needs of other applications, including the CDF, D0, BaBar, and RHIC experiments. We describe the application-driven, engineered services of the OSG, short-term plans and status, and the roadmap for a consortium, its partnerships and national focus.

  2. The Open Science Grid

    Energy Technology Data Exchange (ETDEWEB)

    Pordes, Ruth; /Fermilab; Kramer, Bill; Olson, Doug; / /LBL, Berkeley; Livny, Miron; Roy, Alain; /Wisconsin U., Madison; Avery, Paul; /Florida U.; Blackburn, Kent; /Caltech; Wenaus, Torre; /Brookhaven; Wurthwein, Frank; /UC, San Diego; Gardner, Rob; Wilde, Mike; /Chicago U. /Indiana U.

    2007-06-01

    The Open Science Grid (OSG) provides a distributed facility where the Consortium members provide guaranteed and opportunistic access to shared computing and storage resources. OSG supports and evolves the infrastructure through activities that cover operations, security, software, troubleshooting, the addition of new capabilities, support for existing communities, and engagement with new ones. The OSG SciDAC-2 project provides specific activities to manage and evolve the distributed infrastructure and support its use. The innovative aspects of the project are the maintenance and performance of a collaborative (shared and common) petascale national facility over tens of autonomous computing sites, for many hundreds of users, transferring terabytes of data a day, executing tens of thousands of jobs a day, and providing robust and usable resources for scientific groups of all types and sizes. More information can be found at the OSG web site: www.opensciencegrid.org.

  3. New science on the Open Science Grid

    Energy Technology Data Exchange (ETDEWEB)

    Pordes, R; Altunay, M; Sehgal, C [Fermi National Accelerator Laboratory, Batavia, IL 60510 (United States); Avery, P [University of Florida, Gainesville, FL 32611 (United States); Bejan, A; Gardner, R; Wilde, M [University of Chicago, Chicago, IL 60607 (United States); Blackburn, K [California Institute of Technology, Pasadena, CA 91125 (United States); Blatecky, A; McGee, J [Renaissance Computing Institute, Chapel Hill, NC 27517 (United States); Kramer, B; Olson, D; Roy, A [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Livny, M [University of Wisconsin, Madison, Madison, WI 53706 (United States); Potekhin, M; Quick, R; Wenaus, T [Indiana University, Bloomington, IN 47405 (United States); Wuerthwein, F [University of California, San Diego, La Jolla, CA 92093 (United States)], E-mail: ruth@fnal.gov

    2008-07-15

    The Open Science Grid (OSG) includes work to enable new science, new scientists, and new modalities in support of computationally based research. The transformation from existing to new ways of working frequently requires significant sociological and organizational changes. OSG leverages its deliverables to the large-scale physics experiment member communities to benefit new communities at all scales through activities in education, engagement, and the distributed facility. This paper gives both a brief general description and specific examples of new science enabled on the OSG. More information is available at the OSG web site: www.opensciencegrid.org.

  4. New science on the Open Science Grid

    International Nuclear Information System (INIS)

    Pordes, R; Altunay, M; Sehgal, C; Avery, P; Bejan, A; Gardner, R; Wilde, M; Blackburn, K; Blatecky, A; McGee, J; Kramer, B; Olson, D; Roy, A; Livny, M; Potekhin, M; Quick, R; Wenaus, T; Wuerthwein, F

    2008-01-01

    The Open Science Grid (OSG) includes work to enable new science, new scientists, and new modalities in support of computationally based research. The transformation from existing to new ways of working frequently requires significant sociological and organizational changes. OSG leverages its deliverables to the large-scale physics experiment member communities to benefit new communities at all scales through activities in education, engagement, and the distributed facility. This paper gives both a brief general description and specific examples of new science enabled on the OSG. More information is available at the OSG web site: www.opensciencegrid.org.

  5. Enabling Campus Grids with Open Science Grid Technology

    International Nuclear Information System (INIS)

    Weitzel, Derek; Fraser, Dan; Pordes, Ruth; Bockelman, Brian; Swanson, David

    2011-01-01

    The Open Science Grid is a recognized key component of the US national cyber-infrastructure enabling scientific discovery through advanced high throughput computing. The principles and techniques that underlie the Open Science Grid can also be applied to Campus Grids since many of the requirements are the same, even if the implementation technologies differ. We find five requirements for a campus grid: trust relationships, job submission, resource independence, accounting, and data management. The Holland Computing Center's campus grid at the University of Nebraska-Lincoln was designed to fulfill the requirements of a campus grid. A bridging daemon was designed to bring non-Condor clusters into a grid managed by Condor. Condor features which make it possible to bridge Condor sites into a multi-campus grid have been exploited at the Holland Computing Center as well.
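The bridging-daemon idea described above can be sketched in a few lines: a loop that drains idle jobs from a Condor-managed queue, hands them to a foreign (non-Condor) cluster, and records usage per owner, touching three of the five requirements (trust, job submission, accounting). This is an illustrative toy, not the Holland Computing Center implementation; every class and method name here is invented.

```python
# Hypothetical sketch of a campus-grid "bridging daemon": drain idle jobs
# from a Condor-style queue and resubmit them to a non-Condor cluster.
# All names are illustrative, not the actual HCC tool.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Job:
    job_id: str
    owner: str          # trust relationship: who is accountable for the job
    executable: str
    status: str = "idle"

@dataclass
class BridgeDaemon:
    """Moves idle jobs from a Condor-managed queue onto a foreign cluster."""
    condor_queue: List[Job] = field(default_factory=list)
    foreign_cluster: List[Job] = field(default_factory=list)
    accounting: dict = field(default_factory=dict)  # per-owner usage records

    def poll(self) -> int:
        """Bridge every idle job; return how many were moved."""
        moved = 0
        for job in self.condor_queue:
            if job.status == "idle":
                job.status = "bridged"
                self.foreign_cluster.append(job)
                # accounting: charge usage to the submitting owner
                self.accounting[job.owner] = self.accounting.get(job.owner, 0) + 1
                moved += 1
        return moved

daemon = BridgeDaemon(condor_queue=[Job("1.0", "alice", "sim.sh"),
                                    Job("2.0", "bob", "fit.sh")])
print(daemon.poll())          # 2 jobs bridged
print(daemon.accounting)      # {'alice': 1, 'bob': 1}
```

In the real system this role is played by Condor itself (e.g. its flocking and glidein mechanisms); the sketch only shows why a single bridging component can satisfy several campus-grid requirements at once.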

  6. Enabling campus grids with open science grid technology

    Energy Technology Data Exchange (ETDEWEB)

    Weitzel, Derek [Nebraska U.; Bockelman, Brian [Nebraska U.; Swanson, David [Nebraska U.; Fraser, Dan [Argonne; Pordes, Ruth [Fermilab

    2011-01-01

    The Open Science Grid is a recognized key component of the US national cyber-infrastructure enabling scientific discovery through advanced high throughput computing. The principles and techniques that underlie the Open Science Grid can also be applied to Campus Grids since many of the requirements are the same, even if the implementation technologies differ. We find five requirements for a campus grid: trust relationships, job submission, resource independence, accounting, and data management. The Holland Computing Center's campus grid at the University of Nebraska-Lincoln was designed to fulfill the requirements of a campus grid. A bridging daemon was designed to bring non-Condor clusters into a grid managed by Condor. Condor features which make it possible to bridge Condor sites into a multi-campus grid have been exploited at the Holland Computing Center as well.

  7. The Open Science Grid status and architecture

    Energy Technology Data Exchange (ETDEWEB)

    Pordes, Ruth; Petravick, Don; /Fermilab; Kramer, Bill; Olsen, James D.; /LBL, Berkeley; Livny, Miron; Roy, Gordon A.; /Wisconsin U., Madison; Avery, Paul Ralph; /Florida U.; Blackburn, Kent; /Caltech; Wenaus, Torre J.; /Brookhaven; Wuerthwein, Frank K.; /UC, San Diego; Foster, Ian; /Chicago U. /Indiana U.

    2007-09-01

    The Open Science Grid (OSG) provides a distributed facility where the Consortium members provide guaranteed and opportunistic access to shared computing and storage resources. The OSG project[1] is funded by the National Science Foundation and the Department of Energy Scientific Discovery through Advanced Computing program. The OSG project provides specific activities for the operation and evolution of the common infrastructure. The US ATLAS and US CMS collaborations contribute to and depend on OSG as the US infrastructure contributing to the World Wide LHC Computing Grid, on which the LHC experiments distribute and analyze their data. Other stakeholders include the STAR RHIC experiment, the Laser Interferometer Gravitational-Wave Observatory (LIGO), the Dark Energy Survey (DES), and several Fermilab Tevatron experiments (CDF, D0, MiniBooNE, etc.). The OSG implementation architecture brings a pragmatic approach to enabling vertically integrated, community-specific distributed systems over a common horizontal set of shared resources and services. More information can be found at the OSG web site: www.opensciencegrid.org.

  8. The Open Science Grid status and architecture

    International Nuclear Information System (INIS)

    Pordes, R; Petravick, D; Kramer, B; Olson, D; Livny, M; Roy, A; Avery, P; Blackburn, K; Wenaus, T; Wuerthwein, F; Foster, I; Gardner, R; Wilde, M; Blatecky, A; McGee, J; Quick, R

    2008-01-01

    The Open Science Grid (OSG) provides a distributed facility where the Consortium members provide guaranteed and opportunistic access to shared computing and storage resources. The OSG project[1] is funded by the National Science Foundation and the Department of Energy Scientific Discovery through Advanced Computing program. The OSG project provides specific activities for the operation and evolution of the common infrastructure. The US ATLAS and US CMS collaborations contribute to and depend on OSG as the US infrastructure contributing to the World Wide LHC Computing Grid, on which the LHC experiments distribute and analyze their data. Other stakeholders include the STAR RHIC experiment, the Laser Interferometer Gravitational-Wave Observatory (LIGO), the Dark Energy Survey (DES), and several Fermilab Tevatron experiments (CDF, D0, MiniBooNE, etc.). The OSG implementation architecture brings a pragmatic approach to enabling vertically integrated, community-specific distributed systems over a common horizontal set of shared resources and services. More information can be found at the OSG web site: www.opensciencegrid.org.

  9. Public storage for the Open Science Grid

    International Nuclear Information System (INIS)

    Levshina, T; Guru, A

    2014-01-01

    The Open Science Grid infrastructure does not provide efficient means to manage the public storage offered by participating sites. A Virtual Organization that relies on opportunistic storage has difficulties finding appropriate storage, verifying its availability, and monitoring its utilization. The involvement of the production manager, site administrators, and VO support personnel is required to allocate or rescind storage space. One of the main requirements for a Public Storage implementation is that it should use the SRM or GridFTP protocols to access the Storage Elements provided by the OSG sites and not put any additional burden on sites: by policy, no new services related to Public Storage can be installed and run on OSG sites. Opportunistic users also have difficulties in accessing the OSG Storage Elements during the execution of jobs. A typical user's data-management workflow includes pre-staging common data on sites before a job's execution, and then storing the output data produced by a job on a worker node for subsequent download to the user's local institution. When the amount of data is significant, the only means to temporarily store the data is to upload it to one of the Storage Elements. In order to do that, a user's job should be aware of the storage location, availability, and free space. After a successful data upload, users must somehow keep track of the data's location for future access. In this presentation we propose solutions for storage management and data handling issues in the OSG. We are investigating the feasibility of using the integrated Rule-Oriented Data System (iRODS) developed at RENCI as a front-end service to the OSG SEs. The current architecture, state of deployment, and performance test results will be discussed. We will also provide examples of current usage of the system by beta users.
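The two pain points in the workflow above (picking a Storage Element with enough free space, and remembering where a file went) are exactly what a front-end catalog service addresses. A minimal sketch, with hypothetical SE names, sizes, and path convention, none of which come from the actual OSG or iRODS deployment:

```python
# Illustrative sketch (not the actual OSG/iRODS service): a tiny catalog
# that selects a storage element with enough free space and records the
# file's location for later retrieval. All names/numbers are hypothetical.

storage_elements = {
    "se1.example.edu": {"free_gb": 50},
    "se2.example.edu": {"free_gb": 500},
}
catalog = {}  # logical file name -> (storage element, remote path)

def upload(lfn: str, size_gb: float) -> str:
    """Pick an SE with room, 'upload' there, and record the location."""
    for se, info in storage_elements.items():
        if info["free_gb"] >= size_gb:
            info["free_gb"] -= size_gb
            catalog[lfn] = (se, f"/public/{lfn}")
            return se
    raise RuntimeError(f"no storage element has {size_gb} GB free")

def locate(lfn: str):
    """Later retrieval: where did the job's output go?"""
    return catalog[lfn]

upload("run42/output.root", 120)    # too big for se1, lands on se2
print(locate("run42/output.root"))  # ('se2.example.edu', '/public/run42/output.root')
```

A real implementation would speak SRM or GridFTP to the SEs instead of mutating a dictionary, but the catalog logic (match free space, record location) is the part the abstract says is missing today.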

  10. Public storage for the Open Science Grid

    Science.gov (United States)

    Levshina, T.; Guru, A.

    2014-06-01

    The Open Science Grid infrastructure does not provide efficient means to manage the public storage offered by participating sites. A Virtual Organization that relies on opportunistic storage has difficulties finding appropriate storage, verifying its availability, and monitoring its utilization. The involvement of the production manager, site administrators, and VO support personnel is required to allocate or rescind storage space. One of the main requirements for a Public Storage implementation is that it should use the SRM or GridFTP protocols to access the Storage Elements provided by the OSG sites and not put any additional burden on sites: by policy, no new services related to Public Storage can be installed and run on OSG sites. Opportunistic users also have difficulties in accessing the OSG Storage Elements during the execution of jobs. A typical user's data-management workflow includes pre-staging common data on sites before a job's execution, and then storing the output data produced by a job on a worker node for subsequent download to the user's local institution. When the amount of data is significant, the only means to temporarily store the data is to upload it to one of the Storage Elements. In order to do that, a user's job should be aware of the storage location, availability, and free space. After a successful data upload, users must somehow keep track of the data's location for future access. In this presentation we propose solutions for storage management and data handling issues in the OSG. We are investigating the feasibility of using the integrated Rule-Oriented Data System (iRODS) developed at RENCI as a front-end service to the OSG SEs. The current architecture, state of deployment, and performance test results will be discussed. We will also provide examples of current usage of the system by beta users.

  11. Migrating Open Science Grid to RPMs

    International Nuclear Information System (INIS)

    Roy, Alain

    2012-01-01

    We recently completed a significant transition in the Open Science Grid (OSG) in which we moved our software distribution mechanism from the useful but niche system called Pacman to a community-standard native package system, RPM. In this paper we explore some of the lessons learned during this transition as well as our earlier work, lessons that we believe are valuable not only for software distribution and packaging, but also for software engineering in a distributed computing environment where reliability is critical. We discuss the benefits found in moving to a community standard, including the abilities to reuse existing packaging, to donate existing packaging back to the community, and to leverage existing skills in the community. We describe our approach to testing in which we test our software against multiple versions of the OS, including pre-releases of the OS, in order to find surprises before our users do. Finally, we discuss our large-scale evaluation testing and community testing, which are essential for both quality and community acceptance.

  12. DZero data-intensive computing on the Open Science Grid

    International Nuclear Information System (INIS)

    Abbott, B.; Baranovski, A.; Diesburg, M.; Garzoglio, G.; Kurca, T.; Mhashilkar, P.

    2007-01-01

    High energy physics experiments periodically reprocess data in order to take advantage of an improved understanding of the detector and the data processing code. Between February and May 2007, the DZero experiment reprocessed a substantial fraction of its dataset: half a billion events, corresponding to about 100 TB of data organized in 300,000 files. The activity utilized resources from sites around the world, including a dozen sites participating in the Open Science Grid consortium (OSG). About 1,500 jobs were run every day across the OSG, consuming and producing hundreds of gigabytes of data. Access to OSG computing and storage resources was coordinated by the SAM-Grid system, which organized job access to a complex topology of data queues and scheduled jobs to clusters using a SAM-Grid to OSG job forwarding infrastructure. For the first time in the lifetime of the experiment, a data-intensive production activity was managed on a general-purpose grid such as OSG. This paper describes the implications of using OSG, where all resources are granted following an opportunistic model, the challenges of operating a data-intensive activity over such a large computing infrastructure, and the lessons learned throughout the project.
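The figures quoted above (half a billion events, ~100 TB, 300,000 files) imply a characteristic file and event size, which is worth checking, since file size drives the data-queue and transfer topology the abstract mentions. A back-of-the-envelope calculation, assuming 1 TB = 10^12 bytes:

```python
# Sanity-check the dataset figures quoted in the abstract.
# Assumes decimal units: 1 TB = 10**12 bytes.

events = 500_000_000        # half a billion events
data_bytes = 100 * 10**12   # ~100 TB
files = 300_000

avg_file_mb = data_bytes / files / 10**6
events_per_file = events / files
avg_event_kb = data_bytes / events / 10**3

print(round(avg_file_mb))      # ~333 MB per file
print(round(events_per_file))  # ~1667 events per file
print(round(avg_event_kb))     # ~200 kB per event
```

So each of the ~1,500 daily jobs handled files of a few hundred megabytes, consistent with the "hundreds of gigabytes of data a day" throughput stated in the abstract.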

  13. DZero data-intensive computing on the Open Science Grid

    International Nuclear Information System (INIS)

    Abbott, B; Baranovski, A; Diesburg, M; Garzoglio, G; Mhashilkar, P; Kurca, T

    2008-01-01

    High energy physics experiments periodically reprocess data in order to take advantage of an improved understanding of the detector and the data processing code. Between February and May 2007, the DZero experiment reprocessed a substantial fraction of its dataset: half a billion events, corresponding to about 100 TB of data organized in 300,000 files. The activity utilized resources from sites around the world, including a dozen sites participating in the Open Science Grid consortium (OSG). About 1,500 jobs were run every day across the OSG, consuming and producing hundreds of gigabytes of data. Access to OSG computing and storage resources was coordinated by the SAM-Grid system, which organized job access to a complex topology of data queues and scheduled jobs to clusters using a SAM-Grid to OSG job forwarding infrastructure. For the first time in the lifetime of the experiment, a data-intensive production activity was managed on a general-purpose grid such as OSG. This paper describes the implications of using OSG, where all resources are granted following an opportunistic model, the challenges of operating a data-intensive activity over such a large computing infrastructure, and the lessons learned throughout the project.

  14. Automatic Integration Testbeds validation on Open Science Grid

    Science.gov (United States)

    Caballero, J.; Thapa, S.; Gardner, R.; Potekhin, M.

    2011-12-01

    A recurring challenge in deploying high-quality production middleware is the extent to which realistic testing occurs before release of the software into the production environment. We describe here an automated system for validating releases of the Open Science Grid software stack that leverages the (pilot-based) PanDA job management system developed and used by the ATLAS experiment. The system was motivated by a desire to subject the OSG Integration Testbed to more realistic validation tests, in particular tests that resemble as closely as possible the actual job workflows used by the experiments, thus exercising job scheduling at the compute element (CE), the worker-node execution environment, the transfer of data to/from the local storage element (SE), and so on. In this context, candidate releases of OSG compute and storage elements can be tested by injecting large numbers of synthetic jobs varying in complexity and coverage of services tested. The native capabilities of the PanDA system can thus be used to define jobs, monitor their execution, and archive the resulting run statistics, including success and failure modes. A repository of generic workflows and job types to measure various metrics of interest has been created. A command-line toolset has been developed so that testbed managers can quickly submit "VO-like" jobs into the system when newly deployed services are ready for testing. A system for automatic submission has been crafted to send jobs to integration testbed sites, collecting the results in a central service and generating regular reports on performance and reliability.
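The core loop of such a validation system (inject synthetic jobs, each exercising one service, then aggregate success/failure per service) can be sketched briefly. The job definitions and checks below are invented for illustration; the real system defines and drives the jobs through PanDA pilots rather than local callables.

```python
# Hedged sketch of the validation idea: synthetic jobs each exercise one
# service of a candidate release (CE scheduling, worker-node environment,
# SE transfer); results are summarized per service. Checks are stubbed.

synthetic_jobs = [
    {"service": "CE", "check": lambda: True},   # e.g. job landed on the batch system
    {"service": "WN", "check": lambda: True},   # e.g. expected environment present
    {"service": "SE", "check": lambda: False},  # e.g. stage-out to storage failed
]

def run_testbed(jobs):
    """Run every synthetic job; return {service: (passed, total)}."""
    report = {}
    for job in jobs:
        ok = job["check"]()
        passed, total = report.get(job["service"], (0, 0))
        report[job["service"]] = (passed + int(ok), total + 1)
    return report

print(run_testbed(synthetic_jobs))  # {'CE': (1, 1), 'WN': (1, 1), 'SE': (0, 1)}
```

The per-service pass/total pairs are what feeds the regular performance and reliability reports the abstract describes.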

  15. Automatic Integration Testbeds validation on Open Science Grid

    International Nuclear Information System (INIS)

    Caballero, J; Potekhin, M; Thapa, S; Gardner, R

    2011-01-01

    A recurring challenge in deploying high-quality production middleware is the extent to which realistic testing occurs before release of the software into the production environment. We describe here an automated system for validating releases of the Open Science Grid software stack that leverages the (pilot-based) PanDA job management system developed and used by the ATLAS experiment. The system was motivated by a desire to subject the OSG Integration Testbed to more realistic validation tests, in particular tests that resemble as closely as possible the actual job workflows used by the experiments, thus exercising job scheduling at the compute element (CE), the worker-node execution environment, the transfer of data to/from the local storage element (SE), and so on. In this context, candidate releases of OSG compute and storage elements can be tested by injecting large numbers of synthetic jobs varying in complexity and coverage of services tested. The native capabilities of the PanDA system can thus be used to define jobs, monitor their execution, and archive the resulting run statistics, including success and failure modes. A repository of generic workflows and job types to measure various metrics of interest has been created. A command-line toolset has been developed so that testbed managers can quickly submit 'VO-like' jobs into the system when newly deployed services are ready for testing. A system for automatic submission has been crafted to send jobs to integration testbed sites, collecting the results in a central service and generating regular reports on performance and reliability.

  16. Automated conversion of Docker images to CVMFS for LIGO and the Open Science Grid

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    In this lightning talk, I will discuss the development of a webhook-based tool for automatically converting Docker images from DockerHub and private registries to CVMFS filesystems. The tool is highly reliant on previous work by the Open Science Grid for scripted nightly conversion of images from DockerHub.
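The shape of such a webhook-driven converter can be sketched as follows. The payload fields mimic a Docker Hub push event; the conversion step is stubbed out (a real tool would flatten the image layers into a CVMFS repository), and the target-path convention shown is illustrative rather than the exact layout used in production.

```python
# Sketch only: a webhook handler that queues Docker images for
# conversion to CVMFS. Payload fields follow the Docker Hub push-event
# shape; the flattening of image layers is left as a stub.

import json

conversion_queue = []

def handle_webhook(body: str) -> str:
    """Parse a (Docker Hub-style) push payload and queue the image."""
    payload = json.loads(body)
    repo = payload["repository"]["repo_name"]
    tag = payload["push_data"]["tag"]
    # One flattened directory per image:tag under the OSG singularity
    # repository (path convention shown here is illustrative).
    target = f"/cvmfs/singularity.opensciencegrid.org/{repo}:{tag}"
    conversion_queue.append(target)
    return target

event = json.dumps({"repository": {"repo_name": "ligo/lalsuite"},
                    "push_data": {"tag": "v7.0"}})
print(handle_webhook(event))  # /cvmfs/singularity.opensciencegrid.org/ligo/lalsuite:v7.0
```

The webhook trigger is what distinguishes this tool from the earlier OSG approach of scripted nightly conversions: images appear in CVMFS shortly after they are pushed rather than on the next scheduled run.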

  17. Analysis of the current use, benefit, and value of the Open Science Grid

    International Nuclear Information System (INIS)

    Pordes, R; Weichel, J

    2010-01-01

    The Open Science Grid usage has ramped up more than 25% in the past twelve months, due both to increased throughput by the core stakeholders - US LHC, LIGO, and Run II - and to increased usage by non-physics communities. It is important to understand the value that collaborative projects such as the OSG contribute to the scientific community. Such an analysis must be cognizant of the environment of commercial cloud offerings, the evolving and maturing middleware for grid-based distributed computing, and the growing dependence of science and research on computation. We present a first categorization of OSG value, and an analysis across several different aspects of the Consortium's goals and activities. Lastly, we present some of the upcoming challenges of the LHC data analysis ramp-up and our ongoing contributions to the World Wide LHC Computing Grid.

  18. Analysis of the Current Use, Benefit, and Value of the Open Science Grid

    Energy Technology Data Exchange (ETDEWEB)

    Pordes, R.; /Fermilab

    2009-04-01

    The Open Science Grid usage has ramped up more than 25% in the past twelve months, due both to increased throughput by the core stakeholders - US LHC, LIGO, and Run II - and to increased usage by non-physics communities. It is important to understand the value that collaborative projects such as the OSG contribute to the scientific community. Such an analysis must be cognizant of the environment of commercial cloud offerings, the evolving and maturing middleware for grid-based distributed computing, and the growing dependence of science and research on computation. We present a first categorization of OSG value, and an analysis across several different aspects of the Consortium's goals and activities. Lastly, we present some of the upcoming challenges of the LHC data analysis ramp-up and our ongoing contributions to the World Wide LHC Computing Grid.

  19. The Open Science Grid – Support for Multi-Disciplinary Team Science – the Adolescent Years

    International Nuclear Information System (INIS)

    Bauerdick, Lothar; Ernst, Michael; Fraser, Dan; Livny, Miron; Pordes, Ruth; Sehgal, Chander; Würthwein, Frank

    2012-01-01

    As it enters adolescence, the Open Science Grid (OSG) is bringing a maturing fabric of Distributed High Throughput Computing (DHTC) services that supports an expanding HEP community to an increasingly diverse spectrum of domain scientists. Working closely with researchers on campuses throughout the US and in collaboration with national cyberinfrastructure initiatives, we transform their computing environment through new concepts, advanced tools, and deep experience. We discuss examples of these, including: the pilot-job overlay concepts and technologies now in use throughout OSG, delivering 1.4 million CPU-hours per day; the role of campus infrastructures, built out from concepts of sharing across multiple local faculty clusters (already put to good use by many of the US HEP Tier-2 sites); the work towards the use of clouds and access to high-throughput parallel (multi-core and GPU) compute resources; and the progress we are making towards meeting the data management and access needs of non-HEP communities with general tools derived from the experience of the parochial tools in HEP (integration of Globus Online, prototyping with iRODS, investigations into wide-area Lustre). We will also review our activities and experiences as an HTC Service Provider to the recently awarded NSF XD XSEDE project, the evolution of the US NSF TeraGrid project, and how we are extending the reach of HTC through this activity to the increasingly broad national cyberinfrastructure. We believe that a coordinated view of the HPC and HTC resources in the US will further expand their impact on scientific discovery.
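The 1.4 million CPU-hours per day figure quoted above has a simple interpretation as a sustained core count, which a quick calculation makes concrete (assuming fully utilized cores):

```python
# Interpret the quoted throughput as an equivalent number of cores
# running continuously (24 hours/day, full utilization assumed).

cpu_hours_per_day = 1_400_000
equivalent_cores = cpu_hours_per_day / 24
print(round(equivalent_cores))  # ~58333 cores busy around the clock
```

That is, the pilot-job overlay was keeping the equivalent of roughly 58,000 cores continuously busy across the facility.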

  20. The Open Science Grid - Support for Multi-Disciplinary Team Science - the Adolescent Years

    Science.gov (United States)

    Bauerdick, Lothar; Ernst, Michael; Fraser, Dan; Livny, Miron; Pordes, Ruth; Sehgal, Chander; Würthwein, Frank; Open Science Grid

    2012-12-01

    As it enters adolescence, the Open Science Grid (OSG) is bringing a maturing fabric of Distributed High Throughput Computing (DHTC) services that supports an expanding HEP community to an increasingly diverse spectrum of domain scientists. Working closely with researchers on campuses throughout the US and in collaboration with national cyberinfrastructure initiatives, we transform their computing environment through new concepts, advanced tools, and deep experience. We discuss examples of these, including: the pilot-job overlay concepts and technologies now in use throughout OSG, delivering 1.4 million CPU-hours per day; the role of campus infrastructures, built out from concepts of sharing across multiple local faculty clusters (already put to good use by many of the US HEP Tier-2 sites); the work towards the use of clouds and access to high-throughput parallel (multi-core and GPU) compute resources; and the progress we are making towards meeting the data management and access needs of non-HEP communities with general tools derived from the experience of the parochial tools in HEP (integration of Globus Online, prototyping with iRODS, investigations into wide-area Lustre). We will also review our activities and experiences as an HTC Service Provider to the recently awarded NSF XD XSEDE project, the evolution of the US NSF TeraGrid project, and how we are extending the reach of HTC through this activity to the increasingly broad national cyberinfrastructure. We believe that a coordinated view of the HPC and HTC resources in the US will further expand their impact on scientific discovery.

  21. The Open Science Grid – Support for Multi-Disciplinary Team Science – the Adolescent Years

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    As it enters adolescence, the Open Science Grid (OSG) is bringing a maturing fabric of Distributed High Throughput Computing (DHTC) services that supports an expanding HEP community to an increasingly diverse spectrum of domain scientists. Working closely with researchers on campuses throughout the US and in collaboration with national cyberinfrastructure initiatives, we transform their computing environment through new concepts, advanced tools, and deep experience. We discuss examples of these, including: the pilot-job overlay concepts and technologies now in use throughout OSG, delivering 1.4 million CPU-hours per day; the role of campus infrastructures, built out from concepts of sharing across multiple local faculty clusters (already put to good use by many of the US HEP Tier-2 sites); the work towards the use of clouds and access to high-throughput parallel (multi-core and GPU) compute resources; and the progress we are making towards meeting the data management and access needs of non-HEP communities...

  22. The CEDPS troubleshooting architecture and deployment on the open science grid

    International Nuclear Information System (INIS)

    Tierney, Brian L; Gunter, Dan; Schopf, Jennifer M

    2007-01-01

    Tracking failures and poor performance across a widely distributed system of resources has proven challenging for many ongoing DOE applications. An example is the Open Science Grid (OSG) project, which currently experiences a roughly 15% job failure rate. This can be an issue not only for Grid computing but for anyone performing large-scale data transfers to remote machines, because of the large number of interconnected components and services. As part of the Center for Enabling Distributed Petascale Science (CEDPS) project, we have been building an infrastructure that works with current middleware and existing system tools to more easily track failures and discover anomalous behavior. It consists of a common logging format, an extension of syslog-ng for centralized collection of data, a data summarizer to more easily manage the volume of logging, and an anomaly detection system that can feed a warning system when unexpected behaviors occur. We are currently working with OSG to deploy a prototype of the full system. The initial logs gathered will be used to extend the analysis tools and to increase the reliability of the services for the SciDAC end-user community.
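The pipeline described above (a common log format, a summarizer, and a threshold-based warning) can be illustrated in miniature. The simple name=value record format and the 10% alarm threshold below are assumptions made for the example, not the actual CEDPS format or policy; the abstract quotes a ~15% observed job failure rate as the motivating scale.

```python
# Illustrative sketch of the troubleshooting pipeline: parse a common
# (here, simple name=value) log format, summarize the failure rate, and
# compare it to an alarm threshold. Format and threshold are invented.

def parse(line: str) -> dict:
    """Parse one 'key=value key=value' log record into a dict."""
    return dict(field.split("=", 1) for field in line.split())

log = [
    "event=job.end status=0 host=ce01",
    "event=job.end status=0 host=ce02",
    "event=job.end status=1 host=ce01",  # nonzero status = failure
]

records = [parse(line) for line in log]
failures = sum(r["status"] != "0" for r in records)
rate = failures / len(records)

print(round(rate, 2))  # 0.33
print(rate > 0.10)     # True -> the warning system would be triggered
```

In the deployed system the records arrive via syslog-ng from many hosts and the summarizer works over far larger volumes, but the flow from structured record to rate to alarm is the same.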

  23. The event notification and alarm system for the Open Science Grid operations center

    Science.gov (United States)

    Hayashi, S.; Teige, S.; Quick, R.

    2012-12-01

    The Open Science Grid (OSG) Operations Team operates a distributed set of services and tools that enable the utilization of the OSG by several HEP projects. Without these services, users of the OSG would not be able to run jobs, locate resources, obtain information about the status of systems, or generally use the OSG. For this reason these services must be highly available. This paper describes the automated monitoring and notification systems used to diagnose and report problems. Described here are the means used by OSG Operations to monitor systems such as physical facilities, network operations, server health, service availability, and software error events. Once detected, an error condition generates a message sent via, for example, email, SMS, Twitter, or an instant-messaging server. The mechanism being developed to integrate these monitoring systems into a prioritized and configurable alarming system is emphasized.
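The "prioritized and configurable" routing the abstract emphasizes reduces to a mapping from alarm severity to notification channels. A minimal sketch, with severity levels and channel names invented for illustration:

```python
# Minimal sketch of prioritized, configurable alarm routing. Severity
# levels and channel names are illustrative; the real system fans out to
# email, SMS, Twitter, instant messaging, and so on.

ROUTING = {                       # configurable: severity -> channels
    "critical": ["sms", "email", "im"],
    "warning":  ["email"],
    "info":     [],               # recorded, but nobody is paged
}

def raise_alarm(severity: str, message: str):
    """Return the list of (channel, message) notifications to send."""
    return [(channel, message) for channel in ROUTING[severity]]

print(raise_alarm("critical", "CE unreachable"))
print(raise_alarm("info", "nightly report ok"))  # []
```

Keeping the routing in data rather than code is what makes the system configurable: operators can re-prioritize an alarm source by editing the table, without touching the monitors that raise the events.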


  6. Large Scale Monte Carlo Simulation of Neutrino Interactions Using the Open Science Grid and Commercial Clouds

    International Nuclear Information System (INIS)

    Norman, A.; Boyd, J.; Davies, G.; Flumerfelt, E.; Herner, K.; Mayer, N.; Mhashilhar, P.; Tamsett, M.; Timm, S.

    2015-01-01

    Modern long-baseline neutrino experiments, like the NOvA experiment at Fermilab, require large-scale, compute-intensive simulations of their neutrino beam fluxes and of backgrounds induced by cosmic rays. The amount of simulation required to keep the systematic uncertainties in the simulation from dominating the final physics results is often 10x to 100x that of the actual detector exposure. For the first physics results from NOvA this has meant the simulation of more than 2 billion cosmic ray events in the far detector and more than 200 million NuMI beam spill simulations. Performing simulation at these high statistics levels has been made possible for NOvA through the use of the Open Science Grid and through large-scale runs on commercial clouds like Amazon EC2. We detail the challenges in performing large-scale simulation in these environments and how the computing infrastructure for the NOvA experiment has been adapted to seamlessly support the running of different simulation and data processing tasks on these resources. (paper)

  7. GOC-TX: A Reliable Ticket Synchronization Application for the Open Science Grid

    Science.gov (United States)

    Hayashi, Soichi; Gopu, Arvind; Quick, Robert

    2011-12-01

    One of the major operational issues faced by large multi-institutional collaborations is permitting its users and support staff to use their native ticket tracking environment while also exchanging these tickets with collaborators. After several failed attempts at email-parser based ticket exchanges, the OSG Operations Group has designed a comprehensive ticket synchronizing application. The GOC-TX application uses web-service interfaces offered by various commercial, open source and other homegrown ticketing systems to synchronize tickets between two or more of these systems. GOC-TX operates independently from any ticketing system. It can be triggered by one ticketing system via email, active messaging, or a web-services call to check the current sync status, pull applicable updates made to the source ticket since the prior synchronization, and apply the updates to a destination ticket. The currently deployed production version of GOC-TX is able to synchronize tickets between the Numara Footprints ticketing system used by the OSG and the following systems: the European Grid Initiative's Global Grid User Support (GGUS) system and the Request Tracker (RT) system used by Brookhaven. Additional interfaces to the BMC Remedy system used by Fermilab, and to other instances of RT used by other OSG partners, are expected to be completed in summer 2010. A fully configurable open source version is expected to be made available by early autumn 2010. This paper will cover the structure of the GOC-TX application, its evolution, and the problems encountered by the OSG Operations group with ticket exchange within the OSG Collaboration.


  9. Open Science Grid (OSG) Ticket Synchronization: Keeping Your Home Field Advantage In A Distributed Environment

    International Nuclear Information System (INIS)

    Gross, Kyle; Hayashi, Soichi; Teige, Scott; Quick, Robert

    2012-01-01

    Large distributed computing collaborations, such as the Worldwide LHC Computing Grid (WLCG), face many issues when it comes to providing a working grid environment for their users. One of these is exchanging tickets between various ticketing systems in use by grid collaborations. Ticket systems such as Footprints, RT, Remedy, and ServiceNow all have different schema that must be addressed in order to provide a reliable exchange of information between support entities and users in different grid environments. To combat this problem, OSG Operations has created a ticket synchronization interface called GOC-TX that relies on web services instead of error-prone email parsing methods of the past. Synchronizing tickets between different ticketing systems allows any user or support entity to work on a ticket in their home environment, thus providing a familiar and comfortable place to provide updates without having to learn another ticketing system. The interface is built in a way that it is generic enough that it can be customized for nearly any ticketing system with a web-service interface with only minor changes. This allows us to be flexible and rapidly bring new ticket synchronization online. Synchronization can be triggered by different methods including mail, web services interface, and active messaging. GOC-TX currently interfaces with Global Grid User Support (GGUS) for WLCG, Remedy at Brookhaven National Lab (BNL), and Request Tracker (RT) at the Virtual Data Toolkit (VDT). Work is progressing on the Fermi National Accelerator Laboratory (FNAL) ServiceNow synchronization. This paper will explain the problems faced by OSG and how they led OSG to create and implement this ticket synchronization system along with the technical details that allow synchronization to be performed at a production level.
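The generic synchronization pattern GOC-TX embodies can be sketched in a few lines: an adapter per ticketing system exposes the same small interface, and each sync step pulls updates newer than the last sync point from the source and applies them to the destination. The interface below is a deliberate simplification, not the real GOC-TX code or the Footprints/GGUS/RT APIs.

```python
# Hypothetical sketch of the pull-then-apply ticket synchronization pattern.
class TicketSystem:
    """Minimal stand-in adapter for one ticketing system."""
    def __init__(self, name):
        self.name = name
        self.updates = []           # (timestamp, text) pairs

    def updates_since(self, ts):
        """Return updates made strictly after the given sync point."""
        return [(t, msg) for t, msg in self.updates if t > ts]

    def apply(self, t, msg):
        self.updates.append((t, msg))

def sync(source, dest, last_sync):
    """Copy updates made after last_sync; return the new sync point."""
    new = source.updates_since(last_sync)
    for t, msg in new:
        dest.apply(t, msg)
    return max((t for t, _ in new), default=last_sync)

footprints = TicketSystem("footprints")
ggus = TicketSystem("ggus")
footprints.apply(10, "user reports failed transfers")
footprints.apply(20, "site admin restarts gridftp")
checkpoint = sync(footprints, ggus, last_sync=10)
print(checkpoint, len(ggus.updates))  # 20 1
```

Because only the adapter differs per system, adding a new ticketing backend (Remedy, ServiceNow, another RT instance) leaves the sync logic unchanged, which is the flexibility the abstract describes.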


  11. Grid for Earth Science Applications

    Science.gov (United States)

    Petitdidier, Monique; Schwichtenberg, Horst

    2013-04-01

    Civil society at large has addressed many strong requirements to the Earth Science community, related in particular to natural and industrial risks, climate change and new energies. The critical point is that, on one hand, civil society and the public ask for certainties, i.e. precise values with small error ranges, for predictions at short, medium and long term in all domains; on the other hand, science can mainly answer only in terms of probability of occurrence. To improve the answers and/or decrease the uncertainties, (1) new observational networks have been deployed in order to have better geographical coverage, and more accurate measurements have been carried out in key locations and aboard satellites. Following the OECD recommendations on the openness of research and public-sector data, more and more data are available to academic organisations and SMEs; (2) new algorithms and methodologies have been developed to face the huge task of data processing and assimilation into simulations using new technologies and compute resources. Finally, our total knowledge about the complex Earth system is contained in models and measurements; how we put them together has to be managed cleverly. The technical challenge is to put together databases and computing resources to answer the Earth Science challenges. However, all these applications are computationally very intensive. Different compute solutions are available, and the choice depends on the characteristics of the application. One of them is the Grid, which is especially efficient for independent or embarrassingly parallel jobs such as statistical and parametric studies. Numerous applications in atmospheric chemistry, meteorology, seismology, hydrology, pollution, climate and biodiversity have been deployed successfully on Grids. In order to fulfil the requirements of risk management, several prototype applications have been deployed using OGC (Open Geospatial Consortium) components with Grid middleware. 
The Grid has permitted via a huge number of runs to
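The "embarrassingly parallel" parametric studies mentioned in this record map naturally onto independent grid jobs: each parameter set is one job with no communication between jobs, and results are gathered afterwards. The sketch below uses a local thread pool as a stand-in for grid submission; the model function and the parameter sweep are illustrative assumptions.

```python
# Hypothetical sketch: a parametric study as a set of independent jobs,
# with a local executor standing in for grid job submission.
from concurrent.futures import ThreadPoolExecutor

def run_model(params):
    """Stand-in for one independent simulation run."""
    rainfall, runoff_coeff = params
    return rainfall * runoff_coeff   # toy "model output"

# Parameter sweep: every combination is one independent job.
sweep = [(r, c) for r in (10, 20) for c in (0.3, 0.5)]
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(run_model, sweep))   # map preserves order
print(results)  # [3.0, 5.0, 6.0, 10.0]
```

On a real grid, each element of the sweep would be packaged as a job description and the gather step would collect output files, but the structure of the computation is the same.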

  12. Scalable Open Source Smart Grid Simulator (SGSim)

    DEFF Research Database (Denmark)

    Ebeid, Emad Samuel Malki; Jacobsen, Rune Hylsberg; Stefanni, Francesco

    2017-01-01

    This paper presents an open source smart grid simulator (SGSim). The simulator is based on the open source SystemC Network Simulation Library (SCNSL) and aims to model scalable smart grid applications. SGSim has been tested under different smart grid scenarios that contain hundreds of thousands of households...

  13. Sustaining and Extending the Open Science Grid: Science Innovation on a PetaScale Nationwide Facility (DE-FC02-06ER41436) SciDAC-2 Closeout Report

    Energy Technology Data Exchange (ETDEWEB)

    Livny, Miron [Univ. of Wisconsin, Madison, WI (United States); Shank, James [Boston Univ., MA (United States); Ernst, Michael [Brookhaven National Lab. (BNL), Upton, NY (United States); Blackburn, Kent [California Inst. of Technology (CalTech), Pasadena, CA (United States); Goasguen, Sebastien [Clemson Univ., SC (United States); Tuts, Michael [Columbia Univ., New York, NY (United States); Gibbons, Lawrence [Cornell Univ., Ithaca, NY (United States); Pordes, Ruth [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Sliz, Piotr [Harvard Medical School, Boston, MA (United States); Deelman, Ewa [Univ. of Southern California, Los Angeles, CA (United States). Information Sciences Inst.; Barnett, William [Indiana Univ., Bloomington, IN (United States); Olson, Doug [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); McGee, John [Univ. of North Carolina, Chapel Hill, NC (United States). Renaissance Computing Inst.; Cowles, Robert [SLAC National Accelerator Lab., Menlo Park, CA (United States); Wuerthwein, Frank [Univ. of California, San Diego, CA (United States); Gardner, Robert [Univ. of Chicago, IL (United States); Avery, Paul [Univ. of Florida, Gainesville, FL (United States); Wang, Shaowen [Univ. of Illinois, Champaign, IL (United States); Univ. of Iowa, Iowa City, IA (United States); Lincoln, David Swanson [Univ. of Nebraska, Lincoln, NE (United States)

    2015-02-11

    Under this SciDAC-2 grant the project’s goal was to stimulate new discoveries by providing scientists with effective and dependable access to an unprecedented national distributed computational facility: the Open Science Grid (OSG). We proposed to achieve this through the work of the Open Science Grid Consortium: a unique hands-on multi-disciplinary collaboration of scientists, software developers and providers of computing resources. Together the stakeholders in this consortium sustain and use a shared distributed computing environment that transforms simulation and experimental science in the US. The OSG consortium is an open collaboration that actively engages new research communities. We operate an open facility that brings together a broad spectrum of compute, storage, and networking resources and interfaces to other cyberinfrastructures, including the US XSEDE (previously TeraGrid) and Enabling Grids for E-sciencE (EGEE), as well as campus and regional grids. We leverage middleware provided by computer science groups, facility IT support organizations, and computing programs of application communities for the benefit of consortium members and the US national CI.

  14. Assessment of grid optimisation measures for the German transmission grid using open source grid data

    Science.gov (United States)

    Böing, F.; Murmann, A.; Pellinger, C.; Bruckmeier, A.; Kern, T.; Mongin, T.

    2018-02-01

    The expansion of capacities in the German transmission grid is a necessity for further integration of renewable energy sources into the electricity sector. In this paper, the grid optimisation measures ‘Overhead Line Monitoring’, ‘Power-to-Heat’ and ‘Demand Response in the Industry’ are evaluated and compared against conventional grid expansion for the year 2030. Initially, the methodical approach of the simulation model is presented and detailed descriptions of the grid model and the used grid data, which partly originates from open-source platforms, are provided. Further, this paper explains how ‘Curtailment’ and ‘Redispatch’ can be reduced by implementing grid optimisation measures and how the depreciation of economic costs can be determined considering construction costs. The developed simulations show that the conventional grid expansion is more efficient and implies more grid relieving effects than the evaluated grid optimisation measures.

  15. Open Science Training Handbook

    OpenAIRE

    Sonja Bezjak; April Clyburne-Sherin; Philipp Conzett; Pedro Fernandes; Edit Görögh; Kerstin Helbig; Bianca Kramer; Ignasi Labastida; Kyle Niemeyer; Fotis Psomopoulos; Tony Ross-Hellauer; René Schneider; Jon Tennant; Ellen Verbakel; Helene Brinken

    2018-01-01

    For a readable version of the book, please visit https://book.fosteropenscience.eu A group of fourteen authors came together in February 2018 at the TIB (German National Library of Science and Technology) in Hannover to create an open, living handbook on Open Science training. High-quality trainings are fundamental when aiming at a cultural change towards the implementation of Open Science principles. Teaching resources provide great support for Open Science instructors and trainers. The ...

  16. Opening science to the world; opening the world to science

    CERN Multimedia

    Andrew Purcell

    2015-01-01

    ‘Engaging the research community towards an Open Science Commons’ was the main theme of the European Grid Infrastructure (EGI) annual conference that was held in Lisbon from 18 to 22 May. At the conference, the EGI-Engage project was launched and the European Open Science Cloud was discussed.   Tiziana Ferrari, technical director of EGI.eu, speaks at the EGI Annual conference in Lisbon this year. The EGI-Engage project was launched during the opening session of the conference by Tiziana Ferrari, technical director of EGI.eu. This project, which has been funded through the EU’s Horizon 2020 Framework Programme for Research and Innovation, aims to accelerate progress towards the implementation of the Open Science Commons. It seeks to do so by expanding the capabilities of a European backbone of federated services for computing, storage, data, communication, knowledge and expertise, as well as related community-specific capabilities. ...

  17. Science Opens Doors

    Science.gov (United States)

    Smyth, Steve; Smyth, Jen

    2016-01-01

    Science Opens Doors is the creation of Clive Thompson of the Horners' Livery Company. The Science Opens Doors project philosophy is strongly based upon the King's College London ASPIRES project, which established that children like doing science in junior school (ages 7-11), but that by the age of 12-14 they are firmly against becoming scientists.…

  18. Trends in life science grid: from computing grid to knowledge grid

    Directory of Open Access Journals (Sweden)

    Konagaya Akihiko

    2006-12-01

    Background: Grid computing has great potential to become a standard cyberinfrastructure for the life sciences, which often require high-performance computing and large-scale data handling exceeding the computing capacity of a single institution. Results: This survey reviews the latest grid technologies from the viewpoints of the computing grid, data grid and knowledge grid. Computing grid technologies have matured enough to solve high-throughput real-world life-science problems. Data grid technologies are strong candidates for realizing a "resourceome" for bioinformatics. Knowledge grids should be designed not only for sharing explicit knowledge on computers but also for community formation to share tacit knowledge within a community. Conclusion: Extending the concept of the grid from computing grid to knowledge grid makes it possible to use a grid not only as sharable computing resources, but also as a time and place in which people work together, create knowledge, and share knowledge and experiences in a community.

  19. Open Education and the Open Science Economy

    Science.gov (United States)

    Peters, Michael A.

    2009-01-01

    Openness as a complex code word for a variety of digital trends and movements has emerged as an alternative mode of "social production" based on the growing and overlapping complexities of open source, open access, open archiving, open publishing, and open science. This paper argues that the openness movement with its reinforcing structure of…

  20. Open Media Science

    DEFF Research Database (Denmark)

    Møller Moltke Martiny, Kristian; Pedersen, David Budtz; Hansted, Allan Alfred Birkegaard

    2016-01-01

    In this article, we present three challenges to the emerging Open Science (OS) movement: the challenge of communication, collaboration and cultivation of scientific research. We argue that to address these challenges OS needs to include other forms of data than what can be captured in a text ... and extend into a fully-fledged Open Media movement engaging with new media and non-traditional formats of science communication. We discuss two cases where experiments with open media have driven new collaborations between scientists and documentarists. We use the cases to illustrate different advantages ... of using open media to face the challenges of OS.


  3. Secure Interoperable Open Smart Grid Demonstration Project

    Energy Technology Data Exchange (ETDEWEB)

    Magee, Thoman [Consolidated Edison Company Of New York, Inc., NY (United States)

    2014-12-28

    The Consolidated Edison, Inc., of New York (Con Edison) Secure Interoperable Open Smart Grid Demonstration Project (SGDP), sponsored by the United States (US) Department of Energy (DOE), demonstrated that the reliability, efficiency, and flexibility of the grid can be improved through a combination of enhanced monitoring and control capabilities using systems and resources that interoperate within a secure services framework. The project demonstrated the capability to shift, balance, and reduce load where and when needed in response to system contingencies or emergencies by leveraging controllable field assets. The range of field assets includes curtailable customer loads, distributed generation (DG), battery storage, electric vehicle (EV) charging stations, building management systems (BMS), home area networks (HANs), high-voltage monitoring, and advanced metering infrastructure (AMI). The SGDP enables the seamless integration and control of these field assets through a common, cyber-secure, interoperable control platform, which integrates a number of existing legacy control and data systems, as well as new smart grid (SG) systems and applications. By integrating advanced technologies for monitoring and control, the SGDP helps target and reduce peak load growth, improves the reliability and efficiency of Con Edison’s grid, and increases the ability to accommodate the growing use of distributed resources. Con Edison is dedicated to lowering costs, improving reliability and customer service, and reducing its impact on the environment for its customers. These objectives also align with the policy objectives of New York State as a whole. To help meet these objectives, Con Edison’s long-term vision for the distribution grid relies on the successful integration and control of a growing penetration of distributed resources, including demand response (DR) resources, battery storage units, and DG. For example, Con Edison is expecting significant long-term growth of DG

  4. Open hardware for open science

    CERN Multimedia

    CERN Bulletin

    2011-01-01

    Inspired by the open source software movement, the Open Hardware Repository was created to enable hardware developers to share the results of their R&D activities. The recently published CERN Open Hardware Licence offers the legal framework to support this knowledge and technology exchange.   Two years ago, a group of electronics designers led by Javier Serrano, a CERN engineer, working in experimental physics laboratories created the Open Hardware Repository (OHR). This project was initiated in order to facilitate the exchange of hardware designs across the community in line with the ideals of “open science”. The main objectives include avoiding duplication of effort by sharing results across different teams that might be working on the same need. “For hardware developers, the advantages of open hardware are numerous. For example, it is a great learning tool for technologies some developers would not otherwise master, and it avoids unnecessary work if someone ha...

  5. OpenADR Open Source Toolkit: Developing Open Source Software for the Smart Grid

    Energy Technology Data Exchange (ETDEWEB)

    McParland, Charles

    2011-02-01

    Demand response (DR) is becoming an increasingly important part of power grid planning and operation. The advent of the Smart Grid, which mandates its use, further motivates selection and development of suitable software protocols to enable DR functionality. The OpenADR protocol has been developed and is being standardized to serve this goal. We believe that the development of a distributable, open source implementation of OpenADR will benefit this effort and motivate critical evaluation of its capabilities, by the wider community, for providing wide-scale DR services.
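The demand-response signalling pattern that OpenADR standardizes can be illustrated schematically: a server publishes a DR event, and each participating client decides how to shed load for the event window. The payload fields and the load-selection strategy below are illustrative only, not the actual OpenADR message schema or any part of the open source toolkit.

```python
# Hypothetical sketch of client-side handling of a demand-response event:
# curtail the largest controllable loads until the requested reduction is met.
def handle_dr_event(event, controllable_loads_kw):
    """Choose loads to curtail until the requested reduction is met."""
    target = event["reduction_kw"]
    shed, total = [], 0.0
    for name, kw in sorted(controllable_loads_kw.items(),
                           key=lambda item: -item[1]):
        if total >= target:
            break
        shed.append(name)
        total += kw
    return shed, total

event = {"start": "2011-07-01T14:00Z", "duration_min": 120,
         "reduction_kw": 40.0}
loads = {"hvac": 30.0, "lighting": 15.0, "pumps": 8.0}
print(handle_dr_event(event, loads))  # (['hvac', 'lighting'], 45.0)
```

A real OpenADR client would receive the event over the standardized protocol and report its opt-in/opt-out status back to the server; only the shed-selection logic is sketched here.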

  6. Open life science research, open software and the open century

    Directory of Open Access Journals (Sweden)

    Youhua Chen

    2015-05-01

    In an age of knowledge explosion and mass scientific information, I highlight the importance of conducting open science in life and medical research through the extensive use of open software and documents. The purpose of conducting open science is to address the limited repeatability of research in the life sciences. I outline the essential steps for conducting open life science and the necessary standards for creating, reusing and reproducing open materials. Different Creative Commons licenses are presented and compared with respect to their scope of use and restrictions. In conclusion, I argue that open materials should be widely adopted in life and medical research.

  7. A new science infrastructure: the grid

    International Nuclear Information System (INIS)

    Sun Gongxing

    2003-01-01

    As the depth and scale of scientific research grow, its requirements for computing power become bigger and bigger, and global collaboration is being enhanced. Therefore, integration and sharing of all available resources among the participating organizations is required, including computing, storage, networks, and even human resources and intelligent instruments. Grid technology has been developed for the goal mentioned above and could become an infrastructure for future science research and engineering. As a global computing technology, it has a lot of key technical issues to be addressed. In this paper, the grid architecture, security infrastructure, application domains and tools are described; finally, we give the prospects for the grid in the future. (authors)

  8. Neutron Science TeraGrid Gateway

    International Nuclear Information System (INIS)

    Lynch, Vickie E.; Chen, Meili; Cobb, John W.; Kohl, James Arthur; Miller, Stephen D.; Speirs, David A.; Vazhkudai, Sudharshan S.

    2010-01-01

    The unique contributions of the Neutron Science TeraGrid Gateway (NSTG) are the connection of national user facility instrument data sources to the integrated cyberinfrastructure of the National Science Foundation TeraGrid and the development of a neutron science gateway that allows neutron scientists to use TeraGrid resources to analyze their data, including comparison of experiment with simulation. The NSTG is working in close collaboration with the Spallation Neutron Source (SNS) at Oak Ridge as their principal facility partner. The SNS is a next-generation neutron source. It has completed construction at a cost of $1.4 billion and is ramping up operations. The SNS will provide an order of magnitude greater flux than any previous facility in the world and will be available to all of the nation's scientists, independent of funding source, on a peer-reviewed merit basis. With this new capability, the neutron science community is facing orders of magnitude larger data sets and is at a critical point for data analysis and simulation. There is a recognized need for new ways to manage and analyze data to optimize both beam time and scientific output. The TeraGrid is providing new capabilities in the gateway for simulations using McStas and a fitting service on distributed TeraGrid resources to improve turnaround. NSTG staff are also exploring replicating experimental data in archival storage. As part of the SNS partnership, the NSTG provides access to gateway support, cyberinfrastructure outreach, community development, and user support for the neutron science community. This community includes not only SNS staff and users but extends to all the major worldwide neutron scattering centers.

  9. The Grid is open, so please come in…

    CERN Multimedia

    Caroline Duc

    2012-01-01

    During the week of 17 to 21 September 2012, the European Grid Infrastructure Technical Forum was held in Prague. At this event, organised by EGI (European Grid Infrastructure), grid computing experts set about tackling the challenge of opening their doors to a still wider community. This provided an excellent opportunity to look back at similar initiatives by EGI in the past.   EGI's aim is to coordinate the computing resources of the European Grid Infrastructure and to encourage exchanges between the collaboration and users. Initially dedicated mainly to high-energy particle physics, the European Grid Infrastructure is now welcoming new disciplines and communities. The EGI Technical Forum is organised once a year and is a key date in the community's calendar. The 2012 edition, organised in Prague, was an opportunity to review the advances made and to look constructively into a future where the use of computing grids becomes more widespread. Since 2010, EGI has supported the ...

  10. The International Symposium on Grids and Clouds and the Open Grid Forum

    Science.gov (United States)

    addressed while OGF exposed the state of current developments and issues to be resolved if commonalities are to be exploited. Another first is for the Proceedings for 2011: an open access online publishing scheme will ensure these Proceedings appear more quickly and that more people have access to the results, providing a long-term online archive of the event. The symposium attracted more than 212 participants from 29 countries spanning Asia, Europe and the Americas. Coming so soon after the earthquake and tsunami in Japan, the participation of our Japanese colleagues was particularly appreciated. Keynotes by invited speakers highlighted the impact of distributed computing infrastructures in the social sciences and humanities, high energy physics, and the earth and life sciences. Plenary sessions entitled Grid Activities in Asia Pacific surveyed the state of grid deployment across 11 Asian countries. Through the parallel sessions, the impact of distributed computing infrastructures in a range of research disciplines was highlighted. Operational procedures, middleware and security aspects were addressed in dedicated sessions. The symposium was covered online in real time by the GridCast team from the GridTalk project: a running blog included summaries of specific sessions, video interviews with keynote speakers and personalities, and photos. As in all regions of the world, grid and cloud computing has to prove it is adding value to researchers if it is to be accepted by them, and demonstrate its impact on society as a whole if it is to be supported by national governments, funding agencies and the general public. ISGC has helped foster the emergence of a strong regional interest in the earth and life sciences, notably for natural disaster mitigation and bioinformatics studies. Prof. Simon C. Lin organised an intense social programme with a gastronomic tour of Taipei culminating with a banquet for all the symposium's participants at the hotel Palais de Chine. I would

  11. Openness, Web 2.0 Technology, and Open Science

    Science.gov (United States)

    Peters, Michael A.

    2010-01-01

    Open science is a term that is being used in the literature to designate a form of science based on open source models or that utilizes principles of open access, open archiving and open publishing to promote scientific communication. Open science increasingly also refers to open governance and more democratized engagement and control of science…

  12. Grid3: An Application Grid Laboratory for Science

    CERN Multimedia

    CERN. Geneva

    2004-01-01

    level services required by the participating experiments. The deployed infrastructure has been operating since November 2003 with 27 sites, a peak of 2800 processors, work loads from 10 different applications exceeding 1300 simultaneous jobs, and data transfers among sites of greater than 2 TB/day. The Grid3 infrastructure was deployed from grid level services provided by groups and applications within the collaboration. The services were organized into four distinct "grid level services" including: Grid3 Packaging, Monitoring and Information systems, User Authentication and the iGOC Grid Operatio...

  13. Open Science Interview mit PA

    OpenAIRE

    Scheliga, Kaja

    2014-01-01

    This interview is part of a series of interviews on open science and digital scholarship conducted in 2013 with researchers from various backgrounds. For an analysis of the interviews see: Scheliga, Kaja and Sascha Friesike. 2014. “Putting open science into practice: A social dilemma?” First Monday. Volume 19, Number 9. DOI: http://dx.doi.org/10.5210/fm.v19i9.5381

  14. Open Science Interview mit IB

    OpenAIRE

    Scheliga, Kaja

    2014-01-01

    This interview is part of a series of interviews on open science and digital scholarship conducted in 2013 with researchers from various backgrounds. For an analysis of the interviews see: Scheliga, Kaja and Sascha Friesike. 2014. “Putting open science into practice: A social dilemma?” First Monday. Volume 19, Number 9. DOI: http://dx.doi.org/10.5210/fm.v19i9.5381

  15. Recommendations for open data science.

    Science.gov (United States)

    Gymrek, Melissa; Farjoun, Yossi

    2016-01-01

    Life science research increasingly relies on large-scale computational analyses. However, the code and data used for these analyses are often lacking in publications. To maximize scientific impact, reproducibility, and reuse, it is crucial that these resources are made publicly available and are fully transparent. We provide recommendations for improving the openness of data-driven studies in life sciences.

  16. Defining Success in Open Science.

    Science.gov (United States)

    Ali-Khan, Sarah E; Jean, Antoine; MacDonald, Emily; Gold, E Richard

    2018-01-01

    Mounting evidence indicates that worldwide, innovation systems are increasingly unsustainable. Equally, concerns about inequities in the science and innovation process, and in access to its benefits, continue. Against a backdrop of growing health, economic and scientific challenges, global stakeholders are urgently seeking to spur innovation and maximize the just distribution of benefits for all. Open Science collaboration (OS) - comprising a variety of approaches to increase open, public, and rapid mobilization of scientific knowledge - is seen as one of the most promising ways forward. Yet many decision-makers hesitate to construct policy to support the adoption and implementation of OS without access to substantive, clear and reliable evidence. In October 2017, international thought-leaders gathered at an Open Science Leadership Forum in the Washington DC offices of the Bill and Melinda Gates Foundation to share their views on what successful Open Science looks like. Delegates from developed and developing nations, national governments, science agencies and funding bodies, philanthropy, researchers, patient organizations and the biotechnology, pharma and artificial intelligence (AI) industries discussed the outcomes that would rally them to invest in OS, as well as wider issues of policy and implementation. This first of two reports summarizes delegates' views on what they believe OS will deliver in terms of research, innovation and social impact in the life sciences. Through an open and collaborative process over the next months, we will translate these success outcomes into a toolkit of quantitative and qualitative indicators to assess when, where and how open science collaborations best advance research, innovation and social benefit. Ultimately, this work aims to develop and openly share tools to allow stakeholders to evaluate and re-invent their innovation ecosystems, to maximize value for the global public and patients, and address long-standing questions

  17. Virtual Experiments on the Neutron Science TeraGrid Gateway

    International Nuclear Information System (INIS)

    Lynch, Vickie E; Cobb, John W; Farhi, Emmanuel N; Miller, Stephen D; Taylor, M

    2008-01-01

    The TeraGrid's outreach effort to the neutron science community is creating an environment that is encouraging the exploration of advanced cyberinfrastructure being incorporated into facility operations in a way that leverages facility operations to multiply the scientific output of its users, including many NSF supported scientists in many disciplines. The Neutron Science TeraGrid Gateway serves as an exploratory incubator for several TeraGrid projects. Virtual neutron scattering experiments from one exploratory project will be highlighted

  18. Opening science: New publication forms in science

    Directory of Open Access Journals (Sweden)

    Scheliga, Kaja

    2014-12-01

    Full Text Available [english] Digital technologies change how scientists access and process information and consequently impact publication forms in science. Even though the core of scientific publications has remained the same, established publication formats, such as the scientific paper or book, are succumbing to the transitions caused by digital technologies. At the same time, new online tools enable new publication forms, such as blogs, microblogs or wikis, to emerge. This article explores the changing and emerging publication forms in science and also reflects upon the changing role of libraries. The transformations of publishing forms are discussed in the context of open science.

  19. Tools for open geospatial science

    Science.gov (United States)

    Petras, V.; Petrasova, A.; Mitasova, H.

    2017-12-01

    Open science uses open source to deal with reproducibility challenges in data and computational sciences. However, just using open source software or making the code public does not make research reproducible. Moreover, scientists face the challenge of learning new, unfamiliar tools and workflows. In this contribution, we will look at a graduate-level course syllabus covering several software tools which make validation and reuse by a wider professional community possible. For novices in the open science arena, we will look at how scripting languages such as Python and Bash help us reproduce research (starting with our own work). Jupyter Notebook will be introduced as a code editor, data exploration tool, and a lab notebook. We will see how Git helps us not to get lost in revisions and how Docker is used to wrap all the parts together using a single text file so that figures for a scientific paper or a technical report can be generated with a single command. We will look at examples of software and publications in the geospatial domain which use these tools and principles. Scientific contributions to GRASS GIS, a powerful open source desktop GIS and geoprocessing backend, will serve as an example of why and how to publish new algorithms and tools as part of a bigger open source project.

  20. Grids in Europe - a computing infrastructure for science

    International Nuclear Information System (INIS)

    Kranzlmueller, D.

    2008-01-01

    Grids provide virtually unlimited computing power and access to a variety of resources to today's scientists. Moving from a research topic of computer science to a commodity tool for science and research in general, grid infrastructures are being built all around the world. This talk provides an overview of the development of grids in Europe, the status of the so-called national grid initiatives, as well as the efforts towards an integrated European grid infrastructure. The latter, summarized under the title of the European Grid Initiative (EGI), promises a permanent and reliable grid infrastructure and its services in a way similar to research networks today. The talk describes the status of these efforts, the plans for the setup of this pan-European e-Infrastructure, and the benefits for the application communities. (author)

  1. OpenMP parallelization of a gridded SWAT (SWATG)

    Science.gov (United States)

    Zhang, Ying; Hou, Jinliang; Cao, Yongpan; Gu, Juan; Huang, Chunlin

    2017-12-01

    Large-scale, long-term and high spatial resolution simulation is a common issue in environmental modeling. A Gridded Hydrologic Response Unit (HRU)-based Soil and Water Assessment Tool (SWATG) that integrates a grid modeling scheme with different spatial representations also presents such problems: the time-consuming computation limits applications of very high resolution, large-scale watershed modeling. The OpenMP (Open Multi-Processing) parallel application interface is integrated with SWATG (called SWATGP) to accelerate grid modeling at the HRU level. Such a parallel implementation takes better advantage of the computational power of a shared-memory computer system. We conducted two experiments at multiple temporal and spatial scales of hydrological modeling using SWATG and SWATGP on a high-end server. At 500-m resolution, SWATGP was found to be up to nine times faster than SWATG in modeling over a roughly 2000 km2 watershed with a 1 CPU, 15 thread configuration. The study results demonstrate that parallel models save considerable time relative to traditional sequential simulation runs. Parallel computation of environmental models is beneficial for model applications, especially at large spatial and temporal scales and at high resolutions. The proposed SWATGP model is thus a promising tool for large-scale and high-resolution water resources research and management, in addition to offering data fusion and model coupling ability.
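    The pattern the abstract describes is a worksharing loop: within a time step each HRU computation is independent, so the per-HRU loop can be distributed across threads of a shared-memory machine, which is what OpenMP's `parallel for` does in SWATGP's native code. As a hedged, language-neutral sketch of that same pattern (the function and variable names here are hypothetical, not SWATGP's API):

    ```python
    # Minimal sketch of a shared-memory parallel map over independent HRUs.
    # In CPython, threads only speed this up when the per-HRU work releases
    # the GIL (e.g. native numeric code), mirroring OpenMP's use case.
    from concurrent.futures import ThreadPoolExecutor

    def simulate_hru(hru_id, params):
        """Stand-in for one HRU's daily water-balance computation (illustrative)."""
        precip, et = params
        return {"hru": hru_id, "runoff": max(precip - et, 0.0)}

    def run_timestep(hru_params, n_threads=4):
        # Each HRU is independent within a time step, so the loop is a parallel
        # map over the HRU index, like OpenMP's `#pragma omp parallel for`.
        with ThreadPoolExecutor(max_workers=n_threads) as pool:
            return list(pool.map(lambda kv: simulate_hru(*kv), hru_params.items()))

    results = run_timestep({1: (10.0, 3.0), 2: (5.0, 6.0), 3: (8.0, 2.5)})
    print([r["runoff"] for r in results])  # -> [7.0, 0.0, 5.5]
    ```

    `pool.map` preserves input order, so results line up with the HRU index exactly as in the sequential loop, which is what makes the parallel and serial runs directly comparable.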

  2. Close connections between open science and open-source software

    Directory of Open Access Journals (Sweden)

    YouHua Chen

    2014-09-01

    Full Text Available Open science has been gaining increasing attention in recent years. In this mini-review, we briefly discuss and summarize the reasons for introducing open science into academic publications for scientists. We argue that open-source software (like R and Python) can serve as universal and important platforms for doing open science because of its appealing features: open source code, easy-to-read documentation, and common use in various scientific disciplines like statistics, chemistry and biology. Finally, the challenges and future perspectives of performing open science are discussed.

  3. Approaches to Open Data for Science in Spain

    Directory of Open Access Journals (Sweden)

    E Wulff-Barreiro

    2011-10-01

    Full Text Available As observational data have attained new legal status, allowing their integration into open Internet systems, and experimental data continue to be assembled on common and free platforms, state-of-the-art, easy-to-access data repositories have been designed in Spain. These repositories have removed many obstacles to the re-utilization of GIS and other data. European legislation has also made advances in opening biodiversity data, including a European space in the Latin-American grid infrastructure. Open access biomedical repositories attract commercial attention while astronomical, meteorological, and oncological institutions promote data quality and access. This paper describes recent approaches to open access data for science in Spain.

  4. Enhancing GIS Capabilities for High Resolution Earth Science Grids

    Science.gov (United States)

    Koziol, B. W.; Oehmke, R.; Li, P.; O'Kuinghttons, R.; Theurich, G.; DeLuca, C.

    2017-12-01

    Applications for high performance GIS will continue to increase as Earth system models pursue more realistic representations of Earth system processes. Finer spatial resolution model input and output, unstructured or irregular modeling grids, data assimilation, and regional coordinate systems present novel challenges for GIS frameworks operating in the Earth system modeling domain. This presentation provides an overview of two GIS-driven applications that combine high performance software with big geospatial datasets to produce value-added tools for the modeling and geoscientific community. First, a large-scale interpolation experiment using National Hydrography Dataset (NHD) catchments, a high resolution rectilinear CONUS grid, and the Earth System Modeling Framework's (ESMF) conservative interpolation capability will be described. ESMF is a parallel, high-performance software toolkit that provides capabilities (e.g. interpolation) for building and coupling Earth science applications. ESMF is developed primarily by the NOAA Environmental Software Infrastructure and Interoperability (NESII) group. The purpose of this experiment was to test and demonstrate the utility of high performance scientific software in traditional GIS domains. Special attention will be paid to the nuanced requirements for dealing with high resolution, unstructured grids in scientific data formats. Second, a chunked interpolation application using ESMF and OpenClimateGIS (OCGIS) will demonstrate how spatial subsetting can virtually remove computing resource ceilings for very high spatial resolution interpolation operations. OCGIS is a NESII-developed Python software package designed for the geospatial manipulation of high-dimensional scientific datasets. An overview of the data processing workflow, why a chunked approach is required, and how the application could be adapted to meet operational requirements will be discussed here. In addition, we'll provide a general overview of OCGIS
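    The chunked-interpolation idea described above can be sketched independently of the ESMF/OCGIS APIs: split the destination grid into tiles, interpolate each tile on its own, and concatenate, so peak memory scales with the tile size rather than with the full grid. A minimal 1-D sketch under that assumption (all names here are illustrative, not the OCGIS or ESMF API):

    ```python
    # Chunked regridding sketch: destination points are processed in fixed-size
    # chunks, bounding memory regardless of total grid size.
    import bisect

    def lerp(src_x, src_v, x):
        """Linear interpolation at x over sorted source coordinates (clamped)."""
        i = bisect.bisect_left(src_x, x)
        if i <= 0:
            return src_v[0]
        if i >= len(src_x):
            return src_v[-1]
        x0, x1 = src_x[i - 1], src_x[i]
        t = (x - x0) / (x1 - x0)
        return src_v[i - 1] * (1 - t) + src_v[i] * t

    def regrid_chunked(src_x, src_v, dst_x, chunk_size):
        # Each chunk is interpolated independently, so chunks could also be
        # dispatched to separate workers or nodes in an operational setting.
        out = []
        for start in range(0, len(dst_x), chunk_size):
            chunk = dst_x[start:start + chunk_size]
            out.extend(lerp(src_x, src_v, x) for x in chunk)
        return out

    src_x = [0.0, 1.0, 2.0, 3.0]
    src_v = [0.0, 10.0, 20.0, 30.0]
    dst_x = [0.5, 1.5, 2.5]
    print(regrid_chunked(src_x, src_v, dst_x, chunk_size=2))  # -> [5.0, 15.0, 25.0]
    ```

    Because the chunking is transparent (the concatenated result equals the unchunked one), the chunk size becomes a pure resource-ceiling knob, which is the property the OCGIS application relies on.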

  5. Open Genetic Code: on open source in the life sciences

    OpenAIRE

    Deibel, Eric

    2014-01-01

    The introduction of open source in the life sciences is increasingly being suggested as an alternative to patenting. This is an alternative, however, that takes its shape at the intersection of the life sciences and informatics. Numerous examples can be identified wherein open source in the life sciences refers to access, sharing and collaboration as informatic practices. This includes open source as an experimental model and as a more sophisticated approach of genetic engineering. The first ...

  6. Grid Integration of Electric Vehicles in Open Electricity Markets

    DEFF Research Database (Denmark)

    Presenting the policy drivers, benefits and challenges for grid integration of electric vehicles (EVs) in the open electricity market environment, this book provides a comprehensive overview of existing electricity markets and demonstrates how EVs are integrated into these different markets... Topics covered include: a congestion management scenario within electric distribution networks; optimal EV charging management with the fleet operator concept and smart charging management; EV battery technology, modelling and tests; and the use of EVs for balancing power fluctuations from renewable energy sources, looking at power... of the technologies for EV integration, this volume is informative for research professors and graduate students in power systems; it will also appeal to EV manufacturers, regulators, EV market professionals, energy providers and traders, mobility providers, EV charging station companies, and policy makers.

  7. CJEP will offer open science badges.

    Science.gov (United States)

    Pexman, Penny M

    2017-03-01

    This editorial announces the decision of the Canadian Journal of Experimental Psychology (CJEP) to offer Open Science Framework (OSF) Badges. The Centre for Open Science provides tools to facilitate open science practices. These include the OSF badges. The badges acknowledge papers that meet standards for openness of data, methods, or research process. They are now described in the CJEP Submission Guidelines, and are provided in the editorial. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  8. Unlocking the potential of smart grid technologies with behavioral science

    Directory of Open Access Journals (Sweden)

    Nicole eSintov

    2015-04-01

    Full Text Available Smart grid systems aim to provide a more stable and adaptable electricity infrastructure, and to maximize energy efficiency. Grid-linked technologies vary widely in form and function, but generally share common potentials: to reduce energy consumption via efficiency and/or curtailment, to shift use to off-peak times of day, and to enable distributed storage and generation options. Although end users are key players in these systems, they tend to be overlooked. Behavioral science is therefore key to engaging end-users and maximizing the impact of smart grid technologies. In this paper, we highlight several ways in which behavioral science can be applied to better understand and engage customers in smart grid systems.

  9. Unlocking the potential of smart grid technologies with behavioral science.

    Science.gov (United States)

    Sintov, Nicole D; Schultz, P Wesley

    2015-01-01

    Smart grid systems aim to provide a more stable and adaptable electricity infrastructure, and to maximize energy efficiency. Grid-linked technologies vary widely in form and function, but generally share common potentials: to reduce energy consumption via efficiency and/or curtailment, to shift use to off-peak times of day, and to enable distributed storage and generation options. Although end users are central players in these systems, they are sometimes not central considerations in technology or program design, and in some cases, their motivations for participating in such systems are not fully appreciated. Behavioral science can be instrumental in engaging end-users and maximizing the impact of smart grid technologies. In this paper, we present emerging technologies made possible by a smart grid infrastructure, and for each we highlight ways in which behavioral science can be applied to enhance their impact on energy savings.

  10. Open Genetic Code : On open source in the life sciences

    NARCIS (Netherlands)

    Deibel, E.

    2014-01-01

    The introduction of open source in the life sciences is increasingly being suggested as an alternative to patenting. This is an alternative, however, that takes its shape at the intersection of the life sciences and informatics. Numerous examples can be identified wherein open source in the life

  11. Open Data for Global Science

    Directory of Open Access Journals (Sweden)

    Paul F Uhlir

    2007-06-01

    Full Text Available The digital revolution has transformed the accumulation of properly curated public research data into an essential upstream resource whose value increases with use. The potential contributions of such data to the creation of new knowledge and downstream economic and social goods can in many cases be multiplied exponentially when the data are made openly available on digital networks. Most developed countries spend large amounts of public resources on research and related scientific facilities and instruments that generate massive amounts of data. Yet precious little of that investment is devoted to promoting the value of the resulting data by preserving and making them broadly available. The largely ad hoc approach to managing such data, however, is now beginning to be understood as inadequate to meet the exigencies of the national and international research enterprise. The time has thus come for the research community to establish explicit responsibilities for these digital resources. This article reviews the opportunities and challenges to the global science system associated with establishing an open data policy.

  12. Grid enablement of OpenGeospatial Web Services: the G-OWS Working Group

    Science.gov (United States)

    Mazzetti, Paolo

    2010-05-01

    integration on existing solutions. More specifically, the Open Geospatial Consortium (OGC) Web Services (OWS) specifications play a fundamental role in geospatial information sharing (e.g. in INSPIRE Implementing Rules, GEOSS architecture, GMES Services, etc.). On the Grid side, the gLite middleware, developed in the European EGEE (Enabling Grids for E-sciencE) Projects, is widely spread in Europe and beyond, proving its high scalability, and is one of the middleware stacks chosen for the future European Grid Infrastructure (EGI) initiative. Therefore the convergence between OWS and gLite technologies would be desirable for seamless access to Grid capabilities through OWS-compliant systems. However, to achieve this harmonization there are some obstacles to overcome. Firstly, a semantic mismatch must be addressed: gLite handles low-level (close to the machine) concepts like "file", "data", "instruments", "job", etc., while geo-information services handle higher-level (closer to the human) concepts like "coverage", "observation", "measurement", "model", etc. Secondly, an architectural mismatch must be addressed: OWS implements a Web service-oriented architecture which is stateless, synchronous and has no embedded security (which is delegated to other specifications), while gLite implements the Grid paradigm in an architecture which is stateful, asynchronous (though not fully event-based) and has strong embedded security (based on the VO paradigm). In recent years many initiatives and projects have worked out possible approaches for implementing Grid-enabled OWSs.
    Just to mention some: (i) in 2007 the OGC signed a Memorandum of Understanding with the Open Grid Forum, "a community of users, developers, and vendors leading the global standardization effort for grid computing"; (ii) the OGC identified "WPS Profiles - Conflation; and Grid processing" as one of the tasks in the Geo Processing Workflow theme of the OWS Phase 6 (OWS-6); (iii) several national, European and

  13. Baltic Grid for e-Science Development in Baltic

    International Nuclear Information System (INIS)

    Ilmars, S.; Olgerts, B.

    2007-01-01

    Latvia, Estonia and Lithuania, as new members of the European Union, are now involved in e-Science projects. The Baltic Grid (BG) project is a first step in infrastructure development for e-Science grid computing. Together with the universities of the Baltic States, some universities and organisations of neighbouring countries are involved in the BG project to disseminate their experience and management skills. This paper presents the achievements and experiences of the BG project in e-infrastructure development in the Baltic States, and in Latvia and at Riga Technical University in particular. (Author)

  14. EMERGE - ESnet/MREN Regional Science Grid Experimental NGI Testbed

    Energy Technology Data Exchange (ETDEWEB)

    Mambretti, Joe; DeFanti, Tom; Brown, Maxine

    2001-07-31

    This document is the final report on the EMERGE Science Grid testbed research project from the perspective of the International Center for Advanced Internet Research (iCAIR) at Northwestern University, which was a subcontractor to this UIC project. This report is a compilation of information gathered from a variety of materials related to this project produced by multiple EMERGE participants, especially those at Electronic Visualization Lab (EVL) at the University of Illinois at Chicago (UIC), Argonne National Lab and iCAIR. The EMERGE Science Grid project was managed by Tom DeFanti, PI from EVL at UIC.

  15. The eGo grid model: An open-source and open-data based synthetic medium-voltage grid model for distribution power supply systems

    Science.gov (United States)

    Amme, J.; Pleßmann, G.; Bühler, J.; Hülk, L.; Kötter, E.; Schwaegerl, P.

    2018-02-01

    The increasing integration of renewable energy into the electricity supply system creates new challenges for distribution grids. The planning and operation of distribution systems requires appropriate grid models that consider the heterogeneity of existing grids. In this paper, we describe a novel method to generate synthetic medium-voltage (MV) grids, which we applied in our DIstribution Network GeneratOr (DINGO). DINGO is open-source software and uses freely available data. Medium-voltage grid topologies are synthesized based on location and electricity demand in defined demand areas. For this purpose, we use GIS data containing demand areas with high-resolution spatial data on physical properties, land use, energy, and demography. The grid topology is treated as a capacitated vehicle routing problem (CVRP) combined with a local search metaheuristic. We also consider the current planning principles for MV distribution networks, paying special attention to line congestion and voltage limit violations. In the modelling process, we included power flow calculations for validation. The resulting grid model datasets contain 3608 synthetic MV grids in high resolution, covering all of Germany and taking local characteristics into account. We compared the modelled networks with real network data. In terms of number of transformers and total cable length, we conclude that the method presented in this paper generates realistic grids that could be used to implement a cost-optimised electrical energy system.
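    The capacitated vehicle routing formulation mentioned above can be illustrated with a toy greedy construction heuristic: grow each MV feeder (a "route") out from the substation (the "depot"), always extending to the nearest unconnected load that still fits within the feeder's capacity. This is only a sketch of the problem class, not DINGO's actual algorithm (which adds a local-search metaheuristic, planning principles and power flow checks), and all names here are hypothetical:

    ```python
    # Greedy capacitated route construction: one route per feeder, nearest
    # feasible load first. Assumes every single demand fits within capacity.
    import math

    def greedy_cvrp(depot, loads, demand, capacity):
        """loads: name -> (x, y); demand: name -> load; depot: (x, y)."""
        unserved = set(loads)
        routes = []
        while unserved:
            route, used, pos = [], 0.0, depot
            while True:
                # loads that still fit on this feeder
                feasible = [n for n in unserved if used + demand[n] <= capacity]
                if not feasible:
                    break
                nxt = min(feasible, key=lambda n: math.dist(pos, loads[n]))
                route.append(nxt)
                used += demand[nxt]
                pos = loads[nxt]
                unserved.discard(nxt)
            if not route:  # a remaining load exceeds an empty feeder's capacity
                raise ValueError("demand exceeds feeder capacity")
            routes.append(route)
        return routes

    depot = (0.0, 0.0)
    loads = {"a": (1.0, 0.0), "b": (2.0, 0.0), "c": (0.0, 5.0)}
    demand = {"a": 1.0, "b": 1.0, "c": 1.5}
    print(greedy_cvrp(depot, loads, demand, capacity=2.0))  # -> [['a', 'b'], ['c']]
    ```

    In the grid-synthesis reading, each returned route is one MV feeder leaving the substation, and the capacity constraint stands in for the line loading limit; a local search would then refine these initial routes.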

  16. Open Genetic Code: on open source in the life sciences.

    Science.gov (United States)

    Deibel, Eric

    2014-01-01

    The introduction of open source in the life sciences is increasingly being suggested as an alternative to patenting. This is an alternative, however, that takes its shape at the intersection of the life sciences and informatics. Numerous examples can be identified wherein open source in the life sciences refers to access, sharing and collaboration as informatic practices. This includes open source as an experimental model and as a more sophisticated approach to genetic engineering. The first section discusses the greater flexibility with regard to patenting and its relationship to the introduction of open source in the life sciences. The main argument is that the ownership of knowledge in the life sciences should be reconsidered in the context of the centrality of DNA in informatic formats. This is illustrated by discussing a range of examples of open source models. The second part focuses on open source in synthetic biology as exemplary for the re-materialization of information into food, energy, medicine and so forth. The paper ends by raising the question of whether another kind of alternative might be possible: one that looks at open source as a model for an alternative to the commodification of life, understood as an attempt to comprehensively remove the restrictions from the usage of DNA in any of its formats.

  17. BOOK REVIEW: OPENING SCIENCE, THE EVOLVING GUIDE ...

    Science.gov (United States)

    The way we get our funding, collaborate, do our research, and get the word out has evolved over hundreds of years, but we can imagine a more open science world, largely facilitated by the internet. The movement towards this more open way of doing and presenting science is coming, and it is not taking hundreds of years. If you are interested in these trends, and would like to find out more about where this is all headed and what it means to you, consider downloading Opening Science, edited by Sönke Bartling and Sascha Friesike, subtitled The Evolving Guide on How the Internet is Changing Research, Collaboration, and Scholarly Publishing. In 26 chapters by various authors from a range of disciplines, the book explores the developing world of open science, starting from the first scientific revolution and bringing us to the next scientific revolution, sometimes referred to as "Science 2.0". Some of the articles deal with the impact of the changing landscape of how science is done, looking at the impact of open science on academia, journal publishing, or medical research. Many of the articles look at the uses, pitfalls, and impact of specific tools, like microblogging (think Twitter), social networking, and reference management. There is lots of discussion and definition of terms you might use or misuse, like "altmetrics" and "impact factor". Science will probably never be completely open, and Twitter will probably never replace the journal article,

  18. Revista ORL at Open Science Ecosystem

    Directory of Open Access Journals (Sweden)

    Tránsito FERRERAS FERNÁNDEZ

    2018-05-01

    Full Text Available In October 2017 the congress Open Knowledge Ecosystems (ECA 2017) was held at the University of Salamanca, conceived as an international meeting point for specialists in open access. The congress approached open knowledge from R&D&I perspectives, offering papers and communications on research related to Open Access, as well as experiences developed in repositories and institutions, and approaches to innovative trends in any of the fields of open knowledge. The experience of Revista ORL was presented at ECA 2017 through a joint communication by several authors titled Nuevas vías de publicación para revistas biomédicas. Proyecto Revista ORL de Ediciones Universidad de Salamanca, offering an overview of the history of the Revista as an open edition project. Although on that occasion it was called a project, we should now call it a fully consolidated "reality", given the results obtained. Open Science represents a paradigm change in the way science is done. Although science does not change substantially with respect to its motivations and objectives, it changes its methods. The change lies in how it is done, not in what is done. It is an open science, collaborative and made with and for society. The concept of Open Science has been preceded by Open Access to academic content, and this may have conditioned its understanding. Open Access has been assimilated only with open access to articles, while with Open Science it is considered that what must be open is any research result (articles and data) as well as the auxiliary instruments used (for example, laboratory notebooks). However, the double meaning of "open" (free of charge and free to reuse) is the same for both concepts. In this paper we pose questions such as: What is Open Science? What are the motivations of governments for its promotion? What components make up this ecosystem? What are the implications for the agents of scientific research? What does all this represent for the edition of

  19. The Neutron Science TeraGrid Gateway, a TeraGrid Science Gateway to Support the Spallation Neutron Source

    International Nuclear Information System (INIS)

    Cobb, John W.; Geist, Al; Kohl, James Arthur; Miller, Stephen D; Peterson, Peter F.; Pike, Gregory; Reuter, Michael A; Swain, William; Vazhkudai, Sudharshan S.; Vijayakumar, Nithya N.

    2006-01-01

    The National Science Foundation's (NSF's) Extensible Terascale Facility (ETF), or TeraGrid (1), is entering its operational phase. One ETF science gateway effort is the Neutron Science TeraGrid Gateway (NSTG). The Oak Ridge National Laboratory (ORNL) resource provider effort (ORNL-RP), during construction and now in operations, is bridging a large-scale experimental community and the TeraGrid as a large-scale national cyberinfrastructure. Of particular emphasis is collaboration with the Spallation Neutron Source (SNS) at ORNL. The U.S. Department of Energy's (DOE's) SNS (2) at ORNL will be commissioned in spring of 2006 as the world's brightest source of neutrons. Neutron science users can run experiments; generate datasets; perform data reduction, analysis, and visualization of results; collaborate with remote users; and archive long-term data in repositories with curation services. The ORNL-RP and the SNS data analysis group have spent 18 months developing and exploring user requirements, including the creation of prototypical services such as facility portal, data, and application execution services. We describe results from these efforts and discuss implications for science gateway creation. Finally, we show incorporation into implementation planning for the NSTG and SNS architectures. The plan is for primarily portal-based user interaction supported by a service-oriented architecture for functional implementation.

  20. Grid computing and e-science: a view from inside

    Directory of Open Access Journals (Sweden)

    Stefano Cozzini

    2008-06-01

    Full Text Available My intention is to analyze how, where and if grid computing technology is truly enabling a new way of doing science (so-called ‘e-science’). I will base my views on the experiences accumulated thus far in a number of scientific communities which we have provided with the opportunity of using grid computing. I shall first define some basic terms and concepts and then discuss a number of specific cases in which the use of grid computing has actually made possible a new method for doing science. I will then present a case in which this did not result in a change in research methods. I will try to identify the reasons for these failures and analyze the future evolution of grid computing. I will conclude by introducing and commenting on the concept of ‘cloud computing’, the approach offered and provided by major industrial actors (Google/IBM and Amazon being among the most important), and what impact this technology might have on the world of research.

  1. Achieving open access to conservation science.

    Science.gov (United States)

    Fuller, Richard A; Lee, Jasmine R; Watson, James E M

    2014-12-01

    Conservation science is a crisis discipline in which the results of scientific enquiry must be made available quickly to those implementing management. We assessed the extent to which scientific research published since the year 2000 in 20 conservation science journals is publicly available. Of the 19,207 papers published, 1,667 (8.68%) are freely downloadable from an official repository. Moreover, only 938 papers (4.88%) meet the standard definition of open access, in which material can be freely reused provided attribution is given to the authors. This compares poorly with a comparable set of 20 evolutionary biology journals, where 31.93% of papers are freely downloadable and 7.49% are open access. Seventeen of the 20 conservation journals offer an open access option, but fewer than 5% of the papers are available through open access. The cost of accessing the full body of conservation science runs into tens of thousands of dollars per year for institutional subscribers, and many conservation practitioners cannot access pay-per-view science through their workplace. However, important initiatives such as Research4Life are making science available to organizations in developing countries. We urge authors of conservation science to pay for open access on a per-article basis or to choose publication in open access journals, taking care to ensure the license allows reuse for any purpose provided attribution is given. Currently, it would cost $51 million to make all conservation science published since 2000 freely available by paying the open access fees currently levied on authors. Publishers of conservation journals might consider more cost-effective models for open access, and conservation-oriented organizations running journals could consider a broader range of options for open access for nonmembers, such as sponsorship of open access via membership fees. © 2014 The Authors. Conservation Biology published by Wiley Periodicals, Inc., on behalf of the Society for

  2. Open Science Interview mit Christian Heise

    OpenAIRE

    Scheliga, Kaja

    2014-01-01

    This interview is part of a series of interviews on open science and digital scholarship conducted in 2013 with researchers from various backgrounds. For an analysis of the interviews see: Scheliga, Kaja and Sascha Friesike. 2014. “Putting open science into practice: A social dilemma?” First Monday. Volume 19, Number 9. DOI: http://dx.doi.org/10.5210/fm.v19i9.5381

  3. Open Science Interview mit Daniel Mietchen

    OpenAIRE

    Scheliga, Kaja

    2014-01-01

    This interview is part of a series of interviews on open science and digital scholarship conducted in 2013 with researchers from various backgrounds. For an analysis of the interviews see: Scheliga, Kaja and Sascha Friesike. 2014. “Putting open science into practice: A social dilemma?” First Monday. Volume 19, Number 9. DOI: http://dx.doi.org/10.5210/fm.v19i9.5381

  4. Open Science Interview with Christobal Cobo

    OpenAIRE

    Scheliga, Kaja

    2014-01-01

    This interview is part of a series of interviews on open science and digital scholarship conducted in 2013 with researchers from various backgrounds. For an analysis of the interviews see: Scheliga, Kaja and Sascha Friesike. 2014. “Putting open science into practice: A social dilemma?” First Monday. Volume 19, Number 9. DOI: http://dx.doi.org/10.5210/fm.v19i9.5381

  5. Open Science Interview with Jon Crowcroft

    OpenAIRE

    Scheliga, Kaja

    2014-01-01

    This interview is part of a series of interviews on open science and digital scholarship conducted in 2013 with researchers from various backgrounds. For an analysis of the interviews see: Scheliga, Kaja and Sascha Friesike. 2014. “Putting open science into practice: A social dilemma?” First Monday. Volume 19, Number 9. DOI: http://dx.doi.org/10.5210/fm.v19i9.5381

  6. Collaborative Web between open and closed science

    Directory of Open Access Journals (Sweden)

    Alessandro Delfanti

    2008-06-01

    Full Text Available “Web 2.0” is the mantra enthusiastically repeated in the past few years on anything concerning the production of culture, dialogue and online communication. Science, too, is changing along with the processes of communication, collaboration and cooperation created through the web, yet it remains rooted in some of its historical features of openness. For this issue, JCOM has asked some experts on the most recent changes in science to analyse the potential and the contradictions of online collaborative science. The new open science feeds on the opportunity to freely contribute to knowledge production, sharing not only data but also software and hardware. But it is also open to the outside, where citizens use Web 2.0 instruments to discuss science in a horizontal way.

  7. Open access: changing global science publishing.

    Science.gov (United States)

    Gasparyan, Armen Yuri; Ayvazyan, Lilit; Kitas, George D

    2013-08-01

    The article reflects on open access as a strategy of changing the quality of science communication globally. Successful examples of open-access journals are presented to highlight implications of archiving in open digital repositories for the quality and citability of research output. Advantages and downsides of gold, green, and hybrid models of open access operating in diverse scientific environments are described. It is assumed that open access is a global trend which influences the workflow in scholarly journals, changing their quality, credibility, and indexability.

  8. Opening science: New publication forms in science

    OpenAIRE

    Scheliga, Kaja

    2014-01-01

    Digital technologies change how scientists access and process information and consequently impact publication forms in science. Even though the core of scientific publications has remained the same, established publication formats, such as the scientific paper or book, are succumbing to the transitions caused by digital technologies. At the same time, new online tools enable new publication forms, such as blogs, microblogs or wikis, to emerge. This article explores the changing and emerging p...

  9. Grid computing in Pakistan: opening to Large Hadron Collider experiments

    International Nuclear Information System (INIS)

    Batool, N.; Osman, A.; Mahmood, A.; Rana, M.A.

    2009-01-01

    A grid computing facility was developed at the sister institutes Pakistan Institute of Nuclear Science and Technology (PINSTECH) and Pakistan Institute of Engineering and Applied Sciences (PIEAS) in collaboration with the Large Hadron Collider (LHC) Computing Grid during the early years of the present decade. The grid facility PAKGRID-LCG2, one of the grid nodes in Pakistan, was developed employing mainly local means and is capable of supporting local and international research and computational tasks in the domain of the LHC Computing Grid. The functional status of the facility is presented in terms of the number of jobs performed. The facility provides a forum for local researchers in the field of high energy physics to participate in the LHC experiments and related activities at the European particle physics research laboratory (CERN), one of the best physics laboratories in the world. It also provides a platform for an emerging computing technology (CT). (author)

  10. ScienceSoft: Open software for open science

    CERN Document Server

    Di Meglio, Alberto

    2012-01-01

    Most of the software developed today by research institutes, universities, research projects, etc. is typically stored in local source and binary repositories and available for the duration of a project's lifetime only. Finding software based on given functional characteristics is almost impossible, and binary packages are mostly available from local university or project repositories rather than from open source community repositories like Fedora/EPEL or Debian. Furthermore, general information about who develops, contributes to and, most importantly, uses a given software program is very difficult to find, and yet the widespread availability of such information would give more visibility and credibility to the software products. The creation of links or relationships not only among pieces of software, but equally among the people interacting with the software across and beyond specific projects and communities, would foster a more active community and create the conditions for sharing ideas and skills, a ...

  11. Smart grids clouds, communications, open source, and automation

    CERN Document Server

    Bakken, David

    2014-01-01

    The utilization of sensors, communications, and computer technologies to create greater efficiency in the generation, transmission, distribution, and consumption of electricity will enable better management of the electric power system. As the use of smart grid technologies grows, utilities will be able to automate meter reading and billing and consumers will be more aware of their energy usage and the associated costs. The results will require utilities and their suppliers to develop new business models, strategies, and processes.With an emphasis on reducing costs and improving return on inve

  12. Open Science: a first step towards Science Communication

    Science.gov (United States)

    Grigorov, Ivo; Tuddenham, Peter

    2015-04-01

    As Earth Science communicators gear up to adopt new tools and captivating approaches to engage citizen scientists, budding entrepreneurs, policy makers and the public in general, researchers have the responsibility, and the opportunity, to fully adopt Open Science principles and capitalize on their full societal impact and engagement. Open Science is about removing all barriers to basic research, whatever its format, so that it can be freely used, re-used and re-hashed, thus fueling discourse and accelerating the generation of innovative ideas. The concept is central to the EU's Responsible Research and Innovation philosophy, and removing barriers to basic research measurably contributes to engaging citizen scientists in the research process, sets the scene for co-creation of solutions to societal challenges, and raises the general science literacy of the public. Despite this potential, only 50% of today's basic research is freely available. Open Science can be the first, passive step of communicating marine research outside academia. Full and unrestricted access to our knowledge, including data, software code and scientific publications, is not just an ethical obligation; it also gives solid credibility to a more sophisticated communication strategy for engaging society. The presentation will demonstrate how Open Science perfectly complements a coherent communication strategy for placing marine research in a societal context, and how it underpins an effective integration of Ocean & Earth Literacy principles in standard education, as well as mobilizing citizen marine scientists, thus making marine science open science.

  13. The Earth System Grid Federation : an Open Infrastructure for Access to Distributed Geospatial Data

    Science.gov (United States)

    Cinquini, Luca; Crichton, Daniel; Mattmann, Chris; Harney, John; Shipman, Galen; Wang, Feiyi; Ananthakrishnan, Rachana; Miller, Neill; Denvil, Sebastian; Morgan, Mark; et al.

    2012-01-01

    The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).
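    The shared search-and-discovery API described above can be illustrated with a toy faceted-query builder. This is a sketch only: the endpoint path and facet names below are assumptions for demonstration, not taken from ESGF documentation.

```python
# Toy builder for an ESGF-style faceted search URL.
# The endpoint path and facet names are illustrative assumptions,
# not the API of any actual federation node.
from urllib.parse import urlencode

def build_search_url(base, **facets):
    """Return base?key=value&... with facets sorted for stable URLs."""
    return base + "?" + urlencode(sorted(facets.items()))

url = build_search_url(
    "https://esgf-node.example.org/esg-search/search",
    project="CMIP5",
    variable="tas",
    time_frequency="mon",
)
```

Because the facets are plain keyword arguments, the same builder works for any combination of search dimensions a node might expose.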

  14. Open Science: Trends in the Development of Science Learning

    Science.gov (United States)

    Scanlon, Eileen

    2011-01-01

    This article comments on some trends in the evolution of science teaching at a distance using the Open University UK (OU UK) experience as a benchmark. Even from the first years of the university there was an understanding of the potential role for media in developing methods for teaching science at a distance, in particular the potential for…

  15. Social Media, Open Science, and Data Science Are Inextricably Linked.

    Science.gov (United States)

    Voytek, Bradley

    2017-12-20

    Should scientists use social media? Why practice open science? What is data science? Ten years ago, these phrases hardly existed. Now they are ubiquitous. Here I argue that these phenomena are inextricably linked and reflect similar underlying social and technological transformations. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. The eGo grid model: An open source approach towards a model of German high and extra-high voltage power grids

    Science.gov (United States)

    Mueller, Ulf Philipp; Wienholt, Lukas; Kleinhans, David; Cussmann, Ilka; Bunke, Wolf-Dieter; Pleßmann, Guido; Wendiggensen, Jochen

    2018-02-01

    There are several power grid modelling approaches suitable for simulations in the field of power grid planning. The restrictive policies of grid operators, regulators and research institutes concerning their original data and models have led to an increased interest in open source approaches to grid models based on open data. By including all voltage levels between 60 kV (high voltage) and 380 kV (extra-high voltage), we dissolve the common distinction between transmission and distribution grids in energy system models and utilize a single, integrated model instead. An open data set, primarily for Germany, which can be used for non-linear, linear and linear-optimal power flow methods, was developed. This data set consists of an electrically parameterised grid topology as well as allocated generation and demand characteristics for present and future scenarios at high spatial and temporal resolution. The usability of the grid model was demonstrated by performing exemplary power flow optimizations. Based on a marginal-cost-driven power plant dispatch subject to grid restrictions, congested power lines were identified. Continuous validation of the model is necessary in order to reliably model storage and grid expansion in ongoing research.
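    As a minimal illustration of the linear (DC) power flow such a grid model supports, the sketch below solves a made-up three-bus network and flags lines loaded above capacity. All network data here are invented; the eGo model's actual data set and tooling are not reproduced.

```python
# Minimal DC (linearised) power flow on a toy 3-bus network.
# Bus, line and injection data are invented for illustration only.

def solve(A, rhs):
    """Gauss-Jordan elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [rhs[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c and M[r][c] != 0.0:
                f = M[r][c] / M[c][c]
                M[r] = [a - f * b for a, b in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def dc_power_flow(n_bus, lines, injections, slack=0):
    """Solve B' * theta = P with the slack bus angle fixed at zero."""
    idx = [b for b in range(n_bus) if b != slack]
    pos = {b: i for i, b in enumerate(idx)}
    B = [[0.0] * len(idx) for _ in idx]
    for i, j, b, _cap in lines:
        for k in (i, j):                       # diagonal: sum of susceptances
            if k != slack:
                B[pos[k]][pos[k]] += b
        if i != slack and j != slack:          # off-diagonal: -b for each line
            B[pos[i]][pos[j]] -= b
            B[pos[j]][pos[i]] -= b
    theta = [0.0] * n_bus
    for b, angle in zip(idx, solve(B, [injections[b] for b in idx])):
        theta[b] = angle
    flows = [b * (theta[i] - theta[j]) for i, j, b, _cap in lines]
    return theta, flows

# (from_bus, to_bus, susceptance, thermal capacity), all in per-unit terms
lines = [(0, 1, 10.0, 0.3), (1, 2, 10.0, 0.3), (0, 2, 10.0, 0.3)]
injections = [0.0, 0.5, -0.5]   # bus 1 generates, bus 2 consumes, bus 0 is slack
theta, flows = dc_power_flow(3, lines, injections)
congested = [k for k, (f, line) in enumerate(zip(flows, lines)) if abs(f) > line[3]]
```

The congestion check at the end mirrors the abstract's workflow: dispatch fixes the injections, the power flow determines the line loadings, and lines above their capacity are reported.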

  17. OpenZika: An IBM World Community Grid Project to Accelerate Zika Virus Drug Discovery.

    OpenAIRE

    Sean Ekins; Alexander L Perryman; Carolina Horta Andrade

    2016-01-01

    The Zika virus outbreak in the Americas has caused global concern. To help accelerate this fight against Zika, we launched the OpenZika project. OpenZika is an IBM World Community Grid Project that uses distributed computing on millions of computers and Android devices to run docking experiments, in order to dock tens of millions of drug-like compounds against crystal structures and homology models of Zika proteins (and other related flavivirus targets). This will enable the identification of...

  18. 50 CFR Figure 13 to Part 223 - Single Grid Hard TED Escape Opening

    Science.gov (United States)

    2010-10-01

    Figure 13 to Part 223: Single Grid Hard TED Escape Opening (50 CFR, Wildlife and Fisheries; National Marine Fisheries Service, National Oceanic and Atmospheric Administration, Department of Commerce; Marine Mammals, Threatened Marine and Anadromous Species; 2010-10-01).

  19. An open science cloud for scientific research

    Science.gov (United States)

    Jones, Bob

    2016-04-01

    The Helix Nebula initiative was presented at EGU 2013 (http://meetingorganizer.copernicus.org/EGU2013/EGU2013-1510-2.pdf) and has continued to expand with more research organisations, providers and services. The hybrid cloud model deployed by Helix Nebula has grown to become a viable approach for provisioning ICT services for research communities from both public and commercial service providers (http://dx.doi.org/10.5281/zenodo.16001). The relevance of this approach for all those communities facing societal challenges is explained in a recent EIROforum publication (http://dx.doi.org/10.5281/zenodo.34264). This presentation will describe how this model brings together a range of stakeholders to implement a common platform for data-intensive services that builds upon existing publicly funded e-infrastructures and commercial cloud services to promote open science. It explores the essential characteristics of a European Open Science Cloud if it is to address the big data needs of the latest generation of Research Infrastructures. The high-level architecture and key services, as well as the role of standards, are described. A governance and financial model, together with the roles of the stakeholders, including commercial service providers and downstream business sectors, that will ensure a European Open Science Cloud can innovate, grow and be sustained beyond the current project cycles is described.

  20. Collaborative Science Using Web Services and the SciFlo Grid Dataflow Engine

    Science.gov (United States)

    Wilson, B. D.; Manipon, G.; Xing, Z.; Yunck, T.

    2006-12-01

    The General Earth Science Investigation Suite (GENESIS) project is a NASA-sponsored partnership between the Jet Propulsion Laboratory, academia, and NASA data centers to develop a new suite of Web Services tools to facilitate multi-sensor investigations in Earth System Science. The goal of GENESIS is to enable large-scale, multi-instrument atmospheric science using combined datasets from the AIRS, MODIS, MISR, and GPS sensors. Investigations include cross-comparison of spaceborne climate sensors, cloud spectral analysis, study of upper troposphere-stratosphere water transport, study of the aerosol indirect cloud effect, and global climate model validation. The challenges are to bring together very large datasets, reformat and understand the individual instrument retrievals, co-register or re-grid the retrieved physical parameters, perform computationally intensive data fusion and data mining operations, and accumulate complex statistics over months to years of data. To meet these challenges, we have developed a Grid computing and dataflow framework, named SciFlo, in which we are deploying a set of versatile and reusable operators for data access, subsetting, registration, mining, fusion, compression, and advanced statistical analysis. SciFlo leverages remote Web Services, called via Simple Object Access Protocol (SOAP) or REST (one-line) URLs, and the Grid Computing standards (WS-* & Globus Alliance toolkits), and enables scientists to do multi-instrument Earth Science by assembling reusable Web Services and native executables into a distributed computing flow (tree of operators). The SciFlo client & server engines optimize the execution of such distributed data flows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. In particular, SciFlo exploits the wealth of datasets accessible by OpenGIS Consortium (OGC) Web Mapping Servers & Web Coverage Servers (WMS/WCS), and by Open Data
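    The "tree of operators" idea behind such a dataflow engine can be sketched in a few lines. The operator names below (subset, regrid, mean) are hypothetical stand-ins, not SciFlo's actual API, and the engine here runs locally rather than across Grid resources.

```python
# Toy dataflow engine: a flow is a tree of (operator_name, children...) tuples
# whose leaves are literal inputs. Operator names are hypothetical stand-ins.
def execute(node, operators):
    """Recursively evaluate an operator tree bottom-up; leaves pass through."""
    if not isinstance(node, tuple):
        return node                       # literal input (e.g. a data array)
    op, *args = node
    return operators[op](*(execute(a, operators) for a in args))

operators = {
    "subset": lambda data, lo, hi: [x for x in data if lo <= x <= hi],
    "regrid": lambda data, factor: [x * factor for x in data],
    "mean":   lambda data: sum(data) / len(data),
}

flow = ("mean", ("regrid", ("subset", [1, 2, 3, 4, 5], 2, 4), 10))
result = execute(flow, operators)
```

A real engine would additionally schedule subtrees onto remote services and move data between them; the tree evaluation itself is the part sketched here.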

  1. A simple grid implementation with Berkeley Open Infrastructure for Network Computing using BLAST as a model

    Directory of Open Access Journals (Sweden)

    Watthanai Pinthong

    2016-07-01

    Full Text Available Development of high-throughput technologies, such as next-generation sequencing, allows thousands of experiments to be performed simultaneously while reducing resource requirements. Consequently, a massive amount of experimental data is now rapidly generated. Nevertheless, the data are not readily usable or meaningful until they are further analysed and interpreted. Due to the size of the data, a high performance computer (HPC) is required for the analysis and interpretation. However, an HPC is expensive and difficult to access. Other means have been developed to allow researchers to acquire the power of an HPC without needing to purchase and maintain one, such as cloud computing services and grid computing systems. In this study, we implemented grid computing in a computer training center environment using Berkeley Open Infrastructure for Network Computing (BOINC) as a job distributor and data manager, combining all desktop computers to virtualize the HPC. Fifty desktop computers were used to set up a grid system during the off-hours. In order to test the performance of the grid system, we adapted the Basic Local Alignment Search Tool (BLAST) to the BOINC system. Sequencing results from the Illumina platform were aligned to the human genome database by BLAST on the grid system. The results and processing times were compared to those from a single desktop computer and an HPC. The estimated durations of BLAST analysis for 4 million sequence reads on a desktop PC, the HPC and the grid system were 568, 24 and 5 days, respectively. Thus, the grid implementation of BLAST by BOINC is an efficient alternative to the HPC for sequence alignment. The grid implementation by BOINC also helped tap unused computing resources during the off-hours and could be easily modified for other available bioinformatics software.
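    The chunking step of such a grid deployment, packing sequence reads into fixed-size work units for distribution, can be sketched as below. This is an illustrative sketch only; BOINC's own work-unit creation APIs and the BLAST invocation are not shown.

```python
# Sketch: split a multi-record FASTA input into fixed-size work units
# for a BOINC-style grid. Chunking logic only; BOINC APIs not shown.
def split_fasta(text, reads_per_unit):
    """Return lists of '>header\\nsequence' records, reads_per_unit at a time."""
    records, current = [], []
    for line in text.splitlines():
        if line.startswith(">"):          # header starts a new record
            if current:
                records.append("\n".join(current))
            current = [line]
        elif line.strip():                # sequence line (may span several lines)
            current.append(line)
    if current:
        records.append("\n".join(current))
    return [records[i:i + reads_per_unit]
            for i in range(0, len(records), reads_per_unit)]

fasta = ">r1\nACGT\n>r2\nGGCC\n>r3\nTTAA\n"
units = split_fasta(fasta, 2)   # two work units: [r1, r2] and [r3]
```

Each resulting unit would then be shipped to a volunteer host, aligned there, and the per-unit results merged by the server.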

  2. Open Science as a Knowledge Transfer strategy

    Science.gov (United States)

    Grigorov, Ivo; Dalmeier-Thiessen, Suenje

    2015-04-01

    Beyond providing basic understanding of how our Blue Planet functions, flows and breathes, the collection of Earth & Marine Research disciplines is of major service to most of today's Societal Challenges: from Food Security and Sustainable Resource Management, to Renewable Energies, Climate Mitigation & Ecosystem Services and Hazards. Natural Resources are a key commodity in the long-term strategy of the EU Innovation Union(1), and better understanding of the natural processes governing them, as well as science-based management, are seen as key areas for stimulating future economic growth. Such potential places responsibility on research project managers to devise innovative methods to ensure effective transfer of new research to public and private sector users, and to society at large. Open Science is about removing all barriers to the full sphere of basic research knowledge and outputs: not just the publishable part of research but also the data, the software code, and the failed experiments. The concept is central to the EU's Responsible Research and Innovation philosophy(2), and removing barriers to basic research measurably contributes to the EU's Blue Growth Agenda(3). Despite the potential of the internet age to deliver on that promise, only 50% of today's basic research is freely available(4). The talk will demonstrate how and why Open Science can be a first, passive but effective strategy for any research project to transfer knowledge to society, by allowing access and discoverability to the full sphere of new knowledge, not just the published outputs. Apart from contributing to economic growth, Open Science can also optimize collaboration within academia, assist with better engagement of citizen scientists in the research process and co-creation of solutions to societal challenges, and provide a solid ground for more sophisticated communication strategies and Ocean/Earth Literacy initiatives targeting policy makers and the public at large. (1)EC Digital Agenda

  3. Data Grid tools: enabling science on big distributed data

    Energy Technology Data Exchange (ETDEWEB)

    Allcock, Bill [Mathematics and Computer Science, Argonne National Laboratory, Argonne, IL 60439 (United States); Chervenak, Ann [Information Sciences Institute, University of Southern California, Marina del Rey, CA 90291 (United States); Foster, Ian [Mathematics and Computer Science, Argonne National Laboratory, Argonne, IL 60439 (United States); Department of Computer Science, University of Chicago, Chicago, IL 60615 (United States); Kesselman, Carl [Information Sciences Institute, University of Southern California, Marina del Rey, CA 90291 (United States); Livny, Miron [Department of Computer Science, University of Wisconsin, Madison, WI 53705 (United States)

    2005-01-01

    A particularly demanding and important challenge that we face as we attempt to construct the distributed computing machinery required to support SciDAC goals is the efficient, high-performance, reliable, secure, and policy-aware management of large-scale data movement. This problem is fundamental to diverse application domains including experimental physics (high energy physics, nuclear physics, light sources), simulation science (climate, computational chemistry, fusion, astrophysics), and large-scale collaboration. In each case, highly distributed user communities require high-speed access to valuable data, whether for visualization or analysis. The quantities of data involved (terabytes to petabytes), the scale of the demand (hundreds or thousands of users, data-intensive analyses, real-time constraints), and the complexity of the infrastructure that must be managed (networks, tertiary storage systems, network caches, computers, visualization systems) make the problem extremely challenging. Data management tools developed under the auspices of the SciDAC Data Grid Middleware project have become the de facto standard for data management in projects worldwide. Day in and day out, these tools provide the 'plumbing' that allows scientists to do more science on an unprecedented scale in production environments.

  4. Data Grid tools: enabling science on big distributed data

    International Nuclear Information System (INIS)

    Allcock, Bill; Chervenak, Ann; Foster, Ian; Kesselman, Carl; Livny, Miron

    2005-01-01

    A particularly demanding and important challenge that we face as we attempt to construct the distributed computing machinery required to support SciDAC goals is the efficient, high-performance, reliable, secure, and policy-aware management of large-scale data movement. This problem is fundamental to diverse application domains including experimental physics (high energy physics, nuclear physics, light sources), simulation science (climate, computational chemistry, fusion, astrophysics), and large-scale collaboration. In each case, highly distributed user communities require high-speed access to valuable data, whether for visualization or analysis. The quantities of data involved (terabytes to petabytes), the scale of the demand (hundreds or thousands of users, data-intensive analyses, real-time constraints), and the complexity of the infrastructure that must be managed (networks, tertiary storage systems, network caches, computers, visualization systems) make the problem extremely challenging. Data management tools developed under the auspices of the SciDAC Data Grid Middleware project have become the de facto standard for data management in projects worldwide. Day in and day out, these tools provide the 'plumbing' that allows scientists to do more science on an unprecedented scale in production environments

  5. Persistent Identifiers, Discoverability and Open Science (Communication)

    Science.gov (United States)

    Murphy, Fiona; Lehnert, Kerstin; Hanson, Brooks

    2016-04-01

    Early in 2016, the American Geophysical Union announced it was incorporating ORCIDs into its submission workflows. This was accompanied by a strong statement supporting the use of other persistent identifiers - such as IGSNs, and the CrossRef open registry 'funding data'. This was partly in response to funders' desire to track and manage their outputs. However the more compelling argument, and the reason why the AGU has also signed up to the Center for Open Science's Transparency and Openness Promotion (TOP) Guidelines (http://cos.io/top), is that ultimately science and scientists will be the richer for these initiatives due to increased opportunities for interoperability, reproducibility and accreditation. The AGU has appealed to the wider community to engage with these initiatives, recognising that - unlike the introduction of Digital Object Identifiers (DOIs) for articles by CrossRef - full, enriched use of persistent identifiers throughout the scientific process requires buy-in from a range of scholarly communications stakeholders. At the same time, across the general research landscape, initiatives such as Project CRediT (contributor roles taxonomy), Publons (reviewer acknowledgements) and the forthcoming CrossRef DOI Event Tracker are contributing to our understanding and accreditation of contributions and impact. More specifically for earth science and scientists, the cross-functional Coalition for Publishing Data in the Earth and Space Sciences (COPDESS) was formed in October 2014 and is working to 'provide an organizational framework for Earth and space science publishers and data facilities to jointly implement and promote common policies and procedures for the publication and citation of data across Earth Science journals'. Clearly, the judicious integration of standards, registries and persistent identifiers such as ORCIDs and International Geo Sample Numbers (IGSNs) into the research and research output processes is key to the success of this venture.

  6. Open data science technical and cultural aspects

    CERN Multimedia

    CERN. Geneva

    2005-01-01

    Research in STM fields routinely generates and requires large amounts of data in electronic form. The growth of scientific research using infrastructures such as the Grid, the UK's eScience programme and cyberinfrastructure requires the re-use, repurposing and redissemination of this information. Fields like bioinformatics, astronomy, physics, and earth/environmental sciences routinely use such data as primary research input. Much of this work is now carried out by machines which harvest data from multiple sources in dynamic and iterative ways, then validate, filter, compute and republish it. The current publication process and legal infrastructure are now a serious hindrance to this. Most STM data are never published, and the re-usability of those that are is often unclear, as authors and publishers give no explicit permission. However, almost all authors intend that published data (non-copyrightable “facts”) are for the re-use of and redissemination to the STM community and the world in general. Many publishers agree wit...

  7. Achieving Open Access to Conservation Science

    Science.gov (United States)

    Fuller, Richard A; Lee, Jasmine R; Watson, James E M

    2014-01-01

    Conservation science is a crisis discipline in which the results of scientific enquiry must be made available quickly to those implementing management. We assessed the extent to which scientific research published since the year 2000 in 20 conservation science journals is publicly available. Of the 19,207 papers published, 1,667 (8.68%) are freely downloadable from an official repository. Moreover, only 938 papers (4.88%) meet the standard definition of open access, in which material can be freely reused provided attribution is given to the authors. This compares poorly with a comparable set of 20 evolutionary biology journals, where 31.93% of papers are freely downloadable and 7.49% are open access. Seventeen of the 20 conservation journals offer an open access option, but fewer than 5% of the papers are available through open access. The cost of accessing the full body of conservation science runs into tens of thousands of dollars per year for institutional subscribers, and many conservation practitioners cannot access pay-per-view science through their workplace. However, important initiatives such as Research4Life are making science available to organizations in developing countries. We urge authors of conservation science to pay for open access on a per-article basis or to choose publication in open access journals, taking care to ensure the license allows reuse for any purpose provided attribution is given. Currently, it would cost $51 million to make all conservation science published since 2000 freely available by paying the open access fees currently levied to authors. Publishers of conservation journals might consider more cost-effective models for open access, and conservation-oriented organizations running journals could consider a broader range of options for open access to nonmembers, such as sponsorship of open access via membership fees. Achieving Open Access to Conservation Science. Abstract: Conservation science is a

  8. Grid Information Technology as a New Technological Tool for e-Science, Healthcare and Life Science

    Directory of Open Access Journals (Sweden)

    Juan Manuel Maqueira Marín

    2007-06-01

    Full Text Available Nowadays, scientific projects require collaborative environments and powerful computing resources capable of handling huge quantities of data, which gives rise to e-Science. These requirements are evident in the need to optimise time and efforts in activities to do with health. When e-Science focuses on the collaborative handling of all the information generated in clinical medicine and health, e-Health is the result. Scientists are taking increasing interest in an emerging technology – Grid Information Technology – that may offer a solution to their current needs. The current work aims to survey how e-Science is using this technology all around the world. We also argue that the technology may provide an ideal solution for the new challenges facing e-Health and Life Science.

  9. Earth observation open science and innovation

    CERN Document Server

    Aubrecht, Christoph

    2018-01-01

    This book is published open access under a CC BY 4.0 license. Over  the  past  decades,  rapid developments in digital and sensing technologies, such  as the Cloud, Web and Internet of Things, have dramatically changed the way we live and work. The digital transformation is revolutionizing our ability to monitor our planet and transforming the  way we access, process and exploit Earth Observation data from satellites. This book reviews these megatrends and their implications for the Earth Observation community as well as the wider data economy. It provides insight into new paradigms of Open Science and Innovation applied to space data, which are characterized by openness, access to large volume of complex data, wide availability of new community tools, new techniques for big data analytics such as Artificial Intelligence, unprecedented level of computing power, and new types of collaboration among researchers, innovators, entrepreneurs and citizen scientists. In addition, this book aims to provide reade...

  10. Reputation, Pricing and the E-Science Grid

    Science.gov (United States)

    Anandasivam, Arun; Neumann, Dirk

    One of the fundamental aspects of efficient Grid usage is the optimization of resource allocation among the participants. However, this has not yet materialized. Each user is a self-interested participant trying to maximize his utility, where utility is determined not only by the fastest completion time but also by price. Future revenues are influenced by users' reputation. Reputation mechanisms help to build trust between loosely coupled and geographically distributed participants. Providers need an incentive not to selfishly cancel jobs or privilege their own jobs. In this chapter we first present an offline scheduling mechanism with a fixed price. Jobs are collected by a broker and scheduled to machines. The goal of the broker is to balance the load and to maximize the revenue in the network. Consumers can submit their jobs according to their preferences, while taking the incentives of the broker into account. This mechanism does not consider reputation. In a second step, a reputation-based pricing mechanism for a simple but fair pricing of resources is analyzed. In e-Science, researchers do not appreciate idiosyncratic pricing strategies and policies. Their interest lies in doing research in an efficient manner. Consequently, in our mechanism the price is tightly coupled to the reputation of a site to guarantee fairness of pricing and facilitate price determination. Furthermore, price is not the only parameter, as completion time plays an important role when deadlines have to be met. We provide a flexible utility and decision model for every participant and analyze the outcome of our reputation-based pricing system via simulation.
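    The coupling of price to site reputation described above can be sketched in a few lines. This is a minimal illustrative model only: the exponential-moving-average reputation update, the linear price rule, and all parameter names are assumptions for the sketch, not the mechanism analyzed in the chapter.

    ```python
    # Illustrative reputation-coupled pricing (assumed model, not the chapter's):
    # a site's reputation is an exponential moving average of its job outcomes,
    # and the price it may charge scales linearly with that reputation, so
    # selfishly cancelling jobs directly reduces future revenue.

    BASE_PRICE = 10.0   # price charged by a provider with perfect reputation
    ALPHA = 0.1         # smoothing factor for the reputation update

    def update_reputation(reputation, job_completed):
        """EMA over job outcomes (1 = completed, 0 = cancelled)."""
        outcome = 1.0 if job_completed else 0.0
        return (1 - ALPHA) * reputation + ALPHA * outcome

    def price(reputation):
        """Price tightly coupled to reputation: low-reputation sites charge less."""
        return BASE_PRICE * reputation

    rep = 1.0
    for completed in [True, True, False, True]:   # one selfish cancellation
        rep = update_reputation(rep, completed)

    print(round(price(rep), 2))   # prints 9.1
    ```

    Even a single cancellation lowers the admissible price for many subsequent jobs, which is the incentive effect the mechanism relies on.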

  11. Open Science and Open Data: Evolving Business Models

    OpenAIRE

    Melero, Remedios

    2013-01-01

    The rise of ICT has changed the way scientific inputs and outputs are disseminated and diffused. As a consequence, new business models for open access to Scientific publications and datasets are emerging. This session will explore the new features of the business models for open access and open data as well as the associated benefits and risks.

  12. An open science peer review oath

    DEFF Research Database (Denmark)

    Aleksic, Jelena; Adrian Alexa, Adrian Alexa; Attwood, Teresa K.

    2015-01-01

    One of the foundations of the scientific method is to be able to reproduce experiments and corroborate the results of research that has been done before. However, with the increasing complexities of new technologies and techniques, coupled with the specialisation of experiments, reproducing research findings has become a growing challenge. Clearly, scientific methods must be conveyed succinctly, and with clarity and rigour, in order for research to be reproducible. Here, we propose steps to help increase the transparency of the scientific method and the reproducibility of research results: specifically, we introduce a peer-review oath and accompanying manifesto. These have been designed to offer guidelines to enable reviewers (with the minimum friction or bias) to follow and apply open science principles, and support the ideas of transparency, reproducibility and ultimately greater societal...

  13. Cuban Science and the Open Access Alternative

    CERN Document Server

    Arencibia Jorge, Ricardo; Torricella-Morales, Raúl G

    2004-01-01

    Science in Cuba has experienced extraordinary development since the triumph of the Cuban Revolution, in spite of the blockade to which Cuba has been subjected by the United States Government, and thanks to the support and cooperation of the countries that were part of the former Socialist Bloc. However, after the dissolution of the Socialist Bloc, the Cuban economy went through a restructuring process that included the reorganization of the traditional systems for spreading scientific information. At that moment, it became necessary to use alternative means to effectively publicise, to the international scientific community, the information generated by Cuban scientists and scholars. This paper briefly reviews this new era, the institutions that led the process of change, and the future projections based on knowledge of the digital environment and the creation of electronic and open access information sources.

  14. Trinity Phase 2 Open Science: CTH

    Energy Technology Data Exchange (ETDEWEB)

    Ruggirello, Kevin Patrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vogler, Tracy [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-08-01

    CTH is an Eulerian hydrocode developed by Sandia National Laboratories (SNL) to solve a wide range of shock wave propagation and material deformation problems. Adaptive mesh refinement is also used to improve efficiency for problems with a wide range of spatial scales. The code has a history of running on a variety of computing platforms ranging from desktops to massively parallel distributed-data systems. For the Trinity Phase 2 Open Science campaign, CTH was used to study mesoscale simulations of the hypervelocity penetration of granular SiC powders. The simulations were compared to experimental data. A scaling study of CTH up to 8192 KNL nodes was also performed, and several improvements were made to the code to improve the scalability.

  15. Catalyzing Open and Collaborative Science to Address Global ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    As the cost of computer hardware continues to drop and developing-country researchers gain increased access to the Internet and mobile phones, these technologies offer the potential to solve development challenges by opening up the scientific process. What is open science? At the heart of the open science concept is the ...

  16. Integrating scientific data for drug discovery and development using the Life Sciences Grid.

    Science.gov (United States)

    Dow, Ernst R; Hughes, James B; Stephens, Susie M; Narayan, Vaibhav A; Bishop, Richard W

    2009-06-01

    There are many daunting challenges for companies who wish to bring novel drugs to market. The information complexity around potential drug targets has increased greatly with the introduction of microarrays, high-throughput screening and other technological advances over the past decade, but has not yet fundamentally increased our understanding of how to modify a disease with pharmaceuticals. Further, the bar has been raised in getting a successful drug to market as just being new is no longer enough: the drug must demonstrate improved performance compared with the ever increasing generic pharmacopeia to gain support from payers and government authorities. In addition, partly as a consequence of a climate of concern regarding the safety of drugs, regulatory authorities have approved fewer new molecular entities compared to historical norms over the past few years. To overcome these challenges, the pharmaceutical industry must fully embrace information technology to bring better understood compounds to market. An important first step in addressing an unmet medical need is in understanding the disease and identifying the physiological target(s) to be modulated by the drug. Deciding which targets to pursue for a given disease requires a multidisciplinary effort that integrates heterogeneous data from many sources, including genetic variations of populations, changes in gene expression and biochemical assays. The Life Science Grid was developed to provide a flexible framework to integrate such diverse biological, chemical and disease information to help scientists make better-informed decisions. The Life Science Grid has been used to rapidly and effectively integrate scientific information in the pharmaceutical industry and has been placed in the open source community to foster collaboration in the life sciences community.

  17. Grid Integration Science, NREL Power Systems Engineering Center

    Energy Technology Data Exchange (ETDEWEB)

    Kroposki, Benjamin [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-04-25

    This report highlights journal articles published in 2016 by researchers in the Power Systems Engineering Center. NREL's Power Systems Engineering Center published 47 journal and magazine articles in the past year, highlighting recent research in grid modernization.

  18. Open Science: Open source licenses in scientific research

    OpenAIRE

    Guadamuz, Andres

    2006-01-01

    The article examines the validity of OSS (open source software) licenses for scientific, as opposed to creative works. It draws on examples of OSS licenses to consider their suitability for the scientific community and scientific research.

  19. How open science helps researchers succeed

    Science.gov (United States)

    McKiernan, Erin C; Bourne, Philip E; Brown, C Titus; Buck, Stuart; Kenall, Amye; Lin, Jennifer; McDougall, Damon; Nosek, Brian A; Ram, Karthik; Soderberg, Courtney K; Spies, Jeffrey R; Thaney, Kaitlin; Updegrove, Andrew; Woo, Kara H; Yarkoni, Tal

    2016-01-01

    Open access, open data, open source and other open scholarship practices are growing in popularity and necessity. However, widespread adoption of these practices has not yet been achieved. One reason is that researchers are uncertain about how sharing their work will affect their careers. We review literature demonstrating that open research is associated with increases in citations, media attention, potential collaborators, job opportunities and funding opportunities. These findings are evidence that open research practices bring significant benefits to researchers relative to more traditional closed practices. DOI: http://dx.doi.org/10.7554/eLife.16800.001 PMID:27387362

  20. Challenges facing production grids

    Energy Technology Data Exchange (ETDEWEB)

    Pordes, Ruth; /Fermilab

    2007-06-01

    Today's global communities of users expect quality of service from distributed Grid systems equivalent to that of their local data centers. This must be coupled with ubiquitous access to the ensemble of processing and storage resources across multiple Grid infrastructures. We still face significant challenges in meeting these expectations, especially in the underlying security, a sustainable and successful economic model, and smoothing the boundaries between administrative and technical domains. Using the Open Science Grid as an example, I examine the status and challenges of Grids operating in production today.

  1. Secondary Students' Perceptions of Open Science Textbooks

    Science.gov (United States)

    Morales, Rebecca; Baker, Alesha

    2018-01-01

    In an attempt to align instructional resources with new state standards and to increase teacher awareness of these standards, one large suburban public school district piloted the development and adoption of open secondary science textbooks. Open textbooks created by teachers in grades six through nine replaced conventional science textbooks…

  2. Research and Deployment a Hospital Open Software Platform for e-Health on the Grid System at VAST/IAMI

    Science.gov (United States)

    van Tuyet, Dao; Tuan, Ngo Anh; van Lang, Tran

    Grid computing has attracted increasing attention in recent years from scientists in many fields, and as a result many Grid systems have been built to serve users' demands. Tools for developing Grid systems, such as Globus, gLite and Unicore, are still under continuous development. In particular, gLite, a Grid middleware, has been developed by the European scientific community in recent years. The constant growth of Grid technology has opened the way for new opportunities in terms of information and data exchange in a secure and collaborative context. These new opportunities can be exploited to offer physicians new telemedicine services in order to improve their collaborative capacities. Our platform gives physicians an easy-to-use telemedicine environment for managing and sharing patients' information (such as electronic medical records and DICOM-formatted images) between remote locations. This paper presents the Grid infrastructure based on gLite; some main components of gLite; the challenge scenario in which new applications can be developed to improve collaborative work between scientists; and the process of deploying the Hospital Open software Platform for E-health (HOPE) on the Grid.

  3. An Open Framework for Low-Latency Communications across the Smart Grid Network

    Science.gov (United States)

    Sturm, John Andrew

    2011-01-01

    The recent White House (2011) policy paper for the Smart Grid, released on June 13, 2011, "A Policy Framework for the 21st Century Grid: Enabling Our Secure Energy Future," defines four major problems to be solved; the one addressed in this dissertation is Securing the Grid. Securing the Grid is referred to as one of…

  4. OpenSesame: An Open-source, Graphical Experiment Builder for the Social Sciences

    NARCIS (Netherlands)

    Mathot, S.; Schreij, D.B.B.; Theeuwes, J.

    2012-01-01

    In the present article, we introduce OpenSesame, a graphical experiment builder for the social sciences. OpenSesame is free, open-source, and cross-platform. It features a comprehensive and intuitive graphical user interface and supports Python scripting for complex tasks. Additional functionality,

  5. Stepping up Open Science Training for European Research

    Directory of Open Access Journals (Sweden)

    Birgit Schmidt

    2016-06-01

    Full Text Available Open science refers to all things open in research and scholarly communication: from publications and research data to code, models and methods, as well as quality evaluation based on open peer review. However, getting started with implementing open science might not be straightforward for all stakeholders. For example, what do research funders expect in terms of open access to publications and/or research data? Where and how should research data be published? How can one ensure that research results are reproducible? These are all legitimate questions, and early career researchers in particular may benefit from additional guidance and training. In this paper we review the activities of the European-funded FOSTER project, which organized and supported a wide range of targeted trainings for open science, based on face-to-face events and a growing suite of e-learning courses. This article reviews the approach and experiences gained from the first two years of the project.

  6. The EPOS Vision for the Open Science Cloud

    Science.gov (United States)

    Jeffery, Keith; Harrison, Matt; Cocco, Massimo

    2016-04-01

    Cloud computing offers dynamic elastic scalability for data processing on demand. For much research activity, demand for computing is uneven over time, and so Cloud computing offers both cost-effectiveness and capacity advantages. However, as reported repeatedly by the EC Cloud Expert Group, there are barriers to the uptake of Cloud computing: (1) security and privacy; (2) interoperability (avoidance of lock-in); (3) lack of appropriate systems development environments for application programmers to characterise their applications to allow Cloud middleware to optimize their deployment and execution. From CERN, the Helix-Nebula group has proposed the architecture for the European Open Science Cloud. They are discussing with other e-Infrastructure groups such as EGI (GRIDs), EUDAT (data curation), AARC (network authentication and authorisation) and also with the EIROFORUM group of 'international treaty' RIs (Research Infrastructures) and the ESFRI (European Strategic Forum for Research Infrastructures) RIs, including EPOS. Many of these RIs are either e-RIs (electronic RIs) or have an e-RI interface for access and use. The EPOS architecture is centred on a portal: ICS (Integrated Core Services). The architectural design already allows for access to e-RIs (which may include any or all of data, software, users and resources such as computers or instruments). Those within any one domain (subject area) of EPOS are considered within the TCS (Thematic Core Services). Those outside, or available across multiple domains of EPOS, are ICS-d (Integrated Core Services-Distributed), since the intention is that they will be used by any or all of the TCS via the ICS. Another such service type is CES (Computational Earth Science); effectively an ICS-d specializing in high performance computation, analytics, simulation or visualization offered by a TCS for others to use. Already discussions are underway between EPOS and EGI, EUDAT, AARC and Helix-Nebula for those offerings to be

  7. The Historical Origins and Economic Logic of 'Open Science'

    CERN Multimedia

    CERN. Geneva

    2008-01-01

    Modern "big science" projects, such as the LHC experiments in physics that are being prepared to run at CERN, embody the distinctive ethos of cooperation and mechanisms of coordination among distributed groups of researchers that are characteristic of 'open science'. Much has been written about the institutions of open science, their supporting social norms, and their effectiveness in generating additions to the stock of reliable knowledge. But from where have these institutions and their supporting ethos come? How robust can we assume them to be in the face of the recent trends for universities and research institutes in some domains of science to seek to appropriate the benefits of new discoveries and inventions by asserting intellectual property claims? A search for the historical origins of the institutions of open science throws some new light on these issues, and the answers may offer some lessons for contemporary science and technology policy-making.

  8. Proceedings of the Spanish Conference on e-Science Grid Computing. March 1-2, 2007. Madrid (Spain)

    International Nuclear Information System (INIS)

    Casado, J.; Mayo, R.; Munoz, R.

    2007-01-01

    The Spanish Conference on e-Science Grid Computing and the EGEE-EELA Industrial Day (http://webrt.ciemat.es:8000/e-science/index.html) are the first edition of this open forum for the integration of Grid technologies and their applications in the Spanish community. It has been organised by CIEMAT and CETA-CIEMAT, sponsored by IBM and HP, and supported by the European Community through its funded projects EELA, EUChinaGrid and EUMedGrid. To all of them, the conference is very grateful. e-Science is the concept that defines those activities developed by using geographically distributed resources, which scientists (or anyone else) can access through the Internet. However, the commercial Internet does not provide resources such as computation and massive storage (those most frequently in demand in the field of e-Science), since they require high-speed networks devoted to research. These networks, alongside the collaborative work applications developed within them, are creating an ideal scenario for interaction among researchers. Thus, this technology that interconnects a huge variety of computers, information repositories, applications software and scientific tools will change society in the next few years. The science, industry and services systems will benefit from its immense capacity of computation, which will improve the quality of life and the well-being of citizens. The future generation of technologies, which will reach all of these areas in society, such as research, medicine, engineering, economy and entertainment, will be based on integrated computers and networks, rendering a very high quality of services and applications through a friendly interface. The conference aims at becoming a liaison framework between Spanish and international developers and users of e-Science applications and at implementing these technologies in Spain. It intends to be a forum where the state of the art of different European projects on e-Science is shown, as well as developments in the research

  9. Can psychology walk the walk of open science?

    Science.gov (United States)

    Hesse, Bradford W

    2018-01-01

    An "open science movement" is gaining traction across many disciplines within the research enterprise but is also precipitating consternation among those who worry that too much disruption may be hampering professional productivity. Despite this disruption, proponents of open data collaboration have argued that some of the biggest problems of the 21st century need to be solved with the help of many people and that data sharing will be the necessary engine to make that happen. In the United States, a national strategic plan for data sharing encouraged the federally funded scientific agencies to (a) publish open data for community use in discoverable, machine-readable, and useful ways; (b) work with public and civil society organizations to set priorities for data to be shared; (c) support innovation and feedback on open data solutions; and (d) continue efforts to release and enhance high-priority data sets funded by taxpayer dollars. One of the more visible open data projects in the psychological sciences is the presidentially announced "Brain Research Through Advancing Innovative Neurotechnologies" (BRAIN) initiative. Lessons learned from initiatives such as these are instructive both from the perspective of open science within psychology and from the perspective of understanding the psychology of open science. Recommendations for creating better pathways to "walk the walk" in open science include (a) nurturing innovation and agile learning, (b) thinking outside the paradigm, (c) creating simplicity from complexity, and (d) participating in continuous learning evidence platforms. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  10. Implementation of Grid Tier 2 and Tier 3 facilities on a Distributed OpenStack Cloud

    International Nuclear Information System (INIS)

    Limosani, Antonio; Boland, Lucien; Crosby, Sean; Huang, Joanna; Sevior, Martin; Coddington, Paul; Zhang, Shunde; Wilson, Ross

    2014-01-01

    The Australian Government is making a $AUD 100 million investment in Compute and Storage for the academic community. The Compute facilities are provided in the form of 30,000 CPU cores located at 8 nodes around Australia in a distributed virtualized Infrastructure as a Service facility based on OpenStack. The storage will eventually consist of over 100 petabytes located at 6 nodes. All will be linked via a 100 Gb/s network. This proceeding describes the development of a fully connected WLCG Tier-2 grid site as well as a general purpose Tier-3 computing cluster based on this architecture. The facility employs an extension to Torque to enable dynamic allocations of virtual machine instances. A base Scientific Linux virtual machine (VM) image is deployed in the OpenStack cloud and automatically configured as required using Puppet. Custom scripts are used to launch multiple VMs, integrate them into the dynamic Torque cluster and to mount remote file systems. We report on our experience in developing this nation-wide ATLAS and Belle II Tier 2 and Tier 3 computing infrastructure using the national Research Cloud and storage facilities.

  11. Implementation of Grid Tier 2 and Tier 3 facilities on a Distributed OpenStack Cloud

    Science.gov (United States)

    Limosani, Antonio; Boland, Lucien; Coddington, Paul; Crosby, Sean; Huang, Joanna; Sevior, Martin; Wilson, Ross; Zhang, Shunde

    2014-06-01

    The Australian Government is making a AUD 100 million investment in Compute and Storage for the academic community. The Compute facilities are provided in the form of 30,000 CPU cores located at 8 nodes around Australia in a distributed virtualized Infrastructure as a Service facility based on OpenStack. The storage will eventually consist of over 100 petabytes located at 6 nodes. All will be linked via a 100 Gb/s network. This proceeding describes the development of a fully connected WLCG Tier-2 grid site as well as a general purpose Tier-3 computing cluster based on this architecture. The facility employs an extension to Torque to enable dynamic allocations of virtual machine instances. A base Scientific Linux virtual machine (VM) image is deployed in the OpenStack cloud and automatically configured as required using Puppet. Custom scripts are used to launch multiple VMs, integrate them into the dynamic Torque cluster and to mount remote file systems. We report on our experience in developing this nation-wide ATLAS and Belle II Tier 2 and Tier 3 computing infrastructure using the national Research Cloud and storage facilities.
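    The custom launch scripts mentioned in the record could, in spirit, look like the following sketch, which only generates `openstack server create` command lines for worker VMs whose first-boot user-data script would run Puppet and register the node with the dynamic Torque cluster. The image, flavor, key and `join_torque.sh` names are hypothetical; the facility's actual scripts are not detailed in the record.

    ```python
    # Hypothetical dry-run sketch of a worker-VM launch script: build one
    # "openstack server create" command per worker. The assumed user-data
    # script (join_torque.sh) would run Puppet on first boot and join the
    # VM to the dynamic Torque cluster.

    def worker_launch_commands(n_workers, image="sl6-worker", flavor="m1.large",
                               key_name="grid-key", user_data="join_torque.sh"):
        """Return one CLI command per worker VM, named worker-000, worker-001, ..."""
        return [
            (f"openstack server create --image {image} --flavor {flavor} "
             f"--key-name {key_name} --user-data {user_data} worker-{i:03d}")
            for i in range(n_workers)
        ]

    if __name__ == "__main__":
        # Print the commands instead of executing them (a dry run).
        for cmd in worker_launch_commands(3):
            print(cmd)
    ```

    Generating the commands separately from executing them makes the script easy to audit before pointing it at a production OpenStack endpoint.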

  12. AMP: a science-driven web-based application for the TeraGrid

    Science.gov (United States)

    Woitaszek, M.; Metcalfe, T.; Shorrock, I.

    The Asteroseismic Modeling Portal (AMP) provides a web-based interface for astronomers to run and view simulations that derive the properties of Sun-like stars from observations of their pulsation frequencies. In this paper, we describe the architecture and implementation of AMP, highlighting the lightweight design principles and tools used to produce a functional fully-custom web-based science application in less than a year. Targeted as a TeraGrid science gateway, AMP's architecture and implementation are intended to simplify its orchestration of TeraGrid computational resources. AMP's web-based interface was developed as a traditional standalone database-backed web application using the Python-based Django web development framework, allowing us to leverage the Django framework's capabilities while cleanly separating the user interface development from the grid interface development. We have found this combination of tools flexible and effective for rapid gateway development and deployment.
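    The clean separation AMP draws between web-interface development and grid-interface development is commonly achieved with a shared job table: the web layer only inserts and reads job records, while a separate agent claims pending jobs and talks to the computational resource. A minimal, self-contained sketch of that pattern follows; the table and field names are illustrative, not AMP's actual schema.

    ```python
    import sqlite3

    # Minimal sketch of a database-backed job queue decoupling a web front end
    # from a grid-submission agent. Schema and names are illustrative only.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE jobs (id INTEGER PRIMARY KEY, params TEXT, status TEXT)")

    def web_submit(params):
        """Called by the web layer: record the request, never touch the grid."""
        cur = conn.execute(
            "INSERT INTO jobs (params, status) VALUES (?, 'PENDING')", (params,))
        conn.commit()
        return cur.lastrowid

    def agent_poll():
        """Called by the grid agent: claim pending jobs for submission."""
        rows = conn.execute(
            "SELECT id, params FROM jobs WHERE status = 'PENDING'").fetchall()
        for job_id, _params in rows:
            # ...here the agent would submit _params to the compute resource...
            conn.execute("UPDATE jobs SET status = 'SUBMITTED' WHERE id = ?", (job_id,))
        conn.commit()
        return [r[0] for r in rows]

    job = web_submit("freq_list=obs42")   # hypothetical job parameters
    claimed = agent_poll()
    ```

    Because the two sides share only the database, either can be rewritten (a new web framework, a new grid middleware) without touching the other, which is the flexibility the AMP authors highlight.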

  13. Catalyzing Open and Collaborative Science to Address Global ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Climate change, environmental degradation, emerging infectious diseases, ... Examples include crowdsourcing to map and monitor deforestation in Brazil to support conservation efforts in the Amazon. ... The costs and risks of open science

  14. Open Science Interview mit Sönke Bartling

    OpenAIRE

    Scheliga, Kaja

    2014-01-01

    This interview is part of a series of interviews on open science and digital scholarship conducted in 2013 with researchers from various backgrounds. For an analysis of the interviews see: Scheliga, Kaja and Sascha Friesike. 2014. “Putting open science into practice: A social dilemma?” First Monday. Volume 19, Number 9. DOI: http://dx.doi.org/10.5210/fm.v19i9.5381

  15. Open Science Interview with Carolina Ödman-Govender

    OpenAIRE

    Scheliga, Kaja

    2014-01-01

    This interview is part of a series of interviews on open science and digital scholarship conducted in 2013 with researchers from various backgrounds. For an analysis of the interviews see: Scheliga, Kaja and Sascha Friesike. 2014. “Putting open science into practice: A social dilemma?” First Monday. Volume 19, Number 9. DOI: http://dx.doi.org/10.5210/fm.v19i9.5381

  16. The GENIUS Grid Portal and robot certificates: a new tool for e-Science.

    Science.gov (United States)

    Barbera, Roberto; Donvito, Giacinto; Falzone, Alberto; La Rocca, Giuseppe; Milanesi, Luciano; Maggi, Giorgio Pietro; Vicario, Saverio

    2009-06-16

    Grid technology is the computing model which allows users to share a plethora of distributed computational resources regardless of their geographical location. Up to now, the strict security requirements for accessing distributed computing resources have been a significant limiting factor in broadening the usage of Grids to a wide community of users. Grid security is indeed based on the Public Key Infrastructure (PKI) of X.509 certificates, and the procedure to obtain and manage those certificates is unfortunately not straightforward. A first step towards making Grids more appealing for new users has recently been achieved with the adoption of robot certificates. Robot certificates have recently been introduced to perform automated tasks on Grids on behalf of users. They are extremely useful, for instance, to automate grid service monitoring, data processing production, and distributed data collection systems. These certificates can be used to identify a person responsible for an unattended service or process acting as client and/or server. Robot certificates can be installed on a smart card and used behind a portal by everyone interested in running the related applications in a Grid environment using a user-friendly graphic interface. In this work, the GENIUS Grid Portal, powered by EnginFrame, has been extended in order to support the new authentication based on the adoption of these robot certificates. The work carried out and reported in this manuscript is particularly relevant for all users who are not familiar with personal digital certificates and the technical aspects of the Grid Security Infrastructure (GSI). The valuable benefits introduced by robot certificates in e-Science can thus be extended to users belonging to several scientific domains, providing an asset in raising Grid awareness among a wide number of potential users. The adoption of Grid portals extended with robot certificates can really contribute to creating transparent access to

  17. Open Science: Dimensions to a new scientific practice

    Directory of Open Access Journals (Sweden)

    Adriana Carla Silva de Oliveira

    2016-08-01

    Full Text Available Introduction: The practices of e-science and the use and reuse of scientific data have constituted a new mode of scientific work that leads to reflection on new regulatory, legal, institutional and technological frameworks for open science. Objective: This study poses the following research question: which dimensions provide sustainability for the formulation of a policy geared to open science and its practices in the Brazilian context? The aim of this study is to discuss the dimensions that transversally support the formulation of a policy for open science and its scientific practices. Methodology: Theoretically, the study is guided by the fourth scientific paradigm grounded in e-Science. The methodology is supported by Bufrem’s studies (2013), which propose an alternative and multidimensional model for the analysis and discussion of scientific research. Technically, literature review and documentary survey were the methods used, applied to the scientific Data Lifecycle model, laws and international agreements. For the purposes of this study, five dimensions were proposed, namely: epistemological, political, ethical-legal-cultural, morphological, and technological. Results: This study understands that these dimensions substantiate an information policy, or the development of minimum guidelines, for the open science agenda in Brazil. Conclusions: The dimensions move beyond a reductionist perspective on research data and lead the study toward a multidimensional and multirelational vision of open science.

  18. Open-science projects get kickstarted at CERN

    CERN Multimedia

    Achintya Rao

    2015-01-01

    CERN is one of the host sites for the Mozilla Science Lab Global Sprint to be held on 4 and 5 June, which will see participants around the world work on projects to further open science and educational tools.   IdeaSquare will be hosting the event at CERN. The Mozilla Science Lab Global Sprint was first held in 2014 to bring together open-science practitioners and enthusiasts to collaborate on projects designed to advance science on the open web. The sprint is a loosely federated event, and CERN is participating in the 2015 edition, hosting sprinters in the hacker-friendly IdeaSquare. Five projects have been formally proposed and CERN users and staff are invited to participate in a variety of ways. A special training session will also be held to introduce the CERN community to existing open-science and collaborative tools, including ones that have been deployed at CERN. 1. GitHub Science Badges: Sprinters will work on developing a badge-style visual representation of how open a software pro...

  19. Putting open science into practice: A social dilemma?

    NARCIS (Netherlands)

    Scheliga, Kaja; Friesike, Sascha

    2014-01-01

    Digital technologies carry the promise of transforming science and opening up the research process. We interviewed researchers from a variety of backgrounds about their attitudes towards and experiences with openness in their research practices. We observe a considerable discrepancy between the

  20. The Open Access Availability of Library and Information Science Literature

    Science.gov (United States)

    Way, Doug

    2010-01-01

    To examine the open access availability of Library and Information Science (LIS) research, a study was conducted using Google Scholar to search for articles from 20 top LIS journals. The study examined whether Google Scholar was able to find any links to full text, if open access versions of the articles were available and where these articles…

  1. Leveraging Open Standards and Technologies to Enhance Community Access to Earth Science Lidar Data

    Science.gov (United States)

    Crosby, C. J.; Nandigam, V.; Krishnan, S.; Cowart, C.; Baru, C.; Arrowsmith, R.

    2011-12-01

    Lidar (Light Detection and Ranging) data, collected from space, airborne and terrestrial platforms, have emerged as an invaluable tool for a variety of Earth science applications ranging from ice sheet monitoring to modeling of earth surface processes. However, lidar data present a unique suite of challenges from the perspective of building cyberinfrastructure systems that enable the scientific community to access these valuable research datasets. Lidar data are typically characterized by millions to billions of individual measurements of x,y,z position plus attributes; these "raw" data are also often accompanied by derived raster products and are frequently terabytes in size. As a relatively new and rapidly evolving data collection technology, relevant open data standards and software projects are immature compared to those for other remote sensing platforms. The NSF-funded OpenTopography Facility project has developed an online lidar data access and processing system that co-locates data with on-demand processing tools to enable users to access both raw point cloud data as well as custom derived products and visualizations. OpenTopography is built on a Service Oriented Architecture (SOA) in which applications and data resources are deployed as standards-compliant (XML and SOAP) Web services with the open source Opal Toolkit. To develop the underlying applications for data access, filtering and conversion, and various processing tasks, OpenTopography has heavily leveraged existing open source software efforts for both lidar and raster data. Operating on the de facto LAS binary point cloud format (maintained by ASPRS), the open source libLAS and LASlib libraries provide OpenTopography's data ingestion, query and translation capabilities. Similarly, raster data manipulation is performed through a suite of services built on the Geospatial Data Abstraction Library (GDAL). OpenTopography has also developed its own algorithm for high-performance gridding of lidar point cloud data.
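    The gridding step mentioned above, turning an irregular cloud of (x, y, z) points into a regular raster, reduces in its simplest form to binning points into cells and aggregating z per cell. The toy sketch below illustrates the idea only; OpenTopography's actual high-performance algorithm (with nodata handling, interpolation and out-of-core processing) is not reproduced here.

    ```python
    from collections import defaultdict

    def grid_points(points, cell_size):
        """Bin (x, y, z) points into square cells and return mean z per cell.

        Returns {(col, row): mean_z}. A toy stand-in for production gridding,
        which must also handle nodata cells, interpolation, and huge inputs.
        """
        sums = defaultdict(lambda: [0.0, 0])  # cell -> [sum of z, point count]
        for x, y, z in points:
            key = (int(x // cell_size), int(y // cell_size))
            sums[key][0] += z
            sums[key][1] += 1
        return {key: total / n for key, (total, n) in sums.items()}

    # Three points: two fall in cell (0, 0), one in cell (1, 0).
    pts = [(0.2, 0.3, 10.0), (0.8, 0.1, 14.0), (1.5, 0.4, 7.0)]
    raster = grid_points(pts, cell_size=1.0)
    ```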

  2. LLNL Mercury Project Trinity Open Science Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Dawson, Shawn A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-08-17

    The Mercury Monte Carlo particle transport code is used to simulate the transport of radiation through urban environments. These challenging calculations include complicated geometries and require significant computational resources to complete. In the proposed Trinity Open Science calculations, I will investigate computer science aspects of the code which are relevant to convergence of the simulation quantities with increasing Monte Carlo particle counts.

  3. Pre-Service Teachers’ Attitudes Toward Teaching Science and Their Science Learning at Indonesia Open University

    Directory of Open Access Journals (Sweden)

    Nadi SUPRAPTO

    2017-10-01

    Full Text Available This study focuses on attitudes toward (teaching) science and the learning of science for primary school among pre-service teachers at the Open University of Indonesia. A three-year longitudinal survey was conducted, involving 379 students as pre-service teachers (PSTs) from the Open University in the Surabaya regional office. The Attitudes Toward (Teaching) Science (ATS) instrument was used to portray PSTs’ preparation for becoming primary school teachers. Data analyses included descriptive analysis and confirmatory factor analysis. The model fit of attitudes toward (teaching) science can be described along seven dimensions: self-efficacy for teaching science, the relevance of teaching science, gender-stereotypical beliefs, anxiety in teaching science, the difficulty of teaching science, perceived dependency on contextual factors, and enjoyment in teaching science. The results also describe what science learning at the Open University of Indonesia looks like. Implications for primary teacher education are discussed.

  4. Tunable Reaction Potentials in Open Framework Nanoparticle Battery Electrodes for Grid-Scale Energy Storage

    KAUST Repository

    Wessells, Colin D.

    2012-02-28

    The electrical energy grid has a growing need for energy storage to address short-term transients, frequency regulation, and load leveling. Though electrochemical energy storage devices such as batteries offer an attractive solution, current commercial battery technology cannot provide adequate power, cycle life, and energy efficiency at a sufficiently low cost. Copper hexacyanoferrate and nickel hexacyanoferrate, two open framework materials with the Prussian Blue structure, were recently shown to offer ultralong cycle life and high-rate performance when operated as battery electrodes in safe, inexpensive aqueous sodium ion and potassium ion electrolytes. In this report, we demonstrate that the reaction potential of copper-nickel alloy hexacyanoferrate nanoparticles may be tuned by controlling the ratio of copper to nickel in these materials. X-ray diffraction, TEM energy dispersive X-ray spectroscopy, and galvanostatic electrochemical cycling of copper-nickel hexacyanoferrate reveal that copper and nickel form a fully miscible solution at particular sites in the framework without perturbing the structure. This allows copper-nickel hexacyanoferrate to reversibly intercalate sodium and potassium ions for over 2000 cycles with capacity retentions of 100% and 91%, respectively. The ability to precisely tune the reaction potential of copper-nickel hexacyanoferrate without sacrificing cycle life will allow the development of full cells that utilize the entire electrochemical stability window of aqueous sodium and potassium ion electrolytes. © 2012 American Chemical Society.

  5. USGS Science Data Catalog - Open Data Advances or Declines

    Science.gov (United States)

    Frame, M. T.; Hutchison, V.; Zolly, L.; Wheeler, B.; Latysh, N.; Devarakonda, R.; Palanisamy, G.; Shrestha, B.

    2014-12-01

    The recent Office of Science and Technology Policy (OSTP) White House Open Data Policies (2013) have required Federal agencies to establish formal catalogues of their science data holdings and make these data easily available on Web sites, portals, and applications. As an organization, the USGS has historically excelled at making its data holdings freely available on its various Web sites (i.e., National, Scientific Programs, or local Science Center). In response to these requirements, the USGS Core Science Analytics, Synthesis, and Libraries program, in collaboration with DOE's Oak Ridge National Laboratory (ORNL) Mercury Consortium (funded by NASA, USGS, and DOE), and a number of other USGS organizations, established the Science Data Catalog (http://data.usgs.gov) cyberinfrastructure, content management processes/tools, and supporting policies. The USGS Science Data Catalog led the charge at USGS to improve the robustness of existing/future metadata collections; streamline and develop sustainable publishing to external aggregators (i.e., data.gov); and provide leadership to the U.S. Department of Interior in emerging Open Data policies, techniques, and systems. The session will discuss the current successes, challenges, and movement toward meeting these Open Data policies for USGS scientific data holdings. A retrospective look at the last year of implementation of these efforts within USGS will occur to determine whether these Open Data Policies are improving data access or limiting data availability. To learn more about the USGS Science Data Catalog, visit us at http://data.usgs.gov/info/about.html

  6. A General-Purpose Spatial Survey Design for Collaborative Science and Monitoring of Global Environmental Change: The Global Grid

    Directory of Open Access Journals (Sweden)

    David M. Theobald

    2016-09-01

    Full Text Available Recent guidance on environmental modeling and global land-cover validation stresses the need for a probability-based design. Additionally, spatial balance has also been recommended, as it ensures more efficient sampling, which is particularly relevant for understanding land use change. In this paper I describe a global sample design and database called the Global Grid (GG) that has both of these statistical characteristics, as well as being flexible, multi-scale, and globally comprehensive. The GG is intended to facilitate collaborative science and monitoring of land changes among local, regional, and national groups of scientists and citizens, and it is provided in a variety of open source formats to promote collaborative and citizen science. Since the GG sample grid is provided at multiple scales and is globally comprehensive, it provides a universal, readily-available sample. It also supports unequal probability sample designs through filtering sample locations by user-defined strata. The GG is not appropriate for use at locations above ±85° because the shape and topological distortion of quadrants becomes extreme near the poles. Additionally, the file sizes of the GG datasets are very large at fine scale (resolution ~600 m × 600 m) and require a 64-bit integer representation.
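    The multi-scale, globally comprehensive cell addressing that the GG provides can be illustrated with a simple quadtree-style scheme: each recursion level splits a cell into four quadrants, so a location maps to a nested cell address at any scale, and a coarse address is always a prefix of the finer one. This is only an illustrative analogue, not the GG's actual tessellation or its spatially balanced sampling.

    ```python
    def quad_address(lon, lat, levels):
        """Map a lon/lat point to a quadtree-style cell address.

        Level 0 is the whole (-180..180, -90..90) domain; each level splits
        the current cell into four quadrants labelled 0-3 (east adds 1,
        north adds 2). Illustrative only, not the Global Grid's tessellation.
        """
        west, east, south, north = -180.0, 180.0, -90.0, 90.0
        digits = []
        for _ in range(levels):
            mid_lon = (west + east) / 2
            mid_lat = (south + north) / 2
            q = 0
            if lon >= mid_lon:
                q += 1
                west = mid_lon
            else:
                east = mid_lon
            if lat >= mid_lat:
                q += 2
                south = mid_lat
            else:
                north = mid_lat
            digits.append(str(q))
        return "".join(digits)

    addr = quad_address(144.96, -37.81, 4)  # a point near Melbourne, 4 levels deep
    ```

    The prefix property is what makes such an addressing multi-scale: filtering samples by a coarse address selects exactly the finer cells nested within it.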

  7. Open Science & Open Data Global Sprint 2016 | 2–3 June 2016

    CERN Multimedia

    Achintya Rao

    2016-01-01

    Join us as we learn to collaboratively build projects transforming science on the web! Thursday, 2 June 2016, 8.00 a.m. – Friday, 3 June 2016, 8.00 p.m. CERN (3179-R-E06) This two-day sprint event brings together researchers, coders, librarians and the public from around the globe to hack on open science and open data projects in their communities. This year, we have four tracks you can contribute to: tools, citizen science, curriculum and open data. CERN is hosting three projects: Everware, Open Cosmics and CrowdAI. You can also participate in any of the other mozsprint projects for 2016. For more information, please visit: https://indico.cern.ch/event/535760/

  8. ITK: Enabling Reproducible Research and Open Science

    Directory of Open Access Journals (Sweden)

    Matthew Michael McCormick

    2014-02-01

    Full Text Available Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46.

  9. Experimental and numerical investigation of water flow through spacer grids of nuclear fuel elements using the OpenFOAM code

    International Nuclear Information System (INIS)

    Vidal, Guilherme A.M.; Vieira, Tiago A.S.; Castro, Higor F.P.

    2017-01-01

    With the advancement of computational tools, studies of the thermo-fluid-dynamic behavior of nuclear fuel elements have developed considerably in recent years. Among the devices present in these elements, the spacer grids have received the most attention. They keep the fuel rods equally spaced and have fins that aim to improve the heat transfer between the water and the fuel element. The grids therefore perform important structural and thermal functions. This work was carried out with the purpose of verifying and validating simulations of spacer grids using the OpenFOAM (2017) Computational Fluid Dynamics (CFD) software. The simulations were validated against results obtained with the commercial CFD program Ansys CFX, as well as experiments available in the literature and performed in test sections assembled on the Water-Air Circuit (CCA) of the CDTN thermo-hydraulic laboratory.

  10. FermiGrid - experience and future plans

    International Nuclear Information System (INIS)

    Chadwick, K.; Berman, E.; Canal, P.; Hesselroth, T.; Garzoglio, G.; Levshina, T.; Sergeev, V.; Sfiligoi, I.; Timm, S.; Yocum, D.

    2007-01-01

    Fermilab supports a scientific program that includes experiments and scientists located across the globe. In order to better serve this community, Fermilab has placed its production computer resources in a Campus Grid infrastructure called 'FermiGrid'. The FermiGrid infrastructure allows the large experiments at Fermilab to have priority access to their own resources, enables sharing of these resources in an opportunistic fashion, and allows movement of work (jobs, data) between the Campus Grid and National Grids such as the Open Science Grid and the WLCG. FermiGrid resources support multiple Virtual Organizations (VOs), including VOs from the Open Science Grid (OSG), EGEE and the Worldwide LHC Computing Grid Collaboration (WLCG). Fermilab also makes leading contributions to the Open Science Grid in the areas of accounting, batch computing, grid security, job management, resource selection, site infrastructure, storage management, and VO services. Through the FermiGrid interfaces, authenticated and authorized VOs and individuals may access our core grid services, the 10,000+ Fermilab-resident CPUs, near-petabyte (including CMS) online disk pools and the multi-petabyte Fermilab Mass Storage System. These core grid services include a site-wide Globus gatekeeper, VO management services for several VOs, Fermilab site authorization services, grid user mapping services, as well as job accounting and monitoring, resource selection and data movement services. Access to these services is via standard and well-supported grid interfaces. We will report on the user experience of using the FermiGrid campus infrastructure interfaced to a national cyberinfrastructure: the successes and the problems.

  11. Grid computing and collaboration technology in support of fusion energy sciences

    International Nuclear Information System (INIS)

    Schissel, D.P.

    2005-01-01

    Science research in general, and magnetic fusion research in particular, continues to grow in size and complexity, resulting in a concurrent growth in collaborations between experimental sites and laboratories worldwide. The simultaneous increase in wide area network speeds has made it practical to envision distributed working environments that are as productive as traditionally collocated work. In computing, it has become reasonable to decouple production from consumption, resulting in the ability to construct computing grids in a manner similar to the electrical power grid. Grid computing, the secure integration of computer systems over high speed networks to provide on-demand access to data analysis capabilities and related functions, is being deployed as an alternative to traditional resource sharing among institutions. For human interaction, advanced collaborative environments are being researched and deployed to make distributed group work as productive as traditional meetings. The DOE Scientific Discovery through Advanced Computing initiative has sponsored several collaboratory projects, including the National Fusion Collaboratory Project, to utilize recent advances in grid computing and advanced collaborative environments to further research in several specific scientific domains. For fusion, the collaborative technology being deployed is being used in present day research and is also scalable to future research, in particular to the International Thermonuclear Experimental Reactor experiment, which will require extensive collaboration capability worldwide. This paper briefly reviews the concepts of grid computing and advanced collaborative environments and gives specific examples of how these technologies are being used in fusion research today.

  12. Embracing Open Source for NASA's Earth Science Data Systems

    Science.gov (United States)

    Baynes, Katie; Pilone, Dan; Boller, Ryan; Meyer, David; Murphy, Kevin

    2017-01-01

    The overarching purpose of NASA's Earth Science program is to develop a scientific understanding of Earth as a system. Scientific knowledge is most robust and actionable when resulting from transparent, traceable, and reproducible methods. Reproducibility includes open access to the data as well as the software used to arrive at results. Additionally, software that is custom-developed for NASA should be open to the greatest degree possible, to enable re-use across Federal agencies, reduce overall costs to the government, remove barriers to innovation, and promote consistency through the use of uniform standards. Finally, Open Source Software (OSS) practices facilitate collaboration between agencies and the private sector. To best meet these ends, NASA's Earth Science Division promotes the full and open sharing of not only all data, metadata, products, information, documentation, models, images, and research results but also the source code used to generate, manipulate and analyze them. This talk focuses on the challenges of open-sourcing NASA-developed software within ESD and the growing pains associated with establishing policies, running the gamut of tracking issues, properly documenting build processes, engaging the open source community, maintaining internal compliance, and accepting contributions from external sources. This talk also covers the adoption of existing open source technologies and standards to enhance our custom solutions, and our contributions back to the community. Finally, we will be introducing the most recent OSS contributions from the NASA Earth Science program and promoting these projects for wider community review and adoption.

  13. A Real-Time Open Access Platform Towards Proof of Concept for Smart Grid Applications

    DEFF Research Database (Denmark)

    Kemal, Mohammed Seifu; Petersen, Lennart; Iov, Florin

    2018-01-01

    The platform comprises three layers: an electrical grid layer, an ICT & network emulation layer and a control layer. DiSC-OPAL, a toolbox built for OPAL-RT real-time grid simulation, comprises models for a wide variety of controllable flexible assets, stochastic power sources for wind and solar power plants, real consumption data and electrical...

  14. Integrating Free and Open Source Solutions into Geospatial Science Education

    Directory of Open Access Journals (Sweden)

    Vaclav Petras

    2015-06-01

    Full Text Available While free and open source software becomes increasingly important in geospatial research and industry, open science perspectives are generally less reflected in universities’ educational programs. We present an example of how free and open source software can be incorporated into geospatial education to promote open and reproducible science. Since 2008, graduate students at North Carolina State University have had the opportunity to take a course on geospatial modeling and analysis that is taught with both proprietary and free and open source software. In this course, students perform geospatial tasks simultaneously in the proprietary package ArcGIS and the free and open source package GRASS GIS. By ensuring that students learn to distinguish between geospatial concepts and software specifics, students become more flexible and stronger spatial thinkers when choosing solutions for their independent work in the future. We also discuss ways to continually update and improve our publicly available teaching materials for reuse by teachers, self-learners and other members of the GIS community. Only when free and open source software is fully integrated into geospatial education will we be able to encourage a culture of openness and, thus, enable greater reproducibility in research and development applications.

  15. Accelerating Translational Research through Open Science: The Neuro Experiment.

    Science.gov (United States)

    Gold, E Richard

    2016-12-01

    Translational research is often afflicted by a fundamental problem: a limited understanding of disease mechanisms prevents effective targeting of new treatments. Seeking to accelerate research advances and reimagine its role in the community, the Montreal Neurological Institute (Neuro) announced in the spring of 2016 that it is launching a five-year experiment during which it will adopt Open Science (open data, open materials, and no patenting) across the institution. The experiment seeks to examine two hypotheses. The first is whether the Neuro's Open Science initiative will attract new private partners. The second hypothesis is that the Neuro's institution-based approach will draw companies to the Montreal region, where the Neuro is based, leading to the creation of a local knowledge hub. This article explores why these hypotheses are likely to be true and describes the Neuro's approach to exploring them.

  16. Accelerating Translational Research through Open Science: The Neuro Experiment.

    Directory of Open Access Journals (Sweden)

    E Richard Gold

    2016-12-01

    Full Text Available Translational research is often afflicted by a fundamental problem: a limited understanding of disease mechanisms prevents effective targeting of new treatments. Seeking to accelerate research advances and reimagine its role in the community, the Montreal Neurological Institute (Neuro) announced in the spring of 2016 that it is launching a five-year experiment during which it will adopt Open Science (open data, open materials, and no patenting) across the institution. The experiment seeks to examine two hypotheses. The first is whether the Neuro's Open Science initiative will attract new private partners. The second hypothesis is that the Neuro's institution-based approach will draw companies to the Montreal region, where the Neuro is based, leading to the creation of a local knowledge hub. This article explores why these hypotheses are likely to be true and describes the Neuro's approach to exploring them.

  17. Open science initiatives: challenges for public health promotion.

    Science.gov (United States)

    Holzmeyer, Cheryl

    2018-03-07

    While academic open access, open data and open science initiatives have proliferated in recent years, facilitating new research resources for health promotion, open initiatives are not one-size-fits-all. Health research particularly illustrates how open initiatives may serve various interests and ends. Open initiatives not only foster new pathways of research access; they also discipline research in new ways, especially when associated with new regimes of research use and peer review, while participating in innovation ecosystems that often perpetuate existing systemic biases toward commercial biomedicine. Currently, many open initiatives are more oriented toward biomedical research paradigms than paradigms associated with public health promotion, such as social determinants of health research. Moreover, open initiatives too often dovetail with, rather than challenge, neoliberal policy paradigms. Such initiatives are unlikely to transform existing health research landscapes and redress health inequities. In this context, attunement to social determinants of health research and community-based local knowledge is vital to orient open initiatives toward public health promotion and health equity. Such an approach calls for discourses, norms and innovation ecosystems that contest neoliberal policy frameworks and foster upstream interventions to promote health, beyond biomedical paradigms. This analysis highlights challenges and possibilities for leveraging open initiatives on behalf of a wider range of health research stakeholders, while emphasizing public health promotion, health equity and social justice as benchmarks of transformation.

  18. Mapping the hinterland: Data issues in open science

    Science.gov (United States)

    Grand, Ann; Wilkinson, Clare; Bultitude, Karen; Winfield, Alan F. T.

    2016-01-01

    Open science is a practice in which the scientific process is shared completely and in real time. It offers the potential to support information flow, collaboration and dialogue among professional and non-professional participants. Using semi-structured interviews and case studies, this research investigated the relationship between open science and public engagement. This article concentrates on three particular areas of concern that emerged: first, how to effectively contextualise and narrate information to render it accessible, as opposed to simply available; second, concerns about data quantity and quality; and third, concerns about the skills required for effective contextualisation, mapping and interpretation of information. PMID:24769860

  19. Mapping the hinterland: Data issues in open science.

    Science.gov (United States)

    Grand, Ann; Wilkinson, Clare; Bultitude, Karen; Winfield, Alan F T

    2016-01-01

    Open science is a practice in which the scientific process is shared completely and in real time. It offers the potential to support information flow, collaboration and dialogue among professional and non-professional participants. Using semi-structured interviews and case studies, this research investigated the relationship between open science and public engagement. This article concentrates on three particular areas of concern that emerged: first, how to effectively contextualise and narrate information to render it accessible, as opposed to simply available; second, concerns about data quantity and quality; and third, concerns about the skills required for effective contextualisation, mapping and interpretation of information. © The Author(s) 2014.

  20. An Open and Holistic Approach for Geo and Space Sciences

    Science.gov (United States)

    Ritschel, Bernd; Seelus, Christoph; Neher, Günther; Toshihiko, Iyemori; Yatagai, Akiyo; Koyama, Yukinobu; Murayama, Yasuhiro; King, Todd; Hughes, Steve; Fung, Shing; Galkin, Ivan; Hapgood, Mike; Belehaki, Anna

    2016-04-01

    Geo and space sciences have thus far been very successful, even though an open, cross-domain and holistic approach often did not play an essential role. But this situation is changing rapidly. The research focus is shifting to more complex, non-linear and multi-domain phenomena, such as climate change or the space environment. This kind of phenomenon can only be understood step by step using the holistic idea. So, what is necessary for a successful cross-domain and holistic approach in geo and space sciences? Research and science in general are becoming more and more dependent on a rich fundus of multi-domain data sources, related context information and the use of highly advanced technologies in data processing. Buzzword phrases such as Big Data and Deep Learning reflect this development. Big Data also addresses the real exponential growth of data and information produced by measurements or simulations. Deep Learning technology may help to detect new patterns and relationships in data describing highly sophisticated natural phenomena. And further on, we should not forget that science and the humanities are only two sides of the same coin in the continuing human process of knowledge discovery. The concept of Open Data, or in particular the open access to scientific data, addresses the free and open availability of -at least publicly funded and generated- data. The open availability of data covers the free use, reuse and redistribution of data, principles which were established with the formation of World Data Centers already more than 50 years ago. So, we should not forget that the foundation for open data is the responsibility for a sustainable management of data, extending from the individual scientist up to the big science institutions and organizations. Other challenges are discovering and collecting the appropriate data, and preferably all of them or at least the majority of the right data. 
Therefore a network of individual or even better institutional catalog-based and at least

  1. Open Data and Open Science for better Research in the Geo and Space Domain

    Science.gov (United States)

    Ritschel, B.; Seelus, C.; Neher, G.; Iyemori, T.; Koyama, Y.; Yatagai, A. I.; Murayama, Y.; King, T. A.; Hughes, S.; Fung, S. F.; Galkin, I. A.; Hapgood, M. A.; Belehaki, A.

    2015-12-01

    Main open data principles had been worked out in the run-up and were finally adopted in the Open Data Charter at the G8 summit in Lough Erne, Northern Ireland, in June 2013. Important principles are also valid for science data, such as Open Data by Default, Quality and Quantity, Useable by All, Releasing Data for Improved Governance, and Releasing Data for Innovation. There is also an explicit relationship to such areas of high value as earth observation, education and geospatial data. The European Union implementation plan of the Open Data Charter identifies, among other things, objectives such as making data available in an open format, enabling semantic interoperability, ensuring quality, documentation and where appropriate reconciliation across different data sources, implementing software solutions allowing easy management, publication or visualization of datasets, and simplifying clearance of intellectual property rights. Open Science is not just a list of long-known principles but stands for a range of initiatives and projects around a better handling of scientific data and openly shared scientific knowledge. It is also about transparency in methodology and collection of data, availability and reuse of scientific data, public accessibility to scientific communication and the use of social media to facilitate scientific collaboration. Some projects concentrate on the open sharing of free and open-source software, and even hardware in the form of processing capabilities. In addition, questions about the mashup of data and publications and an open peer-review process are addressed. Following the principles of open data and open science, the newest results of the collaboration efforts in mashing up the data servers related to the Japanese IUGONET, the European Union ESPAS and the GFZ ISDC semantic Web projects will be presented here. 
The semantic Web based approach for the mashup is focusing on the design and implementation of a common but still distributed data

  2. Grid Technology as a Cyberinfrastructure for Delivering High-End Services to the Earth and Space Science Community

    Science.gov (United States)

    Hinke, Thomas H.

    2004-01-01

    Grid technology consists of middleware that permits distributed computations, data and sensors to be seamlessly integrated into a secure, single-sign-on processing environment. In this environment, a user has to identify and authenticate himself once to the grid middleware, and then can utilize any of the distributed resources to which he has been granted access. Grid technology allows resources that exist in enterprises that are under different administrative control to be securely integrated into a single processing environment. The grid community has adopted commercial web services technology as a means for implementing persistent, re-usable grid services that sit on top of the basic distributed processing environment that grids provide. These grid services can then form building blocks for even more complex grid services. Each grid service is characterized using the Web Service Description Language, which provides a description of the interface and how other applications can access it. The emerging Semantic Grid work seeks to associate sufficient semantic information with each grid service such that applications will be able to automatically select, compose and if necessary substitute available equivalent services in order to assemble collections of services that are most appropriate for a particular application. Grid technology has been used to provide limited support to various Earth and space science applications. Looking to the future, this emerging grid service technology can provide a cyberinfrastructure for both the Earth and space science communities. Groups within these communities could transform those applications that have community-wide applicability into persistent grid services that are made widely available to their respective communities. In concert with grid-enabled data archives, users could easily create complex workflows that extract desired data from one or more archives and process it through an appropriate set of widely distributed grid
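    The record above notes that each grid service is characterized by a WSDL document so that other applications can discover its interface. As an illustrative sketch (the WSDL fragment, service name, and operation names below are hypothetical, not taken from any actual grid middleware), a client can enumerate a service's advertised operations by parsing its WSDL:

    ```python
    import xml.etree.ElementTree as ET

    # A minimal, hypothetical WSDL fragment describing one grid service.
    WSDL = """<definitions xmlns="http://schemas.xmlsoap.org/wsdl/"
                   name="DataExtractionService">
      <portType name="DataExtractionPort">
        <operation name="extractGranule"/>
        <operation name="listArchives"/>
      </portType>
    </definitions>"""

    def list_operations(wsdl_text):
        """Return the operation names advertised in a WSDL portType."""
        ns = {"wsdl": "http://schemas.xmlsoap.org/wsdl/"}
        root = ET.fromstring(wsdl_text)
        return [op.attrib["name"]
                for op in root.findall(".//wsdl:portType/wsdl:operation", ns)]

    print(list_operations(WSDL))  # ['extractGranule', 'listArchives']
    ```

    A semantic-grid layer, as described in the abstract, would attach further machine-readable annotations to such operation descriptions so that equivalent services can be substituted automatically.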

  3. Open science versus commercialization: a modern research conflict?

    Science.gov (United States)

    Caulfield, Timothy; Harmon, Shawn He; Joly, Yann

    2012-02-27

    Efforts to improve research outcomes have resulted in genomic researchers being confronted with complex and seemingly contradictory instructions about how to perform their tasks. Over the past decade, there has been increasing pressure on university researchers to commercialize their work. Concurrently, they are encouraged to collaborate, share data and disseminate new knowledge quickly (that is, to adopt an open science model) in order to foster scientific progress, meet humanitarian goals, and to maximize the impact of their research. We present selected guidelines from three countries (Canada, United States, and United Kingdom) situated at the forefront of genomics to illustrate this potential policy conflict. Examining the innovation ecosystem and the messages conveyed by the different policies surveyed, we further investigate the inconsistencies between open science and commercialization policies. Commercialization and open science are not necessarily irreconcilable and could instead be envisioned as complementary elements of a more holistic innovation framework. Given the exploratory nature of our study, we wish to point out the need to gather additional evidence on the coexistence of open science and commercialization policies and on its impact, both positive and negative, on genomics academic research.

  4. Evolution of Nursing Science: Is Open Access the Answer?

    Science.gov (United States)

    Clarke, Pamela N; Garcia, Jenny

    2015-10-01

    The open access movement where journal content is made freely available over the Internet is purported to increase scientific exchange, yet has pros and cons. There are issues related to quality that need to be examined in relation to evolution of nursing science. © The Author(s) 2015.

  5. TCIA: An information resource to enable open science.

    Science.gov (United States)

    Prior, Fred W; Clark, Ken; Commean, Paul; Freymann, John; Jaffe, Carl; Kirby, Justin; Moore, Stephen; Smith, Kirk; Tarbox, Lawrence; Vendt, Bruce; Marquez, Guillermo

    2013-01-01

    Reusable, publicly available data is a pillar of open science. The Cancer Imaging Archive (TCIA) is an open image archive service supporting cancer research. TCIA collects, de-identifies, curates and manages rich collections of oncology image data. Image data sets have been contributed by 28 institutions and additional image collections are underway. Since June of 2011, more than 2,000 users have registered to search and access data from this freely available resource. TCIA encourages and supports cancer-related open science communities by hosting and managing the image archive, providing project wiki space and searchable metadata repositories. The success of TCIA is measured by the number of active research projects it enables (>40) and the number of scientific publications and presentations that are produced using data from TCIA collections (39).

  6. GOSH! A roadmap for open-source science hardware

    CERN Multimedia

    Stefania Pandolfi

    2016-01-01

    The goal of the Gathering for Open Science Hardware (GOSH! 2016), held from 2 to 5 March 2016 at IdeaSquare, was to lay the foundations of the open-source hardware for science movement.   The participants in the GOSH! 2016 meeting gathered in IdeaSquare. (Image: GOSH Community) “Despite advances in technology, many scientific innovations are held back because of a lack of affordable and customisable hardware,” says François Grey, a professor at the University of Geneva and coordinator of Citizen Cyberlab – a partnership between CERN, the UN Institute for Training and Research and the University of Geneva – which co-organised the GOSH! 2016 workshop. “This scarcity of accessible science hardware is particularly obstructive for citizen science groups and humanitarian organisations that don’t have the same economic means as a well-funded institution.” Instead, open sourcing science hardware co...

  7. e-Science on Earthquake Disaster Mitigation by EUAsiaGrid

    Science.gov (United States)

    Yen, Eric; Lin, Simon; Chen, Hsin-Yen; Chao, Li; Huang, Bor-Shoh; Liang, Wen-Tzong

    2010-05-01

    Although earthquakes are not predictable at this moment, with the aid of accurate seismic wave propagation analysis, we could simulate the potential hazards at all distances from possible fault sources by understanding the source rupture process during large earthquakes. With the integration of a strong ground-motion sensor network, an earthquake data center and seismic wave propagation analysis over the gLite e-Science infrastructure, we could gain much better knowledge of the impact and vulnerability of potential earthquake hazards. On the other hand, this application also demonstrated the e-Science way to investigate unknown earth structure. Regional integration of earthquake sensor networks could aid in fast event reporting and accurate event data collection. Federation of earthquake data centers entails consolidation and sharing of seismology and geology knowledge. Capability building of seismic wave propagation analysis implies the predictability of potential hazard impacts. With the gLite infrastructure and the EUAsiaGrid collaboration framework, earth scientists from Taiwan, Vietnam, the Philippines and Thailand are working together to alleviate potential seismic threats by making use of Grid technologies and also to support seismology research by e-Science. A cross-continental e-infrastructure, based on EGEE and EUAsiaGrid, is established for seismic wave forward simulation and risk estimation. Both the computing challenge of seismic wave analysis among 5 European and Asian partners, and the data challenge of data center federation, had been exercised and verified. A Seismogram-on-Demand service is also developed for the automatic generation of seismograms at any sensor point for a specific epicenter. To ease access to all the services based on users' workflows and retain maximal flexibility, a Seismology Science Gateway integrating data, computation, workflow, services and user communities would be implemented based on typical use cases. In the future, extension of the

  8. Pre-Service Teachers’ Attitudes Toward Teaching Science and Their Science Learning at Indonesia Open University

    OpenAIRE

    Nadi SUPRAPTO; Ali MURSID

    2017-01-01

    This study focuses on attitudes toward (teaching) science and the learning of science for primary school among pre-service teachers at the Open University of Indonesia. A three-year longitudinal survey was conducted, involving 379 students as pre-service teachers (PSTs) from the Open University in Surabaya regional office. Attitudes toward (teaching) science’ (ATS) instrument was used to portray PSTs’ preparation for becoming primary school teachers. Data analyses were used, including descrip...

  9. Simulation of Electrical Grid with Omnet++ Open Source Discrete Event System Simulator

    Directory of Open Access Journals (Sweden)

    Sőrés Milán

    2016-12-01

    Full Text Available The simulation of electrical networks is very important before the development and servicing of electrical networks and grids can occur. There are software packages that can simulate the behaviour of electrical grids under different operating conditions, but these simulation environments cannot be used in a single cloud-based project, because they are not GNU-licensed software products. In this paper, an integrated framework was proposed that models and simulates communication networks. The design and operation of the simulation environment are investigated and a model of electrical components is proposed. After simulation, the simulation results were compared to manually computed results.
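    OMNeT++, used in the record above, is a discrete-event simulator: events are drawn from a time-ordered queue, and each event handler may schedule further events. A minimal sketch of that core mechanism (the breaker trip/reclose events are invented for illustration, not taken from the paper's grid model):

    ```python
    import heapq

    def run(events, until):
        """Minimal discrete-event loop: pop the earliest event, let its
        handler schedule follow-up events, repeat until the time horizon."""
        queue = list(events)          # (time, name, handler) tuples
        heapq.heapify(queue)
        log = []
        while queue:
            t, name, handler = heapq.heappop(queue)
            if t > until:
                break
            log.append((t, name))
            for follow_up in handler(t):
                heapq.heappush(queue, follow_up)
        return log

    # Hypothetical grid event: a breaker trips at t=1.0 and recloses 0.5 s later.
    def trip(t):
        return [(t + 0.5, "reclose", lambda t2: [])]

    print(run([(1.0, "trip", trip)], until=10.0))
    # [(1.0, 'trip'), (1.5, 'reclose')]
    ```

    A full OMNeT++ model adds modules, message passing between them, and statistics collection on top of exactly this kind of event loop.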

  10. Experimental demonstration of an OpenFlow based software-defined optical network employing packet, fixed and flexible DWDM grid technologies on an international multi-domain testbed.

    Science.gov (United States)

    Channegowda, M; Nejabati, R; Rashidi Fard, M; Peng, S; Amaya, N; Zervas, G; Simeonidou, D; Vilalta, R; Casellas, R; Martínez, R; Muñoz, R; Liu, L; Tsuritani, T; Morita, I; Autenrieth, A; Elbers, J P; Kostecki, P; Kaczmarek, P

    2013-03-11

    Software-defined networking (SDN) and flexible grid optical transport technology are two key technologies that allow network operators to customize their infrastructure based on application requirements, thereby minimizing the extra capital and operational costs required for hosting new applications. In this paper, for the first time we report on the design, implementation and demonstration of a novel OpenFlow-based SDN unified control plane allowing seamless operation across heterogeneous state-of-the-art optical and packet transport domains. We verify and experimentally evaluate OpenFlow protocol extensions for flexible DWDM grid transport technology along with its integration with fixed DWDM grid and layer-2 packet switching.

  11. Helix Nebula Science Cloud pilot phase open session

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    This Helix Nebula Science Cloud (HNSciCloud) public session is open to everyone and will be webcast. The session will provide the audience with an overview of the HNSciCloud pre-commercial procurement project and the innovative cloud platforms that have been developed. A number of practical use-cases from the physics community will be presented as well as the next steps to be undertaken.

  12. Integrating Solar Power onto the Electric Grid - Bridging the Gap between Atmospheric Science, Engineering and Economics

    Science.gov (United States)

    Ghonima, M. S.; Yang, H.; Zhong, X.; Ozge, B.; Sahu, D. K.; Kim, C. K.; Babacan, O.; Hanna, R.; Kurtz, B.; Mejia, F. A.; Nguyen, A.; Urquhart, B.; Chow, C. W.; Mathiesen, P.; Bosch, J.; Wang, G.

    2015-12-01

    One of the main obstacles to high penetrations of solar power is the variable nature of solar power generation. To mitigate variability, grid operators have to schedule additional reliability resources, at considerable expense, to ensure that load requirements are met by generation. Thus despite the cost of solar PV decreasing, the cost of integrating solar power will increase as penetration of solar resources onto the electric grid increases. There are three principal tools currently available to mitigate variability impacts: (i) flexible generation, (ii) storage, either virtual (demand response) or physical devices and (iii) solar forecasting. Storage devices are a powerful tool capable of ensuring smooth power output from renewable resources. However, the high cost of storage is prohibitive and markets are still being designed to leverage their full potential and mitigate their limitation (e.g. empty storage). Solar forecasting provides valuable information on the daily net load profile and upcoming ramps (increasing or decreasing solar power output) thereby providing the grid advance warning to schedule ancillary generation more accurately, or curtail solar power output. In order to develop solar forecasting as a tool that can be utilized by the grid operators we identified two focus areas: (i) develop solar forecast technology and improve solar forecast accuracy and (ii) develop forecasts that can be incorporated within existing grid planning and operation infrastructure. The first issue required atmospheric science and engineering research, while the second required detailed knowledge of energy markets, and power engineering. Motivated by this background we will emphasize area (i) in this talk and provide an overview of recent advancements in solar forecasting especially in two areas: (a) Numerical modeling tools for coastal stratocumulus to improve scheduling in the day-ahead California energy market. (b) Development of a sky imager to provide short term

  13. Transparency: the emerging third dimension of Open Science and Open Data

    Directory of Open Access Journals (Sweden)

    Liz Lyon

    2016-03-01

    Full Text Available This paper presents an exploration of the concept of research transparency. The policy context is described and situated within the broader arena of open science. This is followed by commentary on transparency within the research process, which includes a brief overview of the related concept of reproducibility and the associated elements of research integrity, fraud and retractions. A two-dimensional model or continuum of open science is considered and the paper builds on this foundation by presenting a three-dimensional model, which includes the additional axis of ‘transparency’. The concept is further unpacked and preliminary definitions of key terms are introduced: transparency, transparency action, transparency agent and transparency tool.  An important linkage is made to the research lifecycle as a setting for potential transparency interventions by libraries. Four areas are highlighted as foci for enhanced engagement with transparency goals: Leadership and Policy, Advocacy and Training, Research Infrastructures and Workforce Development.

  14. ESA Science Archives, VO tools and remote Scientific Data reduction in Grid Architectures

    Science.gov (United States)

    Arviset, C.; Barbarisi, I.; de La Calle, I.; Fajersztejn, N.; Freschi, M.; Gabriel, C.; Gomez, P.; Guainazzi, M.; Ibarra, A.; Laruelo, A.; Leon, I.; Micol, A.; Parrilla, E.; Ortiz, I.; Osuna, P.; Salgado, J.; Stebe, A.; Tapiador, D.

    2008-08-01

    This paper presents the latest functionalities of the ESA Science Archives located at ESAC, Spain, in particular, the following archives : the ISO Data Archive (IDA {http://iso.esac.esa.int/ida}), the XMM-Newton Science Archive (XSA {http://xmm.esac.esa.int/xsa}), the Integral SOC Science Data Archive (ISDA {http://integral.esac.esa.int/isda}) and the Planetary Science Archive (PSA {http://www.rssd.esa.int/psa}), both the classical and the map-based Mars Express interfaces. Furthermore, the ESA VOSpec {http://esavo.esac.esa.int/vospecapp} spectra analysis tool is described, which allows users to access and display spectral information from VO resources (both real observational and theoretical spectra), including access to a lines database and recent analysis functionalities. In addition, we detail the first implementation of RISA (Remote Interface for Science Analysis), a web service providing remote users the ability to create fully configurable XMM-Newton data analysis workflows, and to deploy and run them on the ESAC Grid. RISA makes full use of the inter-operability provided by the SIAP (Simple Image Access Protocol) services as data input, and at the same time its VO-compatible output can directly be used by general VO-tools.

  15. FermiGrid-experience and future plans

    International Nuclear Information System (INIS)

    Chadwick, K; Berman, E; Canal, P; Hesselroth, T; Garzoglio, G; Levshina, T; Sergeev, V; Sfiligoi, I; Sharma, N; Timm, S; Yocum, D R

    2008-01-01

    Fermilab supports a scientific program that includes experiments and scientists located across the globe. In order to better serve this community, Fermilab has placed its production computer resources in a Campus Grid infrastructure called 'FermiGrid'. The FermiGrid infrastructure allows the large experiments at Fermilab to have priority access to their own resources, enables sharing of these resources in an opportunistic fashion, and movement of work (jobs, data) between the Campus Grid and National Grids such as Open Science Grid (OSG) and the Worldwide LHC Computing Grid Collaboration (WLCG). FermiGrid resources support multiple Virtual Organizations (VOs), including VOs from the OSG, EGEE, and the WLCG. Fermilab also makes leading contributions to the Open Science Grid in the areas of accounting, batch computing, grid security, job management, resource selection, site infrastructure, storage management, and VO services. Through the FermiGrid interfaces, authenticated and authorized VOs and individuals may access our core grid services, the 10,000+ Fermilab resident CPUs, near-petabyte (including CMS) online disk pools and the multi-petabyte Fermilab Mass Storage System. These core grid services include a site wide Globus gatekeeper, VO management services for several VOs, Fermilab site authorization services, grid user mapping services, as well as job accounting and monitoring, resource selection and data movement services. Access to these services is via standard and well-supported grid interfaces. We will report on the user experience of using the FermiGrid campus infrastructure interfaced to a national cyberinfrastructure - the successes and the problems

  16. Open science, e-science and the new technologies: Challenges and old problems in qualitative research in the social sciences

    Directory of Open Access Journals (Sweden)

    Ercilia García-Álvarez

    2012-12-01

    Full Text Available Purpose: As well as introducing the articles in the special issue titled "Qualitative Research in the Social Sciences", this article reviews the challenges, problems and main advances made by the qualitative paradigm in the context of the new European science policy based on open science and e-Science, and analyses alternative technologies freely available in the 2.0 environment and their application to fieldwork and data analysis. Design/methodology: Theoretical review. Practical implications: The article identifies open access technologies with applications in qualitative research such as applications for smartphones and tablets, web platforms and specific qualitative data analysis software, all developed in both the e-Science context and the 2.0 environment. Social implications: The article discusses the possible role to be played by qualitative research in the open science and e-Science context and considers the impact of this new context on the size and structure of research groups, the development of truly collaborative research, the emergence of new ethical problems and quality assessment in review processes in an open environment. Originality/value: The article describes the characteristics that define the new scientific environment and the challenges posed for qualitative research, reviews the latest open access technologies available to researchers in terms of their main features and proposes specific applications suitable for fieldwork and data analysis.

  17. An Open-Loop Grid Synchronization Approach for Single-Phase Applications

    DEFF Research Database (Denmark)

    Golestan, Saeed; Guerrero, Josep M.; Quintero, Juan Carlos Vasquez

    2018-01-01

    in the presence of frequency drifts. This is particularly true in single-phase applications, where the lack of multiple independent input signals makes the implementation of the synchronization technique difficult. The aim of this paper is to develop an effective OLS technique for single-phase power and energy...... applications. The proposed OLS method benefits from a straightforward implementation, a fast dynamic response (a response time less than two cycles of the nominal frequency), and a complete immunity against the DC component in the grid voltage. In addition, the designed OLS method totally blocks (significantly...

  18. Grid-enabled measures: using Science 2.0 to standardize measures and share data.

    Science.gov (United States)

    Moser, Richard P; Hesse, Bradford W; Shaikh, Abdul R; Courtney, Paul; Morgan, Glen; Augustson, Erik; Kobrin, Sarah; Levin, Kerry Y; Helba, Cynthia; Garner, David; Dunn, Marsha; Coa, Kisha

    2011-05-01

    Scientists are taking advantage of the Internet and collaborative web technology to accelerate discovery in a massively connected, participative environment--a phenomenon referred to by some as Science 2.0. As a new way of doing science, this phenomenon has the potential to push science forward in a more efficient manner than was previously possible. The Grid-Enabled Measures (GEM) database has been conceptualized as an instantiation of Science 2.0 principles by the National Cancer Institute (NCI) with two overarching goals: (1) promote the use of standardized measures, which are tied to theoretically based constructs; and (2) facilitate the ability to share harmonized data resulting from the use of standardized measures. The first is accomplished by creating an online venue where a virtual community of researchers can collaborate together and come to consensus on measures by rating, commenting on, and viewing meta-data about the measures and associated constructs. The second is accomplished by connecting the constructs and measures to an ontological framework with data standards and common data elements such as the NCI Enterprise Vocabulary System (EVS) and the cancer Data Standards Repository (caDSR). This paper will describe the Web 2.0 principles on which the GEM database is based, describe its functionality, and discuss some of the important issues involved with creating the GEM database, such as the role of mutually agreed-on ontologies (i.e., knowledge categories and the relationships among these categories) for data sharing. Published by Elsevier Inc.
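    The harmonization goal described above -- tying measures to shared constructs so that data collected with the same standardized measure can be pooled across studies -- can be sketched minimally (the registry, measure identifiers, and values below are invented for illustration and are not drawn from the GEM database itself):

    ```python
    # Each study reports (measure_id, value); a registry maps measures to constructs.
    REGISTRY = {"STRESS-4": "perceived_stress", "STRESS-10": "perceived_stress",
                "DEP-SCALE": "nicotine_dependence"}

    def harmonize(records):
        """Pool records under the construct their measure is registered to."""
        pooled = {}
        for measure_id, value in records:
            construct = REGISTRY[measure_id]
            pooled.setdefault(construct, []).append(value)
        return pooled

    study_a = [("STRESS-4", 7), ("DEP-SCALE", 3)]
    study_b = [("STRESS-10", 21)]
    print(harmonize(study_a + study_b))
    # {'perceived_stress': [7, 21], 'nicotine_dependence': [3]}
    ```

    In GEM's terms, the registry plays the role of the consensus ontology: once two studies agree on which construct a measure instantiates, their data become poolable without case-by-case negotiation.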

  19. Space charge calibration of the ALICE TPC operated with an open gating grid

    Energy Technology Data Exchange (ETDEWEB)

    Hellbaer, Ernst [Institut fuer Kernphysik, Goethe-Universitaet Frankfurt (Germany); Ivanov, Marian [GSI (Germany); Wiechula, Jens [Universitaet Tuebingen (Germany); Collaboration: ALICE-Collaboration

    2015-07-01

    The Time Projection Chamber (TPC) is the main particle identification detector of the ALICE experiment at the CERN LHC. High interaction rates of 50 kHz in Pb-Pb collisions during the Run 3 period after 2020 require a major upgrade of the TPC readout. The currently used Multiwire Proportional Chambers (MWPCs) will be replaced by readout chambers (ROCs) based on Gas Electron Multiplier (GEM) technology, which will be operated in a continuous mode. While the gating grid of the MWPCs prevents the positive ions of the amplification region from entering the drift volume, the GEM-based ROCs will introduce an ion backflow (IBF) of about 1%. In combination with the high-luminosity environment, this amount of back-drifting ions results in a considerable space charge density which distorts the drift paths of the primary ionisation electrons significantly. In order to still provide high tracking efficiency and cluster-to-track association, an efficient calibration scheme will be implemented. As a test ground for the new calibration scheme, pp collision data were taken during Run 1 with the gating grid operated in a transparent mode, allowing the ions to enter the drift volume. The measured space point distortions due to the space charge are presented together with the corrected data and compared to simulations for Run 3.

  20. Tunable Reaction Potentials in Open Framework Nanoparticle Battery Electrodes for Grid-Scale Energy Storage

    KAUST Repository

    Wessells, Colin D.; McDowell, Matthew T.; Peddada, Sandeep V.; Pasta, Mauro; Huggins, Robert A.; Cui, Yi

    2012-01-01

    commercial battery technology cannot provide adequate power, cycle life, and energy efficiency at a sufficiently low cost. Copper hexacyanoferrate and nickel hexacyanoferrate, two open framework materials with the Prussian Blue structure, were recently

  1. Interdisciplinary research center devoted to molecular environmental science opens

    Science.gov (United States)

    Vaughan, David J.

    In October, a new research center opened at the University of Manchester in the United Kingdom. The center is the product of over a decade of ground-breaking interdisciplinary research in the Earth and related biological and chemical sciences at the university. The center also responds to the British government's policy of investing in research infrastructure at key universities. The Williamson Research Centre, the first of its kind in Britain and among the first worldwide, is devoted to the emerging field of molecular environmental science. This field aims to bring about a revolution in our understanding of the environment; though it may be a less violent revolution than some, its potential is high for developments that could affect us all.

  2. LAMMPS Project Report for the Trinity KNL Open Science Period.

    Energy Technology Data Exchange (ETDEWEB)

    Moore, Stan Gerald [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Thompson, Aidan P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wood, Mitchell [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-08-01

    LAMMPS is a classical molecular dynamics code (lammps.sandia.gov) used to model materials science problems at Sandia National Laboratories and around the world. LAMMPS was one of three Sandia codes selected to participate in the Trinity KNL (TR2) Open Science period. During this period, three different problems of interest were investigated using LAMMPS. The first was benchmarking KNL performance using different force field models. The second was simulating void collapse in shocked HNS energetic material using an all-atom model. The third was simulating shock propagation through poly-crystalline RDX energetic material using a coarse-grain model, the results of which were used in an ACM Gordon Bell Prize submission. This report describes the results of these simulations, lessons learned, and some hardware issues found on Trinity KNL as part of this work.

  3. Informatics in radiology: An open-source and open-access cancer biomedical informatics grid annotation and image markup template builder.

    Science.gov (United States)

    Mongkolwat, Pattanasak; Channin, David S; Kleper, Vladimir; Rubin, Daniel L

    2012-01-01

    In a routine clinical environment or clinical trial, a case report form or structured reporting template can be used to quickly generate uniform and consistent reports. Annotation and image markup (AIM), a project supported by the National Cancer Institute's cancer biomedical informatics grid, can be used to collect information for a case report form or structured reporting template. AIM is designed to store, in a single information source, (a) the description of pixel data with use of markups or graphical drawings placed on the image, (b) calculation results (which may or may not be directly related to the markups), and (c) supplemental information. To facilitate the creation of AIM annotations with data entry templates, an AIM template schema and an open-source template creation application were developed to assist clinicians, image researchers, and designers of clinical trials to quickly create a set of data collection items, thereby ultimately making image information more readily accessible.

  4. YouPower : An open source platform for community-oriented smart grid user engagement

    NARCIS (Netherlands)

    Huang, Yilin; Hasselqvist, Hanna; Poderi, Giacomo; Scepanovic, S.; Kis, F.; Bogdan, Cristian; Warnier, Martijn; Brazier, F.M.

    2017-01-01

    This paper presents YouPower, an open source platform designed to make people more aware of their energy consumption and encourage sustainable consumption with local communities. The platform is designed iteratively in collaboration with users in the Swedish and Italian test sites of the project

  5. Open Access to Scientific Data: Promoting Science and Innovation

    Directory of Open Access Journals (Sweden)

    Guan-Hua Xu

    2007-06-01

    Full Text Available As an important part of the science and technology infrastructure platform of China, the Ministry of Science and Technology launched the Scientific Data Sharing Program in 2002. Twenty-four government agencies now participate in the Program. After five years of hard work, great progress has been achieved in the policy and legal framework, data standards, pilot projects, and international cooperation. By the end of 2005, one-third of the existing public-interest and basic scientific databases in China had been integrated and upgraded. By 2020, China is expected to build a more user-friendly scientific data management and sharing system, with 80 percent of scientific data available to the general public. In order to realize this objective, the emphases of the project are to perfect the policy and legislation system, improve the quality of data resources, expand and establish national scientific data centers, and strengthen international cooperation. It is believed that with the opening up of access to scientific data in China, the Program will play a bigger role in promoting science and national innovation.

  6. Further validation of liquid metal MHD code for unstructured grid based on OpenFOAM

    Energy Technology Data Exchange (ETDEWEB)

    Feng, Jingchao; Chen, Hongli, E-mail: hlchen1@ustc.edu.cn; He, Qingyun; Ye, Minyou

    2015-11-15

    Highlights: • A specific correction scheme has been adopted to correct the calculated results on non-orthogonal meshes. • The developed MHD code based on the OpenFOAM platform has been validated by benchmark cases under uniform and non-uniform magnetic fields in round and rectangular ducts. • ALEX experimental results have been used to validate the MHD code based on OpenFOAM. - Abstract: In fusion liquid metal blankets, complex geometries involving contractions, expansions, bends and manifolds are very common. The characteristics of liquid metal flow in these geometries are significant. In order to extend the magnetohydrodynamic (MHD) solver developed on the OpenFOAM platform to complex geometries, an MHD solver based on unstructured meshes has been implemented. The adoption of non-orthogonal correction techniques in the solver makes it possible to process the non-orthogonal meshes that arise in complex geometries. The present paper focuses on the validation of the code under critical conditions. One analytical benchmark case and two experimental benchmark cases were used to validate the code. Benchmark case I is MHD flow in a circular pipe with arbitrary electric conductivity of the walls in a uniform magnetic field. Benchmark cases II and III are experimental cases of 3D laminar steady MHD flow under a fringing magnetic field. In all these cases, the numerical results match well with the benchmarks.
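The non-orthogonal correction mentioned in the highlights rests on a standard finite-volume decomposition: the face-area vector is split into a part aligned with the line joining the two cell centres plus a non-orthogonal remainder that must be treated explicitly. A minimal sketch of the minimum-correction variant of that split, illustrative only and not taken from the solver itself:

```python
def minimum_correction_split(sf, d):
    """Split the face-area vector Sf into an orthogonal part Delta along
    the cell-centre vector d and a non-orthogonal remainder k = Sf - Delta.
    With the minimum-correction choice Delta = (Sf.d / |d|^2) d, the
    remainder k is perpendicular to d."""
    dot_sd = sum(a * b for a, b in zip(sf, d))
    mag_d2 = sum(a * a for a in d)
    delta = [dot_sd / mag_d2 * c for c in d]
    k = [s - dl for s, dl in zip(sf, delta)]
    return delta, k

# A face tilted relative to the cell-centre line (a non-orthogonal mesh)
delta, k = minimum_correction_split([1.0, 0.3, 0.0], [1.0, 0.0, 0.0])
print(delta, k)  # k carries the non-orthogonal part
```

On an orthogonal mesh the remainder k vanishes and no correction is needed; the larger k grows relative to Delta, the more explicit correction the discretised Laplacian requires.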

  7. Further validation of liquid metal MHD code for unstructured grid based on OpenFOAM

    International Nuclear Information System (INIS)

    Feng, Jingchao; Chen, Hongli; He, Qingyun; Ye, Minyou

    2015-01-01

    Highlights: • A specific correction scheme has been adopted to correct the calculated results on non-orthogonal meshes. • The developed MHD code based on the OpenFOAM platform has been validated by benchmark cases under uniform and non-uniform magnetic fields in round and rectangular ducts. • ALEX experimental results have been used to validate the MHD code based on OpenFOAM. - Abstract: In fusion liquid metal blankets, complex geometries involving contractions, expansions, bends and manifolds are very common. The characteristics of liquid metal flow in these geometries are significant. In order to extend the magnetohydrodynamic (MHD) solver developed on the OpenFOAM platform to complex geometries, an MHD solver based on unstructured meshes has been implemented. The adoption of non-orthogonal correction techniques in the solver makes it possible to process the non-orthogonal meshes that arise in complex geometries. The present paper focuses on the validation of the code under critical conditions. One analytical benchmark case and two experimental benchmark cases were used to validate the code. Benchmark case I is MHD flow in a circular pipe with arbitrary electric conductivity of the walls in a uniform magnetic field. Benchmark cases II and III are experimental cases of 3D laminar steady MHD flow under a fringing magnetic field. In all these cases, the numerical results match well with the benchmarks.

  8. Semantic Web-based Vocabulary Broker for Open Science

    Science.gov (United States)

    Ritschel, B.; Neher, G.; Iyemori, T.; Murayama, Y.; Kondo, Y.; Koyama, Y.; King, T. A.; Galkin, I. A.; Fung, S. F.; Wharton, S.; Cecconi, B.

    2016-12-01

    Keyword vocabularies are used to tag and to identify data of science data repositories. Such vocabularies consist of controlled terms and the appropriate concepts, such as GCMD [1] keywords or the ESPAS [2] keyword ontology. The Semantic Web-based mash-up of domain-specific, cross- or even trans-domain vocabularies provides unique capabilities in the network of appropriate data resources. Based on a collaboration between GFZ [3], the FHP [4], the WDC for Geomagnetism [5] and the NICT [6], we developed the concept of a vocabulary broker for inter- and trans-disciplinary data detection and integration. Our prototype of the Semantic Web-based vocabulary broker uses OSF [7] for the mash-up of geo and space research vocabularies, such as the GCMD keywords, the ESPAS keyword ontology and the SPASE [8] keyword vocabulary. The vocabulary broker starts the search with "free" keywords or terms of a specific vocabulary scheme. It then almost automatically connects the different science data repositories which are tagged by terms of the aforementioned vocabularies. Therefore the mash-up of the SKOS [9] based vocabularies with appropriate metadata from different domains can be realized by addressing LOD [10] resources or virtual SPARQL [11] endpoints which map relational structures into the RDF [12] format. In order to demonstrate such a mash-up approach in real life, we installed and use a D2RQ [13] server for the integration of IUGONET [14] data, which are managed by a relational database. The OSF-based vocabulary broker and the D2RQ platform are installed on virtual Linux machines at Kyoto University. The vocabulary broker meets the standard of a main component of the WDS [15] knowledge network.
    The Web address of the vocabulary broker is http://wdcosf.kugi.kyoto-u.ac.jp
    [1] Global Change Master Directory; [2] Near earth space data infrastructure for e-science; [3] German Research Centre for Geosciences; [4] University of Applied Sciences Potsdam; [5] World Data Center for Geomagnetism, Kyoto; [6] National Institute of Information and
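The broker's core operation, matching a free keyword against several vocabularies at once, can be sketched with plain dictionaries; the terms and URIs below are invented placeholders, not actual GCMD or SPASE entries, and the real prototype works over SKOS graphs and SPARQL endpoints rather than Python dicts:

```python
# Toy mash-up of two keyword vocabularies (illustrative terms only --
# not the real GCMD or SPASE entries). Each maps a term to a concept URI.
GCMD = {
    "ionosphere": "http://example.org/gcmd/ionosphere",
    "geomagnetism": "http://example.org/gcmd/geomagnetism",
}
SPASE = {
    "ionosphere": "http://example.org/spase/Ionosphere",
    "magnetosphere": "http://example.org/spase/Magnetosphere",
}

def broker_lookup(keyword, vocabularies):
    """Return every concept URI whose term matches the free keyword,
    across all registered vocabularies -- the broker's core operation."""
    keyword = keyword.lower()
    hits = []
    for name, vocab in vocabularies.items():
        for term, uri in vocab.items():
            if keyword in term:
                hits.append((name, uri))
    return hits

print(broker_lookup("ionosphere", {"GCMD": GCMD, "SPASE": SPASE}))
```

In the prototype this cross-vocabulary match is what lets one free keyword reach repositories tagged under different vocabulary schemes.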

  9. Smart business for smart users? : A social science agenda for developing smart grids

    NARCIS (Netherlands)

    Verbong, G.P.J.; Verkade, N.; Verhees, B.; Huijben, J.C.C.M.; Höffken, J.I.; Beaulieu, A.; de Wilde, J.; Scherpen, J.M.A.

    2016-01-01

    The promise of smart grids is very attractive. However, it is not yet clear what the future smart grid will look like. Although most researchers acknowledge that users will play a more prominent role in smart grids, there is a lot of uncertainty on this issue. To counter the strong technological

  10. Developing Healthcare Data Analytics APPs with Open Data Science Tools.

    Science.gov (United States)

    Hao, Bibo; Sun, Wen; Yu, Yiqin; Xie, Guotong

    2017-01-01

    Recent advances in big data analytics provide more flexible, efficient, and open tools for researchers to gain insight from healthcare data. However, many tools require researchers to develop programs in languages such as Python or R, a skill set that many researchers in the healthcare data analytics area have not acquired. To make data science more approachable, we explored existing tools and developed a practice that helps data scientists convert existing analytics pipelines into user-friendly analytics APPs with rich interactions and real-time analysis features. With this practice, data scientists can develop customized analytics pipelines as APPs in Jupyter Notebook and disseminate them to other researchers easily, and researchers can benefit from the shared notebooks to perform analysis tasks or reproduce research results much more easily.

  11. LLNL Mercury Project Trinity Open Science Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Brantley, Patrick [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dawson, Shawn [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); McKinley, Scott [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); O' Brien, Matt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Peters, Doug [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Pozulp, Mike [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Becker, Greg [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mohror, Kathryn [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Moody, Adam [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-04-20

    The Mercury Monte Carlo particle transport code developed at Lawrence Livermore National Laboratory (LLNL) is used to simulate the transport of radiation through urban environments. These challenging calculations include complicated geometries and require significant computational resources to complete. As a result, a question arises as to the level of convergence of the calculations with Monte Carlo simulation particle count. In the Trinity Open Science calculations, one main focus was to investigate the convergence of the relevant simulation quantities with Monte Carlo particle count in order to assess the current simulation methodology. Both for this application space and for more general applicability, we also investigated the impact of code algorithms on parallel scaling on the Trinity machine, as well as the utilization of the Trinity DataWarp burst buffer technology in Mercury via the LLNL Scalable Checkpoint/Restart (SCR) library.
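The convergence question the report examines is governed by the usual Monte Carlo statistics: the standard error of a tallied quantity falls off as one over the square root of the particle count. A toy illustration with uniform samples (not Mercury's tallies):

```python
import random
import statistics

def mc_estimate(n, seed=0):
    """Estimate the mean of a toy 'tally' by sampling n particles."""
    rng = random.Random(seed)
    samples = [rng.uniform(0.0, 1.0) for _ in range(n)]
    mean = statistics.fmean(samples)
    # standard error of the mean shrinks like 1/sqrt(n)
    stderr = statistics.stdev(samples) / n ** 0.5
    return mean, stderr

_, se_small = mc_estimate(1_000)
_, se_large = mc_estimate(100_000)
# 100x more particles -> roughly 10x smaller statistical error
print(se_small / se_large)
```

This 1/sqrt(N) scaling is why assessing convergence with particle count matters: halving the statistical error costs four times the particles.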

  12. Order Without Intellectual Property Law: Open Science in Influenza.

    Science.gov (United States)

    Kapczynski, Amy

    Today, intellectual property (IP) scholars accept that IP as an approach to information production has serious limits. But what lies beyond IP? A new literature on "intellectual production without IP" (or "IP without IP") has emerged to explore this question, but its examples and explanations have yet to convince skeptics. This Article reorients this new literature via a study of a hard case: a global influenza virus-sharing network that has for decades produced critically important information goods, at significant expense, and in a loose-knit group--all without recourse to IP. I analyze the Network as an example of "open science," a mode of information production that differs strikingly from conventional IP, and yet that successfully produces important scientific goods in response to social need. The theory and example developed here refute the most powerful criticisms of the emerging "IP without IP" literature, and provide a stronger foundation for this important new field. Even where capital costs are high, creation without IP can be reasonably effective in social terms, if it can link sources of funding to reputational and evaluative feedback loops like those that characterize open science. It can also be sustained over time, even by loose-knit groups and where the stakes are high, because organizations and other forms of law can help to stabilize cooperation. I also show that contract law is well suited to modes of information production that rely upon a "supply side" rather than "demand side" model. In its most important instances, "order without IP" is not order without governance, nor order without law. Recognizing this can help us better ground this new field, and better study and support forms of knowledge production that deserve our attention, and that sometimes sustain our very lives.

  13. A Bibliometric Study of Scholarly Articles Published by Library and Information Science Authors about Open Access

    Science.gov (United States)

    Grandbois, Jennifer; Beheshti, Jamshid

    2014-01-01

    Introduction: This study aims to gain a greater understanding of the development of open access practices amongst library and information science authors, since their role is integral to the success of the broader open access movement. Method: Data were collected from scholarly articles about open access by library and information science authors…

  14. The Tanenbaum Open Science Institute: Leading a Paradigm Shift at the Montreal Neurological Institute.

    Science.gov (United States)

    Poupon, Viviane; Seyller, Annabel; Rouleau, Guy A

    2017-08-30

    The Montreal Neurological Institute is adopting an Open Science Policy that will be enacted by the Tanenbaum Open Science Institute. The aim is to accelerate the generation of knowledge and novel effective treatments for brain disorders by freeing science. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Future opportunities and future trends for e-infrastructures and life sciences: going beyond grid to enable life science data analysis

    Directory of Open Access Journals (Sweden)

    Fotis ePsomopoulos

    2015-06-01

    Full Text Available With the increasingly rapid growth of data in Life Sciences we are witnessing a major transition in the way research is conducted, from hypothesis-driven studies to data-driven simulations of whole systems. In the context of the European Grid Infrastructure Community Forum 2014 (Helsinki, 19–23 May 2014), a workshop was held aimed at understanding the state of the art of Grid/Cloud computing in EU research as viewed from within the field of Life Sciences. The workshop brought together Life Science researchers and infrastructure providers from around Europe and facilitated networking between them within the context of EGI. The first part of the workshop included talks from key infrastructures and projects within the Life Sciences community. This was complemented by technical talks that established the key aspects present in major research approaches. Finally, the discussion phase provided significant insights into the road ahead with proposals for possible collaborations and suggestions for future actions.

  16. Cyberinfrastructure for Open Science at the Montreal Neurological Institute.

    Science.gov (United States)

    Das, Samir; Glatard, Tristan; Rogers, Christine; Saigle, John; Paiva, Santiago; MacIntyre, Leigh; Safi-Harab, Mouna; Rousseau, Marc-Etienne; Stirling, Jordan; Khalili-Mahani, Najmeh; MacFarlane, David; Kostopoulos, Penelope; Rioux, Pierre; Madjar, Cecile; Lecours-Boucher, Xavier; Vanamala, Sandeep; Adalat, Reza; Mohaddes, Zia; Fonov, Vladimir S; Milot, Sylvain; Leppert, Ilana; Degroot, Clotilde; Durcan, Thomas M; Campbell, Tara; Moreau, Jeremy; Dagher, Alain; Collins, D Louis; Karamchandani, Jason; Bar-Or, Amit; Fon, Edward A; Hoge, Rick; Baillet, Sylvain; Rouleau, Guy; Evans, Alan C

    2016-01-01

    Data sharing is becoming more of a requirement as technologies mature and as global research and communications diversify. As a result, researchers are looking for practical solutions, not only to enhance scientific collaborations, but also to acquire larger amounts of data, and to access specialized datasets. In many cases, the realities of data acquisition present a significant burden, therefore gaining access to public datasets allows for more robust analyses and broadly enriched data exploration. To answer this demand, the Montreal Neurological Institute has announced its commitment to Open Science, harnessing the power of making both clinical and research data available to the world (Owens, 2016a,b). As such, the LORIS and CBRAIN (Das et al., 2016) platforms have been tasked with the technical challenges specific to the institutional-level implementation of open data sharing, including:
    • Comprehensive linking of multimodal data (phenotypic, clinical, neuroimaging, biobanking, genomics, etc.).
    • Secure database encryption, specifically designed for institutional and multi-project data sharing, ensuring subject confidentiality (using multi-tiered identifiers).
    • Querying capabilities with multiple levels of single-study and institutional permissions, allowing public data sharing for all consented and de-identified subject data.
    • Configurable pipelines and flags to facilitate acquisition and analysis, as well as access to High Performance Computing clusters for rapid data processing and sharing of software tools.
    • Robust workflows and quality control mechanisms ensuring transparency and consistency in best practices.
    • Long-term storage (and web access) of data, reducing loss of institutional data assets.
    • Enhanced web-based visualization of imaging, genomic, and phenotypic data, allowing for real-time viewing and manipulation of data from anywhere in the world.
    • Numerous modules for data filtering, summary statistics, and personalized and configurable dashboards.
    Implementing
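The multi-tiered identifiers mentioned in the abstract can be illustrated by a keyed-hash pseudonymization scheme; this is a generic sketch of the idea, not the actual LORIS implementation, and the identifiers and key names are invented:

```python
import hashlib
import hmac

def tier_id(subject_id, project_key):
    """Derive a project-scoped pseudonym from an internal subject ID.
    Different projects get unlinkable identifiers, while the same project
    always sees the same pseudonym (hypothetical scheme, not LORIS's)."""
    digest = hmac.new(project_key, subject_id.encode(), hashlib.sha256)
    return digest.hexdigest()[:12]

internal = "MNI-000123"  # invented internal identifier
print(tier_id(internal, b"project-A-secret"))
print(tier_id(internal, b"project-B-secret"))  # different, unlinkable
```

Keying the hash per project means a leaked pseudonym from one study cannot be joined against another study's data without the corresponding key.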

  17. A national upgrade of the climate monitoring grid in Sri Lanka. The place of Open Design, OSHW and FOSS.

    Science.gov (United States)

    Chemin, Yann; Bandara, Niroshan; Eriyagama, Nishadi

    2015-04-01

    The National Climate Observatory of Sri Lanka is a proposition designed for the Government of Sri Lanka in September and discussed with private and public stakeholders in November 2014. The idea was initially to install a networked grid of weather instruments built from locally made open source hardware technology, on land and at sea, that report the state of the climate live. After initial stakeholder meetings, it was agreed to first try to connect the existing weather stations of different governmental and private sector agencies. This would bring existing information to a common ground through the Internet. At this point, it was realized that extracting information from the various vendors' setups would take a large amount of effort, although it is still the best and fastest route, as ownership and maintenance considerations are the most important issues in a tropical humid country such as Sri Lanka. Thus, the question of Open Design, open source hardware (OSHW) and free and open source software (FOSS) became a pivotal element in considering the operationalization of any future elements of a national grid. The reasons range from ownership to low cost and customization, but prominently it is about technology ownership, royalty-free licensing and local availability. Building on previous work (Chemin and Bandara, 2014), we proposed open design specifications and prototypes for weather monitoring for various kinds of needs; the Meteorological Department clearly specified that the variable with the highest spatial variability observed in Sri Lanka is rainfall, and stated its willingness to investigate OSHW electronics using its new team of electronics and sensor specialists. A local manufacturer is providing an OSHW micro-controller product, a start-up is providing additional sensor boards under OSHW specifications, and local manufacture of the sensors (tipping-bucket rain gauges and wind sensors) is under development; blueprints have been made available in the Public Domain for CNC machine, 3D printing or Plastic
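For the tipping-bucket rain gauge mentioned in the abstract, the firmware-side arithmetic is simple: each tip of the bucket corresponds to a fixed rainfall depth, so the micro-controller only needs to count reed-switch pulses. A sketch, with an assumed 0.2 mm-per-tip calibration rather than the project's actual value:

```python
def rainfall_mm(tip_count, mm_per_tip=0.2):
    """Convert tipping-bucket pulse counts to rainfall depth.
    0.2 mm per tip is a common bucket calibration (assumed here,
    not taken from the Sri Lanka design)."""
    return tip_count * mm_per_tip

# A micro-controller would increment the count on each reed-switch pulse;
# here we just simulate an hour's worth of tips.
tips_this_hour = 37
print(round(rainfall_mm(tips_this_hour), 1))
```

Keeping the per-tip calibration as a single constant is what makes locally manufactured buckets practical: each unit can be bench-calibrated and its constant flashed into the firmware.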

  18. How FOSTER supports training Open Science in the GeoSciences

    Science.gov (United States)

    Orth, Astrid

    2016-04-01

    FOSTER (1) is about promoting and facilitating the adoption of Open Science by the European research community, and fostering compliance with the open access policies set out in Horizon 2020 (H2020). FOSTER aims to reach out and provide training to the wide range of disciplines and countries involved in the European Research Area (ERA) by offering and supporting face-to-face as well as distance training. Different stakeholders, mainly young researchers, are trained to integrate Open Science in their daily workflow, supporting researchers to optimise their research visibility and impact. Strengthening institutional training capacity is achieved through a train-the-trainers approach. The two-and-a-half-year project started in February 2014 with identifying, enriching and providing training content on all relevant topics in the area of Open Science. One of the main elements was to support two rounds of trainings, which were conducted during 2014 and 2015, organizing more than 100 training events with around 3000 participants. The presentation will explain the project objectives and results and will look into best practice training examples, among them successful training series in the GeoSciences. The FOSTER portal that now holds a collection of training resources (e.g. slides and PDFs, schedules and design of training events dedicated to different audiences, video captures of complete events) is presented. It provides easy ways to identify learning materials and to create own e-learning courses based on the materials and examples. (1) FOSTER is funded through the European Union's Seventh Framework Programme for research, technological development and demonstration under grant agreement no 612425. http://fosteropenscience.eu

  19. Grid interoperability: joining grid information systems

    International Nuclear Information System (INIS)

    Flechl, M; Field, L

    2008-01-01

    A grid is defined as being 'coordinated resource sharing and problem solving in dynamic, multi-institutional virtual organizations'. Over recent years a number of grid projects, many of which have a strong regional presence, have emerged to help coordinate institutions and enable grids. Today, we face a situation where a number of grid projects exist, most of which are using slightly different middleware. Grid interoperation is trying to bridge these differences and enable Virtual Organizations to access resources at the institutions independently of their grid project affiliation. Grid interoperation is usually a bilateral activity between two grid infrastructures. Recently within the Open Grid Forum, the Grid Interoperability Now (GIN) Community Group is trying to build upon these bilateral activities. The GIN group is a focal point where all the infrastructures can come together to share ideas and experiences on grid interoperation. It is hoped that each bilateral activity will bring us one step closer to the overall goal of a uniform grid landscape. A fundamental aspect of a grid is the information system, which is used to find available grid services. As different grids use different information systems, interoperation between these systems is crucial for grid interoperability. This paper describes the work carried out to overcome these differences between a number of grid projects and the experiences gained. It focuses on the different techniques used and highlights the important areas for future standardization.
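At its simplest, interoperating two grid information systems means translating service records from one schema's attribute names to the other's. A toy sketch; the field names below are illustrative, not the actual GLUE schema or any project's real attribute set:

```python
# A toy translator between two grid information schemas (field names are
# illustrative, not the actual attribute sets of any middleware).
FIELD_MAP = {          # schema A -> schema B
    "SiteName": "site",
    "ServiceType": "type",
    "Endpoint": "uri",
}

def translate(record_a):
    """Re-key a schema-A service record into schema B, dropping
    anything the target schema has no slot for."""
    return {FIELD_MAP[k]: v for k, v in record_a.items() if k in FIELD_MAP}

rec = {"SiteName": "CERN-PROD", "ServiceType": "SRM",
       "Endpoint": "httpg://srm.cern.ch", "LocalOnly": "x"}
print(translate(rec))
```

Real interoperation work is harder than this re-keying, since attributes rarely map one-to-one and semantics differ, which is exactly why common standards are the paper's end goal.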

  20. Grid Security

    CERN Multimedia

    CERN. Geneva

    2004-01-01

    The aim of Grid computing is to enable the easy and open sharing of resources between large and highly distributed communities of scientists and institutes across many independent administrative domains. Convincing site security officers and computer centre managers to allow this to happen in view of today's ever-increasing Internet security problems is a major challenge. Convincing users and application developers to take security seriously is equally difficult. This paper will describe the main Grid security issues, both in terms of technology and policy, that have been tackled over recent years in LCG and related Grid projects. Achievements to date will be described and opportunities for future improvements will be addressed.

  1. Pre-Service Teachers' Attitudes toward Teaching Science and Their Science Learning at Indonesia Open University

    Science.gov (United States)

    Suprapto, Nadi; Mursid, Ali

    2017-01-01

    This study focuses on attitudes toward (teaching) science and the learning of science for primary school among pre-service teachers at the Open University of Indonesia. A three-year longitudinal survey was conducted, involving 379 students as pre-service teachers (PSTs) from the Open University in Surabaya regional office. Attitudes toward…

  2. ROSA P : The National Transportation Library’s Repository and Open Science Access Portal

    Science.gov (United States)

    2018-01-01

    The National Transportation Library (NTL) was founded as an all-digital repository of US DOT research reports, technical publications and data products. NTL's primary public offering is ROSA P, the Repository and Open Science Access Portal. An open...

  3. ISOGA: Integrated Services Optical Grid Architecture for Emerging E-Science Collaborative Applications

    Energy Technology Data Exchange (ETDEWEB)

    Oliver Yu

    2008-11-28

    This final report describes the accomplishments in the ISOGA (Integrated Services Optical Grid Architecture) project. ISOGA enables efficient deployment of existing and emerging collaborative grid applications with increasingly diverse multimedia communication requirements over a wide-area multi-domain optical network grid; and enables collaborative scientists with fast retrieval and seamless browsing of distributed scientific multimedia datasets over a wide-area optical network grid. The project focuses on research and development in the following areas: the polymorphic optical network control planes to enable multiple switching and communication services simultaneously; the intelligent optical grid user-network interface to enable user-centric network control and monitoring; and the seamless optical grid dataset browsing interface to enable fast retrieval of local/remote dataset for visualization and manipulation.

  4. Mobile Open-Source Solar-Powered 3-D Printers for Distributed Manufacturing in Off-Grid Communities

    Directory of Open Access Journals (Sweden)

    Debbie L. King

    2014-04-01

    Full Text Available Manufacturing in areas of the developing world that lack electricity severely restricts the technical sophistication of what is produced. More than a billion people with no access to electricity still have access to some imported higher technologies; however, these often lack customization and appropriateness for their community. Open source appropriate technology (OSAT) can overcome this challenge, but one of the key impediments to the more rapid development and distribution of OSAT is the lack of means of production beyond a specific technical complexity. This study designs and demonstrates the technical viability of two open-source mobile digital manufacturing facilities powered with solar photovoltaics, and capable of printing customizable OSAT in any community with access to sunlight. The first, designed for community use, such as in schools or makerspaces, is semi-mobile and capable of nearly continuous 3-D printing using RepRap technology, while also powering multiple computers. The second design, which can be completely packed into a standard suitcase, allows for specialist travel from community to community to provide the ability to custom manufacture OSAT as needed, anywhere. These designs not only bring the possibility of complex manufacturing and replacement part fabrication to isolated rural communities lacking access to the electric grid, but they also offer the opportunity to leap-frog the entire conventional manufacturing supply chain, while radically reducing both the cost and the environmental impact of products for developing communities.

  5. Integrating Grid Services into the Cray XT4 Environment

    OpenAIRE

    Cholia, Shreyas

    2009-01-01

    The 38,640-core Cray XT4 "Franklin" system at the National Energy Research Scientific Computing Center (NERSC) is a massively parallel resource available to Department of Energy researchers that also provides on-demand grid computing to the Open Science Grid. The integration of grid services on Franklin presented various challenges, including fundamental differences between the interactive and compute nodes, a stripped down compute-node operating system without dynamic library support, a share...

  6. Making USGS Science Data more Open, Accessible, and Usable: Leveraging ScienceBase for Success

    Science.gov (United States)

    Chang, M.; Ignizio, D.; Langseth, M. L.; Norkin, T.

    2016-12-01

    In 2013, the White House released initiatives requiring federally funded research to be made publicly available and machine readable. In response, the U.S. Geological Survey (USGS) has been developing a unified approach to make USGS data available and open. This effort has involved the establishment of internal policies and the release of a Public Access Plan, which outlines a strategy for the USGS to move forward into the modern era in scientific data management. Originally designed as a catalog and collaborative data management platform, ScienceBase (www.sciencebase.gov) is being leveraged to serve as a robust data hosting solution for USGS researchers to make scientific data accessible. With the goal of maintaining persistent access to formal data products and developing a management approach to facilitate stable data citation, the ScienceBase Data Release Team was established to ensure the quality, consistency, and meaningful organization of USGS data through standardized workflows and best practices. These practices include the creation and maintenance of persistent identifiers for data, improving the use of open data formats, establishing permissions for read/write access, validating the quality of standards compliant metadata, verifying that data have been reviewed and approved prior to release, and connecting to external search catalogs such as the USGS Science Data Catalog (data.usgs.gov) and data.gov. The ScienceBase team is actively building features to support this effort by automating steps to streamline the process, building metrics to track site visits and downloads, and connecting published digital resources in line with USGS and Federal policy. By utilizing ScienceBase to achieve stewardship quality and employing a dedicated team to help USGS scientists improve the quality of their data, the USGS is helping to meet today's data quality management challenges and ensure that reliable USGS data are available to and reusable for the public.

  7. Next-Generation Metrics: Responsible Metrics & Evaluation for Open Science

    Energy Technology Data Exchange (ETDEWEB)

    Wilsdon, J.; Bar-Ilan, J.; Peters, I.; Wouters, P.

    2016-07-01

    Metrics evoke a mixed reaction from the research community. A commitment to using data to inform decisions makes some enthusiastic about the prospect of granular, real-time analysis of research and its wider impacts. Yet we only have to look at the blunt use of metrics such as journal impact factors, h-indices and grant income targets to be reminded of the pitfalls. Some of the most precious qualities of academic culture resist simple quantification, and individual indicators often struggle to do justice to the richness and plurality of research. Too often, poorly designed evaluation criteria are “dominating minds, distorting behaviour and determining careers” (Lawrence, 2007). Metrics hold real power: they are constitutive of values, identities and livelihoods. How to exercise that power to more positive ends has been the focus of several recent and complementary initiatives, including the San Francisco Declaration on Research Assessment (DORA), the Leiden Manifesto and The Metric Tide (a UK government review of the role of metrics in research management and assessment). Building on these initiatives, the European Commission, under its new Open Science Policy Platform, is now looking to develop a framework for responsible metrics for research management and evaluation, which can be incorporated into the successor framework to Horizon 2020. (Author)

  8. A New Open Access Journal of Marine Science and Engineering

    Directory of Open Access Journals (Sweden)

    Anthony S. Clare

    2013-03-01

    Full Text Available The oceans cover approximately 71% of the Earth’s surface and contain more than 97% of the planet’s water, representing over 100 times more liveable volume than the terrestrial habitat. Approximately fifty percent of the species on the planet occupy this ocean biome, much of which remains unexplored. The health and sustainability of the oceans are threatened by a combination of pressures associated with climate change and the ever-increasing demands we place on them for food, recreation, trade, energy and minerals. The biggest threat, however, is the pace of change to the oceans, e.g., ocean acidification, which is unprecedented in human history. Consequently, there has never been a greater need for the rapid and widespread dissemination of the outcomes of research aimed at improving our understanding of how the oceans work and solutions to their sustainable use. It is our hope that this new online, open-access Journal of Marine Science and Engineering will go some way to fulfilling this need. [...

  9. Authentic school science knowing and learning in open-inquiry science laboratories

    CERN Document Server

    Roth, Wolff-Michael

    1995-01-01

    According to John Dewey, Seymour Papert, Donald Schon, and Allan Collins, school activities, to be authentic, need to share key features with those worlds about which they teach. This book documents learning and teaching in open-inquiry learning environments, designed with the precepts of these educational thinkers in mind. The book is thus a first-hand report of knowing and learning by individuals and groups in complex open-inquiry learning environments in science. As such, it contributes to the emerging literature in this field. Secondly, it exemplifies research methods for studying such complex learning environments. The reader is thus encouraged not only to take the research findings as such, but to reflect on the process of arriving at these findings. Finally, the book is also an example of knowledge constructed by a teacher-researcher, and thus a model for teacher-researcher activity.

  10. Hazards in smart grids. Smart meters can open the door to hackers; Gefahren im intelligenten Stromnetz. Smart Meter als Einfallstor fuer Hacker-Angriffe

    Energy Technology Data Exchange (ETDEWEB)

    Gerretz, Dirk [Covisint Emea Compuware GmbH, Neu-Isenburg (Germany)

    2011-10-31

    Smart grid, smart meter, smart home: Increasingly, intelligent technologies are introduced in the energy sector. The merging of power grids and data grids is costly and requires high investments in areas that are far from the key business and key competence of public utilities. Reliable protection of smart meters is a particular challenge, as unauthorized access or manipulation may result in great financial and reputational damage. Prior to introducing smart meters, utilities should decide if they want to introduce the necessary safety technologies themselves, including hardware, software, and know-how, or if they want to rely on solutions provided by experienced market partners. They offer open, expandable and scalable platforms for comprehensive identity management and safe data exchange that have been tested in practice in several branches of industry.

  11. Open Science Strategies in Research Policies: A Comparative Exploration of Canada, the US and the UK

    Science.gov (United States)

    Lasthiotakis, Helen; Kretz, Andrew; Sá, Creso

    2015-01-01

    Several movements have emerged related to the general idea of promoting "openness" in science. Research councils are key institutions in bringing about changes proposed by these movements, as sponsors and facilitators of research. In this paper we identify the approaches used in Canada, the US and the UK to advance open science, as a…

  12. 76 FR 60564 - President's Council of Advisors on Science and Technology; Notice of Meeting: Open Regional...

    Science.gov (United States)

    2011-09-29

    ... development. Facility and infrastructure sharing. Policies that could create a fertile innovation environment...; Notice of Meeting: Open Regional Meeting of the President's Council of Advisors on Science and Technology... schedule and summary agenda for an open regional meeting of the President's Council of Advisors on Science...

  13. Indiana University receives grant from National Science Foundation to help build global grid network

    CERN Multimedia

    2001-01-01

    The NSF awarded a consortium of 15 universities $13.65 million to build the International Virtual Data Grid Laboratory, or iVDGL. The iVDGL will consist of a seamless network of thousands of computers at 40 locations in the US, Europe and Asia. These computers will work together as a powerful grid capable of handling petabytes of data. Indiana University will make significant contributions to this project by providing a prototype Tier-2 Data Center for the ATLAS high energy physics experiment and the International Grid Operations Center.

  14. Perspectives on Open Science and scientific data sharing: an interdisciplinary workshop.

    Science.gov (United States)

    Destro Bisol, Giovanni; Anagnostou, Paolo; Capocasa, Marco; Bencivelli, Silvia; Cerroni, Andrea; Contreras, Jorge; Enke, Neela; Fantini, Bernardino; Greco, Pietro; Heeney, Catherine; Luzi, Daniela; Manghi, Paolo; Mascalzoni, Deborah; Molloy, Jennifer; Parenti, Fabio; Wicherts, Jelte; Boulton, Geoffrey

    2014-01-01

    Looking at Open Science and Open Data from a broad perspective. This is the idea behind "Scientific data sharing: an interdisciplinary workshop", an initiative designed to foster dialogue between scholars from different scientific domains which was organized by the Istituto Italiano di Antropologia in Anagni, Italy, 2-4 September 2013. We here report summaries of the presentations and discussions at the meeting. They deal with four sets of issues: (i) setting a common framework, a general discussion of open data principles, values and opportunities; (ii) insights into scientific practices, a view of the way in which the open data movement is developing in a variety of scientific domains (biology, psychology, epidemiology and archaeology); (iii) a case study of human genomics, which was a trail-blazer in data sharing, and which encapsulates the tension that can occur between large-scale data sharing and one of the boundaries of openness, the protection of individual data; (iv) open science and the public, based on a round table discussion about the public communication of science and the societal implications of open science. There were three proposals for the planning of further interdisciplinary initiatives on open science. Firstly, there is a need to integrate top-down initiatives by governments, institutions and journals with bottom-up approaches from the scientific community. Secondly, more should be done to popularize the societal benefits of open science, not only in providing the evidence needed by citizens to draw their own conclusions on scientific issues that are of concern to them, but also explaining the direct benefits of data sharing in areas such as the control of infectious disease. Finally, introducing arguments from social sciences and humanities in the educational dissemination of open data may help students become more profoundly engaged with Open Science and look at science from a broader perspective.

  15. SemMat: Federated Semantic Services Platform for Open materials Science and Engineering

    Science.gov (United States)

    2017-01-01

    Final technical report (Wright State University, January 2017; period of performance July 2013 to June 2016) on SemMat, a federated semantic services platform for open materials science and engineering. The project developed semantic models to represent materials data, providing a data exchange scheme for materials science that also includes provenance information.

  16. Reliable multicast for the Grid: a case study in experimental computer science.

    Science.gov (United States)

    Nekovee, Maziar; Barcellos, Marinho P; Daw, Michael

    2005-08-15

    In its simplest form, multicast communication is the process of sending data packets from a source to multiple destinations in the same logical multicast group. IP multicast allows the efficient transport of data through wide-area networks, and its potentially great value for the Grid has been highlighted recently by a number of research groups. In this paper, we focus on the use of IP multicast in Grid applications, which require high-throughput reliable multicast. These include Grid-enabled computational steering and collaborative visualization applications, and wide-area distributed computing. We describe the results of our extensive evaluation studies of state-of-the-art reliable-multicast protocols, which were performed on the UK's high-speed academic networks. Based on these studies, we examine the ability of current reliable multicast technology to meet the Grid's requirements and discuss future directions.

  17. Open Data in Biomedical Science: Policy Drivers and Recent ...

    Science.gov (United States)

    EPA's progress in implementing the open data initiatives first outlined in the 2009 Presidential memorandum on open government, and more specifically regarding publications and data from publications in the 2013 Holdren memorandum. The presentation outlines the major points in both memorandums regarding open data and presents several (but not exhaustive) EPA initiatives on open data, some of which occurred well before both policy memorandums. The presentation concludes by outlining the initiatives to ensure public access to all EPA publications through PubMed Central and all publication-associated data through the Environmental Data Gateway and Data.gov.

  18. Interoperation of World-Wide Production e-Science Infrastructures

    CERN Document Server

    Riedel, M; Soddemann, T; Field, L; Navarro, JP; Casey, J; Litmaath, M; Baud, J; Koblitz, B; Catlett, C; Skow, D; Wang, S; Saeki, Y; Sato, H; Matsuoka, S; Geddes, N

    Many production Grid and e-Science infrastructures have begun to offer services to end-users during the past several years with an increasing number of scientific applications that require access to a wide variety of resources and services in multiple Grids. Therefore, the Grid Interoperation Now—Community Group of the Open Grid Forum—organizes and manages interoperation efforts among those production Grid infrastructures to reach the goal of a world-wide Grid vision on a technical level in the near future. This contribution highlights fundamental approaches of the group and discusses open standards in the context of production e-Science infrastructures.

  19. OGC and Grid Interoperability in enviroGRIDS Project

    Science.gov (United States)

    Gorgan, Dorian; Rodila, Denisa; Bacu, Victor; Giuliani, Gregory; Ray, Nicolas

    2010-05-01

    EnviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is a 4-year FP7 project aiming to address the subjects of ecologically unsustainable development and inadequate resource management. The project develops a Spatial Data Infrastructure of the Black Sea Catchment region. The geospatial technologies offer very specialized functionality for Earth Science oriented applications as well as the Grid oriented technology that is able to support distributed and parallel processing. One challenge of the enviroGRIDS project is the interoperability between geospatial and Grid infrastructures by providing the basic and the extended features of both technologies. The geospatial interoperability technology has been promoted as a way of dealing with large volumes of geospatial data in distributed environments through the development of interoperable Web service specifications proposed by the Open Geospatial Consortium (OGC), with applications spread across multiple fields but especially in Earth observation research. Due to the huge volumes of data available in the geospatial domain and the additional issues introduced (data management, secure data transfer, data distribution and data computation), the need for an infrastructure capable of managing all those problems becomes an important aspect. The Grid promotes and facilitates the secure interoperation of geospatial heterogeneous distributed data within a distributed environment, the creation and management of large distributed computational jobs, and assures a security level for communication and transfer of messages based on certificates. This presentation analyses and discusses the most significant use cases for enabling the OGC Web services interoperability with the Grid environment and focuses on the description and implementation of the most promising one.
In these use cases we give special attention to issues such as: the relations between computational grid and

  20. Virtual Labs (Science Gateways) as platforms for Free and Open Source Science

    Science.gov (United States)

    Lescinsky, David; Car, Nicholas; Fraser, Ryan; Friedrich, Carsten; Kemp, Carina; Squire, Geoffrey

    2016-04-01

    The Free and Open Source Software (FOSS) movement promotes community engagement in software development, as well as provides access to a range of sophisticated technologies that would be prohibitively expensive if obtained commercially. However, as geoinformatics and eResearch tools and services become more dispersed, it becomes more complicated to identify and interface between the many required components. Virtual Laboratories (VLs, also known as Science Gateways) simplify the management and coordination of these components by providing a platform linking many, if not all, of the steps in particular scientific processes. These enable scientists to focus on their science, rather than the underlying supporting technologies. We describe a modular, open source, VL infrastructure that can be reconfigured to create VLs for a wide range of disciplines. Development of this infrastructure has been led by CSIRO in collaboration with Geoscience Australia and the National Computational Infrastructure (NCI) with support from the National eResearch Collaboration Tools and Resources (NeCTAR) and the Australian National Data Service (ANDS). Initially, the infrastructure was developed to support the Virtual Geophysical Laboratory (VGL), and has subsequently been repurposed to create the Virtual Hazards Impact and Risk Laboratory (VHIRL) and the reconfigured Australian National Virtual Geophysics Laboratory (ANVGL). During each step of development, new capabilities and services have been added and/or enhanced. We plan on continuing to follow this model using a shared, community code base. The VL platform facilitates transparent and reproducible science by providing access to both the data and methodologies used during scientific investigations. This is further enhanced by the ability to set up and run investigations using computational resources accessed through the VL. 
Data is accessed using registries pointing to catalogues within public data repositories (notably including the

  1. Future opportunities and trends for e-infrastructures and life sciences: going beyond the grid to enable life science data analysis.

    Science.gov (United States)

    Duarte, Afonso M S; Psomopoulos, Fotis E; Blanchet, Christophe; Bonvin, Alexandre M J J; Corpas, Manuel; Franc, Alain; Jimenez, Rafael C; de Lucas, Jesus M; Nyrönen, Tommi; Sipos, Gergely; Suhr, Stephanie B

    2015-01-01

    With the increasingly rapid growth of data in life sciences we are witnessing a major transition in the way research is conducted, from hypothesis-driven studies to data-driven simulations of whole systems. Such approaches necessitate the use of large-scale computational resources and e-infrastructures, such as the European Grid Infrastructure (EGI). EGI, one of the key enablers of the digital European Research Area, is a federation of resource providers set up to deliver sustainable, integrated and secure computing services to European researchers and their international partners. Here we aim to provide the state of the art of Grid/Cloud computing in EU research as viewed from within the field of life sciences, focusing on key infrastructures and projects within the life sciences community. Rather than focusing purely on the technical aspects underlying the currently provided solutions, we outline the design aspects and key characteristics that can be identified across major research approaches. Overall, we aim to provide significant insights into the road ahead by establishing ever-strengthening connections between EGI as a whole and the life sciences community.

  2. Digital platforms for research collaboration: using design science in developing a South African open knowledge repository

    CSIR Research Space (South Africa)

    van Biljon, J

    2017-05-01

    Full Text Available ... enabled collaboration through the design and development of a sustainable open knowledge repository (OKR) according to the design science research (DSR) paradigm. OKRs are tools used to support knowledge sharing and collaboration. The theoretical...

  3. Open-access databases as unprecedented resources and drivers of cultural change in fisheries science

    Energy Technology Data Exchange (ETDEWEB)

    McManamay, Ryan A [ORNL; Utz, Ryan [National Ecological Observatory Network

    2014-01-01

    Open-access databases with utility in fisheries science have grown exponentially in quantity and scope over the past decade, with profound impacts to our discipline. The management, distillation, and sharing of an exponentially growing stream of open-access data represents several fundamental challenges in fisheries science. Many of the currently available open-access resources may not be universally known among fisheries scientists. We therefore introduce many national- and global-scale open-access databases with applications in fisheries science and provide an example of how they can be harnessed to perform valuable analyses without additional field efforts. We also discuss how the development, maintenance, and utilization of open-access data are likely to pose technical, financial, and educational challenges to fisheries scientists. Such cultural implications that will coincide with the rapidly increasing availability of free data should compel the American Fisheries Society to actively address these problems now to help ease the forthcoming cultural transition.

  4. On the evolving open peer review culture for chemical information science.

    Science.gov (United States)

    Walters, W Patrick; Bajorath, Jürgen

    2015-01-01

    Compared to the traditional anonymous peer review process, open post-publication peer review provides additional opportunities, and challenges, for reviewers to judge scientific studies. In this editorial, we comment on the open peer review culture and provide some guidance for reviewers of manuscripts submitted to the Chemical Information Science channel of F1000Research.

  5. Perspectives on open science and scientific data sharing: an interdisciplinary workshop

    NARCIS (Netherlands)

    Destro Bisol, G.; Anagnostou, P.; Capocasa, M.; Bencivelli, S.; Cerroni, A.; Contreras, J.; Enke, N.; Fantini, B.; Greco, P.; Heeney, C.; Luzi, D.; Manghi, P.; Mascalzoni, D.; Molloy, J.; Parenti, F.; Wicherts, J.M.; Boulton, G.

    2014-01-01

    Looking at Open Science and Open Data from a broad perspective. This is the idea behind “Scientific data sharing: an interdisciplinary workshop”, an initiative designed to foster dialogue between scholars from different scientific domains which was organized by the Istituto Italiano di Antropologia

  6. OSG Director reports on grid progress

    CERN Multimedia

    Pordes, Ruth

    2006-01-01

    "In this Q&A from the Open Science Grid (OSG), executive director Ruth Pordes provides a brief history of the OSG, an overview of current projects and partners, and a glimpse at future plans, including how the recent $30 million award from the DOE's Office of Science and the NSF will be employed. She also shares her thoughts on SC, saying the personal contacts are the best part." (4.5 pages)

  7. 'Open SESAME' for science in the Middle-East

    CERN Multimedia

    2003-01-01

    A memorandum of understanding has just been signed between CERN, SESAME and Jordan. SESAME, the international centre for Synchrotron light for Experimental Science and Applications in the Middle East, is currently being built in Jordan. Its President of Council is none other than CERN's former Director-General, Herwig Schopper.

  8. 75 FR 9876 - Science Advisory Board; Notice of Open Meeting

    Science.gov (United States)

    2010-03-04

    ..., each individual or group making a verbal presentation will be limited to a total time of five (5... Science Advisory Board (SAB) was established by a Decision Memorandum dated September 25, 1997, and is the... resource management. Time and Date: The meeting will be held Tuesday March 23, 2010, from 8:30 a.m. to 5:30...

  9. Data grids a new computational infrastructure for data-intensive science

    CERN Document Server

    Avery, P

    2002-01-01

    Twenty-first-century scientific and engineering enterprises are increasingly characterized by their geographic dispersion and their reliance on large data archives. These characteristics bring with them unique challenges. First, the increasing size and complexity of modern data collections require significant investments in information technologies to store, retrieve and analyse them. Second, the increased distribution of people and resources in these projects has made resource sharing and collaboration across significant geographic and organizational boundaries critical to their success. In this paper I explore how computing infrastructures based on data grids offer data-intensive enterprises a comprehensive, scalable framework for collaboration and resource sharing. A detailed example of a data grid framework is presented for a Large Hadron Collider experiment, where a hierarchical set of laboratory and university resources comprising petaflops of processing power and a multi- petabyte data archive must be ...

  10. Campus Grids: Bringing Additional Computational Resources to HEP Researchers

    International Nuclear Information System (INIS)

    Weitzel, Derek; Fraser, Dan; Bockelman, Brian; Swanson, David

    2012-01-01

    It is common at research institutions to maintain multiple clusters that represent different owners or generations of hardware, or that fulfill different needs and policies. Many of these clusters are consistently underutilized while researchers on campus could greatly benefit from these unused capabilities. By leveraging principles from the Open Science Grid it is now possible to utilize these resources by forming a lightweight campus grid. The campus grids framework enables jobs that are submitted to one cluster to overflow, when necessary, to other clusters within the campus using whatever authentication mechanisms are available on campus. This framework is currently being used on several campuses to run HEP and other science jobs. Further, the framework has in some cases been expanded beyond the campus boundary by bridging campus grids into a regional grid, and can even be used to integrate resources from a national cyberinfrastructure such as the Open Science Grid. This paper will highlight 18 months of operational experience creating campus grids in the US, and the different campus configurations that have successfully utilized the campus grid infrastructure.

  11. The Open Microscopy Environment: open image informatics for the biological sciences

    Science.gov (United States)

    Blackburn, Colin; Allan, Chris; Besson, Sébastien; Burel, Jean-Marie; Carroll, Mark; Ferguson, Richard K.; Flynn, Helen; Gault, David; Gillen, Kenneth; Leigh, Roger; Leo, Simone; Li, Simon; Lindner, Dominik; Linkert, Melissa; Moore, Josh; Moore, William J.; Ramalingam, Balaji; Rozbicki, Emil; Rustici, Gabriella; Tarkowska, Aleksandra; Walczysko, Petr; Williams, Eleanor; Swedlow, Jason R.

    2016-07-01

    Despite significant advances in biological imaging and analysis, major informatics challenges remain unsolved: file formats are proprietary, storage and analysis facilities are lacking, as are standards for sharing image data and results. While the open FITS file format is ubiquitous in astronomy, astronomical imaging shares many challenges with biological imaging, including the need to share large image sets using secure, cross-platform APIs, and the need for scalable applications for processing and visualization. The Open Microscopy Environment (OME) is an open-source software framework developed to address these challenges. OME tools include: an open data model for multidimensional imaging (OME Data Model); an open file format (OME-TIFF) and library (Bio-Formats) enabling free access to images (5D+) written in more than 145 formats from many imaging domains, including FITS; and a data management server (OMERO). The Java-based OMERO client-server platform comprises an image metadata store, an image repository, visualization and analysis by remote access, allowing sharing and publishing of image data. OMERO provides a means to manage the data through a multi-platform API. OMERO's model-based architecture has enabled its extension into a range of imaging domains, including light and electron microscopy, high content screening, digital pathology and recently into applications using non-image data from clinical and genomic studies. This is made possible using the Bio-Formats library. The current release includes a single mechanism for accessing image data of all types, regardless of original file format, via Java, C/C++ and Python and a variety of applications and environments (e.g. ImageJ, Matlab and R).

  12. The OpenEarth Framework (OEF) for the 3D Visualization of Integrated Earth Science Data

    Science.gov (United States)

    Nadeau, David; Moreland, John; Baru, Chaitan; Crosby, Chris

    2010-05-01

    Data integration is increasingly important as we strive to combine data from disparate sources and assemble better models of the complex processes operating at the Earth's surface and within its interior. These data are often large, multi-dimensional, and subject to differing conventions for data structures, file formats, coordinate spaces, and units of measure. When visualized, these data require differing, and sometimes conflicting, conventions for visual representations, dimensionality, symbology, and interaction. All of this makes the visualization of integrated Earth science data particularly difficult. The OpenEarth Framework (OEF) is an open-source data integration and visualization suite of applications and libraries being developed by the GEON project at the University of California, San Diego, USA. Funded by the NSF, the project is leveraging virtual globe technology from NASA's WorldWind to create interactive 3D visualization tools that combine and layer data from a wide variety of sources to create a holistic view of features at, above, and beneath the Earth's surface. The OEF architecture is open, cross-platform, modular, and based upon Java. The OEF's modular approach to software architecture yields an array of mix-and-match software components for assembling custom applications. Available modules support file format handling, web service communications, data management, user interaction, and 3D visualization. File parsers handle a variety of formal and de facto standard file formats used in the field. Each one imports data into a general-purpose common data model supporting multidimensional regular and irregular grids, topography, feature geometry, and more. Data within these data models may be manipulated, combined, reprojected, and visualized. The OEF's visualization features support a variety of conventional and new visualization techniques for looking at topography, tomography, point clouds, imagery, maps, and feature geometry. 3D data such as
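    The "general-purpose common data model" mentioned above can be sketched in miniature. The class below is illustrative only, not the OEF API: it shows the design idea that every file parser, whatever its input format, normalizes data into one shared grid representation that downstream visualization modules consume.

```python
# Minimal sketch (hypothetical names, not the OEF API) of a common data
# model for a multidimensional regular grid with named axes and units.
from dataclasses import dataclass

@dataclass
class RegularGrid:
    axes: tuple      # axis names, e.g. ("lon", "lat", "depth")
    origin: tuple    # coordinate of the first sample on each axis
    spacing: tuple   # sample spacing along each axis
    values: list     # flattened samples, C order
    units: str = ""

    def coordinate(self, index):
        """Physical coordinate of a sample given its per-axis index."""
        return tuple(o + i * s
                     for o, i, s in zip(self.origin, index, self.spacing))

# Any file parser maps its own format into this one model:
grid = RegularGrid(axes=("x", "y"), origin=(0.0, 10.0),
                   spacing=(0.5, 2.0), values=[0.0] * 12)
print(grid.coordinate((3, 2)))  # (1.5, 14.0)
```

    With one such model, reprojection, combination, and rendering code is written once rather than once per file format — the mix-and-match modularity the OEF abstract describes.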

  13. Supporting science in developing countries using open technologies

    International Nuclear Information System (INIS)

    Canessa, Enrique; Zennaro, Marco; Fonda, Carlo

    2009-01-01

    We describe our contributions in using information and communication technologies (ICT) to address the digital and knowledge divides in developing regions. These include the implementation of new prototype systems using state-of-the-art, low-cost technologies, chosen according to the scientific audience, the local information technology infrastructure and the level of support available from local technical staff. Efforts are made to provide local researchers with the necessary capacity and know-how to understand and manage their available information infrastructure, with the final goal of supporting their science and allowing participation at an international level.

  14. Promoting open access to science through effective communication

    Science.gov (United States)

    Egger, A. E.

    2006-12-01

    Geology is a difficult subject to communicate effectively. Many people associate geology with memorizing rock and mineral names and not with dynamic earth processes. Even more challenging for the non-geologist is the concept of deep time, and why processes that happened millions of years ago are important to us today. Additionally, many people view science itself as inaccessible and difficult. And yet, geology is a naturally accessible subject, as it is all around us. In order to communicate effectively, geologists must convince others that their work is both accessible and relevant, even though it may not directly generate economic benefits or lend insight into solutions for our modern problems like climate change. As scientists, we know the connections are there, but convincing others requires creating face-to-face, positive interactions through the use of active techniques to help bring the audience to an understanding of the process of science in addition to the subject matter itself. My overarching motive for creating and participating in communication activities with a broad audience is thus to demonstrate that science is accessible to everyone, that a scientific way of thinking can be both fun and useful, and that a little knowledge about geology can give you a new perspective on the world. Using this motivation as a guiding principle regardless of the specific audience, two techniques are important to make the communication effective. First, whenever possible, I conduct activities in the field (broadly speaking), or at least bring the field into the talk, and model the scientific process by asking for participation. This allows the audience to fully understand how geologic work is done, including the mundane and the mistakes. Second, I take my audience seriously, including all questions and observations, in order to build confidence in everyone that they are able to contribute to and understand both geology and the scientific process in general. 

  15. Analysis gets on the starting Grid

    CERN Multimedia

    Roger Jones

    It is vital for ATLAS to have a functioning distributed analysis system to analyse its data. There are three major Grid deployments in ATLAS (Enabling Grids for E-sciencE, EGEE; the US Open Science Grid, OSG; and the Nordic DataGrid Facility, NDGF), and our data and jobs need to work across all of them, as well as on local machines and batch systems. Users must also be able to locate the data they want and register new small datasets so they can be used later. ATLAS has a suite of products to meet these needs, and a series of Distributed Analysis tutorials are training an increasing number of brave early adopters to use the system. Real users are vital to make sure that the tools are fit for their purpose and to refine our computing model. One such tutorial happened on the 1st and 2nd February at the National eScience Centre in Edinburgh, UK, sponsored by the GridPP Collaboration. The first day introduced an international set of tutees to the basic tools for Grid-based distributed analysis. The architecture...

  16. SEE-GRID eInfrastructure for Regional eScience

    Science.gov (United States)

    Prnjat, Ognjen; Balaz, Antun; Vudragovic, Dusan; Liabotis, Ioannis; Sener, Cevat; Marovic, Branko; Kozlovszky, Miklos; Neagu, Gabriel

    In the past 6 years, a number of targeted initiatives, funded by the European Commission via its information society and RTD programmes and Greek infrastructure development actions, have articulated successful regional development actions in South-East Europe that can serve as a role model for other international developments. The SEEREN (South-East European Research and Education Networking initiative) project, through its two phases, established the SEE segment of the pan-European GÉANT network and successfully connected the research and scientific communities in the region. Currently, the SEE-LIGHT project is working towards establishing a dark-fiber backbone that will interconnect most national Research and Education networks in the region. On the distributed computing and storage provisioning (i.e. Grid) plane, the SEE-GRID (South-East European GRID e-Infrastructure Development) project, similarly through its two phases, has established a strong human network in the area of scientific computing, has set up a powerful regional Grid infrastructure, and has attracted a number of applications from different fields from countries throughout South-East Europe. The current SEE-GRID-SCI project, ending in April 2010, empowers the regional user communities from the fields of meteorology, seismology and environmental protection in common use and sharing of the regional e-Infrastructure. Current technical initiatives in formulation are focusing on a set of coordinated actions in the area of HPC and application fields making use of HPC initiatives. Finally, the current SEERA-EI project brings together policy makers - programme managers from 10 countries in the region. The project aims to establish a communication platform between programme managers, pave the way towards a common e-Infrastructure strategy and vision, and implement concrete actions for common funding of electronic infrastructures on the regional level. The regional vision on establishing an e

  17. Successful Massive Open Online Climate Course on Climate Science and Psychology

    Science.gov (United States)

    Nuccitelli, D. A.; Cook, J.

    2015-12-01

    In 2015, the University of Queensland and edX launched a Massive Open Online Course (MOOC), 'Making Sense of Climate Science Denial.' The MOOC debunked approximately 50 common climate myths using elements of both physical science and psychology. Students learned how to recognise the social and psychological drivers of climate science denial, how to better understand climate change, how to identify the techniques and fallacies that climate myths employ to distort climate science, and how to effectively debunk climate misinformation. Contributors to the website Skeptical Science delivered the lectures, which were reinforced via interviews with climate science and psychology experts. Over 15,000 students from 167 countries enrolled in the course, and student feedback was overwhelmingly positive. This MOOC provides a model for effective climate science education.

  18. Open Access Citation Advantage in selected Information Science journals: an extended analysis to altmetrics indicators

    Directory of Open Access Journals (Sweden)

    Paulo Roberto Cintra

    2017-04-01

    Full Text Available Introduction: Open access refers to scientific literature available free of charge and free of copyright restrictions and licensing for its reuse. An increase in the total number of citations received by articles available in open access in relation to those of restricted, pay-walled access is expected, according to the Open Access Citation Advantage hypothesis. Objective: Assess the possible citation advantages and mentions on the social web that open access can offer to the Information Science area. Methodology: Bibliometric and altmetric indicators were analyzed in two journals: Journal of the American Society for Information Science and Scientometrics. Data collection was conducted in the Web of Science, Google Scholar, Altmetric.com and Mendeley. Results: The results indicated that for both journals, open access offers an advantage in the number of citations received by articles. It was also demonstrated that the advantage is maintained over time. Conclusions: This research confirmed the hypothesis of an Open Access Citation Advantage for the journals analyzed in the area of Information Science. This pattern was also observed for the altmetric data.

  19. Power grids

    International Nuclear Information System (INIS)

    Viterbo, J.

    2012-01-01

    The implementation of renewable energies represents new challenges for electrical systems. The objective: making power grids smarter so they can handle intermittent production. The advent of smart grids will allow flexible operations like distributing energy in a multidirectional manner instead of just one way, and it will make electrical systems capable of integrating actions by different users, consumers and producers in order to maintain efficient, sustainable, economical and secure power supplies. Practically speaking, they associate sensors, instrumentation and controls with information processing and communication systems in order to create massively automated networks. Smart grids require huge investments: for example, more than 7 billion dollars were invested in China and the USA in 2010, and France is ranked 9th worldwide with 265 million dollars invested. It is expected that smart grids will promote the development of new business models and a change in the value chain for energy. Decentralized production combined with the probable introduction of more or less flexible rates for sales or purchases and of new supplier-customer relationships will open the way to the creation of new businesses. (A.C.)

  20. Grid Architecture 2

    Energy Technology Data Exchange (ETDEWEB)

    Taft, Jeffrey D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-01-01

    The report describes work done on Grid Architecture under the auspices of the Department of Energy Office of Electricity Delivery and Energy Reliability in 2015. As described in the first Grid Architecture report, the primary purpose of this work is to provide stakeholder insight about grid issues so as to enable superior decision making on their part. Doing this requires the creation of various work products, including oft-times complex diagrams, analyses, and explanations. This report provides architectural insights into several important grid topics and describes work done to advance the science of Grid Architecture itself.

  1. Research, Collaboration, and Open Science Using Web 2.0

    Directory of Open Access Journals (Sweden)

    Kevin Shee

    2010-10-01

    Full Text Available There is little doubt that the Internet has transformed the world in which we live. Information that was once archived in bricks and mortar libraries is now only a click away, and people across the globe have become connected in a manner inconceivable only 20 years ago. Although many scientists and educators have embraced the Internet as an invaluable tool for research, education and data sharing, some have been somewhat slower to take full advantage of emerging Web 2.0 technologies. Here we discuss the benefits and challenges of integrating Web 2.0 applications into undergraduate research and education programs, based on our experience utilizing these technologies in a summer undergraduate research program in synthetic biology at Harvard University. We discuss the use of applications including wiki-based documentation, digital brainstorming, and open data sharing via the Web, to facilitate the educational aspects and collaborative progress of undergraduate research projects. We hope to inspire others to integrate these technologies into their own coursework or research projects.

  2. New Source Code: Spelman Women Transforming the Grid of Science and Technology

    Science.gov (United States)

    Okonkwo, Holly

    From a seminary for newly freedwomen in the 19th century "Deep South" of the United States to a "Model Institution for Excellence" in undergraduate science, technology, engineering, and math education, the narrative of Spelman College is a critical piece to understanding the overall history and socially constructed nature of science and higher education in the U.S. Making a place for science at Spelman College disrupts and redefines the presumed and acceptable roles of African American women in science and their social, political and economic engagements in U.S. society as a whole. Over the course of 16 months, I explored the narrative experiences of members of the Spelman campus community and immersed myself in the environment to experience becoming a member of a scientific community that asserts a place for women of African descent in science and technology and perceives this positionality as positive, powerful and the locus of agency. My intention is to offer this research as an in-depth ethnographic presentation of intentional science learning, knowledge production and practice as lived experiences at the multiple intersections of the constructs of race, gender, positionality and U.S. science itself. In this research, I am motivated to move the contemporary discourse of diversifying science, technology, engineering and mathematics fields in the U.S. academy beyond the chronicling of women of African descent as statistical rarities over time, and beyond the subjectivities and deficit frameworks that theoretically encapsulate their narratives. The findings of this research demonstrate that Spelman students, staff and alumni are themselves the cultural capital that validates Spelman's identity as a place and its institutional mission, and are at the core of the institutional success of the college. It is a personal mission as much as it is an institutional mission, which is precisely what makes it powerful.

  3. Distributed Geant4 simulation in medical and space science applications using DIANE framework and the GRID

    CERN Document Server

    Moscicki, J T; Mantero, A; Pia, M G

    2003-01-01

    Distributed computing is one of the most important trends in IT, which has recently gained significance for large-scale scientific applications. The Distributed Analysis Environment (DIANE) is an R&D study, focusing on semi-interactive parallel and remote data analysis and simulation, which has been conducted at CERN. DIANE provides the necessary software infrastructure for parallel scientific applications in the master-worker model. Advanced error recovery policies, automatic book-keeping of distributed jobs and on-line monitoring and control tools are provided. DIANE makes transparent use of a number of different middleware implementations such as load balancing services (LSF, PBS, GRID Resource Broker, Condor) and security services (GSI, Kerberos, openssh). A number of distributed Geant4 simulations have been deployed and tested, ranging from interactive radiotherapy treatment planning using dedicated clusters in hospitals, to globally-distributed simulations of astrophysics experiments using the European data g...
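    The master-worker model that DIANE implements can be sketched with the standard library alone. The example below is a toy, not DIANE's actual API: a master fans independent "simulation" tasks out to a worker pool and gathers the results, the same division of labour DIANE applies to distributed Geant4 jobs (the toy arithmetic stands in for real physics).

```python
# Toy master-worker sketch (illustrative only, not DIANE's API).
from concurrent.futures import ThreadPoolExecutor

def simulate(seed):
    """Stand-in for one worker job, e.g. a batch of simulated events."""
    x = seed
    for _ in range(1000):
        x = (1103515245 * x + 12345) % 2**31  # toy LCG in place of physics
    return seed, x

def master(seeds, n_workers=4):
    """Fan tasks out to the worker pool and collect results by task id."""
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        return dict(pool.map(simulate, seeds))

if __name__ == "__main__":
    results = master(range(8))
    print(sorted(results))  # the task ids that completed
```

    The pattern's appeal for semi-interactive analysis is that the master sees every job's status, so error recovery and book-keeping (which DIANE adds on top) have a natural single point of control.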

  4. Open Access Policies of Research Funders: The Case Study of the Austrian Science Fund (FWF)

    OpenAIRE

    Tonto, Yaşar; Doğan, Güleda; Al, Umut; Madran, Orçun

    2015-01-01

    The Austrian Science Fund (FWF) is the main funder for basic research in Austria. FWF has been instrumental in promoting Open Access in Austria and elsewhere and possesses a strong Open Access policy for the research it funds. This case study presents FWF as a good practice of an effective funder policy on account of its comprehensive strategy and multi-faceted approach for implementing and supporting it.

  5. Research and development of grid computing technology in center for computational science and e-systems of Japan Atomic Energy Agency

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2007-01-01

    Center for Computational Science and E-systems of the Japan Atomic Energy Agency (CCSE/JAEA) has carried out R and D of grid computing technology. Since 1995, R and D to realize computational assistance for researchers, called Seamless Thinking Aid (STA), and then to share intellectual resources, called Information Technology Based Laboratory (ITBL), have been conducted, leading to the construction of an intelligent infrastructure for atomic energy research called Atomic Energy Grid InfraStructure (AEGIS) under the Japanese national project 'Development and Applications of Advanced High-Performance Supercomputer'. It aims to enable the synchronization of three themes: 1) Computer-Aided Research and Development (CARD) to realize an environment for STA, 2) Computer-Aided Engineering (CAEN) to establish Multi Experimental Tools (MEXT), and 3) Computer-Aided Science (CASC) to promote Atomic Energy Research and Investigation (AERI). This article reviews achievements in R and D of grid computing technology obtained so far. (T. Tanaka)

  6. Design and Implementation of a Library and Information Science Open Access Journal Union Catalogue System

    Directory of Open Access Journals (Sweden)

    Sinn-Cheng Lin

    2017-03-01

    Full Text Available Open access is a mode of academic communication that has been on the rise in recent years, but open access academic resources are widely dispersed across the internet, which can make them inconvenient to use. This research focuses on library and information science, using the OAIS reference model as the system framework and two open access platforms, DOAJ and E-LIS, as the data sources, and through system implementation develops a “library and information science open access journal union catalogue” system. Using the OAI-PMH protocol as the data interoperability standard and LAMP as the development environment, four major functionalities were designed, developed, and integrated into the system: ingest, archiving, management, and access. Actual testing and verification showed that this system is able to successfully collect data from DOAJ and E-LIS open journal resources related to library and information science. The system is now active and functional, and can be used by researchers in the library and information science field.
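    The OAI-PMH harvesting step described above can be sketched with Python's standard library. The sample ListRecords response below is invented for illustration; in the running system the XML would be fetched from a repository endpoint such as DOAJ or E-LIS, but the namespaces and element names follow the OAI-PMH 2.0 and Dublin Core specifications.

```python
import xml.etree.ElementTree as ET

# Namespaces defined by the OAI-PMH 2.0 specification and Dublin Core.
NS = {
    "oai": "http://www.openarchives.org/OAI/2.0/",
    "dc": "http://purl.org/dc/elements/1.1/",
}

# Illustrative ListRecords response (in practice fetched over HTTP with
# a request such as ...?verb=ListRecords&metadataPrefix=oai_dc).
SAMPLE = """<?xml version="1.0"?>
<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <ListRecords>
    <record>
      <header><identifier>oai:example:1</identifier></header>
      <metadata>
        <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
                   xmlns:dc="http://purl.org/dc/elements/1.1/">
          <dc:title>Open Access in LIS</dc:title>
          <dc:subject>library and information science</dc:subject>
        </oai_dc:dc>
      </metadata>
    </record>
  </ListRecords>
</OAI-PMH>"""

def harvest(xml_text):
    """Extract (identifier, title) pairs from a ListRecords response."""
    root = ET.fromstring(xml_text)
    out = []
    for rec in root.findall(".//oai:record", NS):
        ident = rec.findtext("oai:header/oai:identifier", namespaces=NS)
        title = rec.findtext(".//dc:title", namespaces=NS)
        out.append((ident, title))
    return out

print(harvest(SAMPLE))  # [('oai:example:1', 'Open Access in LIS')]
```

    Harvested pairs like these would then be archived and indexed by the catalogue's archiving and access modules; real harvesters must additionally follow the protocol's resumptionToken paging.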

  7. Alchemy & algorithms: perspectives on the philosophy and history of open science

    Directory of Open Access Journals (Sweden)

    Leo Lahti

    2017-05-01

    Full Text Available This paper gives the reader a chance to experience, or revisit, PHOS16: a conference on the History and Philosophy of Open Science. In the winter of 2016, we invited a varied international group to engage with these topics at the University of Helsinki, Finland. Our aim was a critical assessment of the defining features, underlying narratives, and overall objectives of the contemporary open science movement. The event brought together contemporary open science scholars, publishers, and advocates to discuss the philosophical foundations and historical roots of openness in academic research. The eight sessions combined historical views with more contemporary perspectives on topics such as transparency, reproducibility, collaboration, publishing, peer review, research ethics, as well as societal impact and engagement. We gathered together expert panelists and 15 invited speakers who have published extensively on these topics, which allowed us to engage in a thorough and multifaceted discussion. Together with our involved audience we charted the role and foundations of openness of research in our time, considered the accumulation and dissemination of scientific knowledge, and debated the various technical, legal, and ethical challenges of the past and present. In this article, we provide an overview of the topics covered at the conference as well as individual video interviews with each speaker. In addition to this, all the talks were recorded and they are offered here as an openly licensed community resource in both video and audio form.

  8. Open science: policy implications for the evolving phenomenon of user-led scientific innovation

    Directory of Open Access Journals (Sweden)

    Victoria Stodden

    2010-03-01

    Full Text Available From contributions of astronomy data and DNA sequences to disease treatment research, scientific activity by non-scientists is a real and emergent phenomenon that raises policy questions. This involvement in science can be understood as an issue of access to publications, code, and data that facilitates public engagement in the research process; appropriate policy to support the associated welfare-enhancing benefits is therefore essential. Current legal barriers to citizen participation can be alleviated by scientists’ use of the “Reproducible Research Standard,” thus making the literature, data, and code associated with scientific results accessible. The enterprise of science is undergoing deep and fundamental changes, particularly in how scientists obtain results and share their work: the promise of open research dissemination held by the Internet is gradually being fulfilled by scientists. Contributions to science from beyond the ivory tower are forcing a rethinking of traditional models of knowledge generation, evaluation, and communication. The notion of a scientific “peer” is blurred with the advent of lay contributions to science, raising questions regarding the concepts of peer review and recognition. New collaborative models are emerging around both open scientific software and the generation of scientific discoveries that bear a similarity to open innovation models in other settings. Public engagement in science can be understood as an issue of access to knowledge for public involvement in the research process, facilitated by appropriate policy to support the welfare-enhancing benefits deriving from citizen science.

  9. Open access publishing in the biomedical sciences: could funding agencies accelerate the inevitable changes?

    Science.gov (United States)

    Glover, Steven William; Webb, Anne; Gleghorn, Colette

    2006-09-01

    Open access is making a noticeable impact on access to information. In 2005, many major research funders, including the Wellcome Trust, the National Institutes of Health (NIH), and the Research Councils UK (RCUK), set out their position in a number of statements. Of particular note was the stipulation that authors receiving grants must deposit their final manuscript in an open access forum within 6-12 months of publication. The paper will look at the open access position statements issued by some of the major funding bodies in the biomedical sciences. The paper will also look at the models used by publishers to provide open or delayed access, such as Oxford Open from Oxford University Press, HighWire Press' delayed access policy, BioMed Central, and Public Library of Science (PLoS). There are now over 1.2 million articles in PubMed that are freely accessible via publishers' websites.(1) Could funding agencies accelerate the move to open access? The list of funding agencies supporting open access is growing. The National Institutes of Health and the Wellcome Trust have been joined by many of the world's major funders in biomedical research whose goal it is to make their research findings available with no barriers.

  10. Introducing Enabling Computational Tools to the Climate Sciences: Multi-Resolution Climate Modeling with Adaptive Cubed-Sphere Grids

    Energy Technology Data Exchange (ETDEWEB)

    Jablonowski, Christiane [Univ. of Michigan, Ann Arbor, MI (United States)

    2015-07-14

    The research investigates and advances strategies for bridging the scale discrepancies between local, regional and global phenomena in climate models without the prohibitive computational costs of global cloud-resolving simulations. In particular, the research explores new frontiers in computational geoscience by introducing high-order Adaptive Mesh Refinement (AMR) techniques into climate research. AMR and statically-adapted variable-resolution approaches represent an emerging trend for atmospheric models and are likely to become the new norm in future-generation weather and climate models. The research advances the understanding of multi-scale interactions in the climate system and showcases a pathway for modeling these interactions effectively with advanced computational tools, like the Chombo AMR library developed at the Lawrence Berkeley National Laboratory. The research is interdisciplinary and combines applied mathematics, scientific computing and the atmospheric sciences. In this research project, a hierarchy of high-order atmospheric models on cubed-sphere computational grids has been developed that serves as an algorithmic prototype for the finite-volume solution-adaptive Chombo-AMR approach. The foci of the investigations have been the characteristics of both static mesh adaptations and dynamically-adaptive grids that can capture flow fields of interest, like tropical cyclones. Six research themes have been chosen. These are (1) the introduction of adaptive mesh refinement techniques into the climate sciences, (2) advanced algorithms for nonhydrostatic atmospheric dynamical cores, (3) an assessment of the interplay between resolved-scale dynamical motions and subgrid-scale physical parameterizations, (4) evaluation techniques for atmospheric model hierarchies, (5) the comparison of AMR refinement strategies and (6) tropical cyclone studies with a focus on multi-scale interactions and variable-resolution modeling.
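    The core idea of adaptive mesh refinement, independent of the Chombo library itself, can be sketched in one dimension: flag cells where the solution varies sharply, then subdivide only those cells. The sketch below is a didactic toy under those assumptions, not the high-order cubed-sphere machinery of the project.

```python
# Didactic 1D AMR sketch: flag-then-refine (illustrative only).

def flag_cells(values, threshold):
    """Flag interior cells whose undivided gradient exceeds threshold."""
    flags = [False] * len(values)
    for i in range(1, len(values) - 1):
        if abs(values[i + 1] - values[i - 1]) / 2 > threshold:
            flags[i] = True
    return flags

def refine(xs, flags):
    """Insert a midpoint after each flagged cell centre (one AMR level)."""
    out = []
    for x, f in zip(xs, flags):
        out.append(x)
        if f:
            out.append(x + 0.5)  # child cell at half the parent spacing
    return out

# A step in the data (e.g. a front or cyclone eyewall) triggers
# refinement only near the step, leaving smooth regions coarse:
values = [0, 0, 0, 1, 1, 1]
flags = flag_cells(values, threshold=0.4)
print(flags)                               # [False, False, True, True, False, False]
print(refine([0, 1, 2, 3, 4, 5], flags))   # [0, 1, 2, 2.5, 3, 3.5, 4, 5]
```

    Production AMR codes like Chombo apply the same flag-then-refine cycle recursively on block-structured patches in several dimensions, with interpolation at coarse-fine interfaces.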

  11. LHC computing grid

    International Nuclear Information System (INIS)

    Novaes, Sergio

    2011-01-01

    Full text: We give an overview of the grid computing initiatives in the Americas. High-Energy Physics has played a very important role in the development of grid computing in the world and in Latin America it has not been different. Lately, the grid concept has expanded its reach across all branches of e-Science, and we have witnessed the birth of the first nationwide infrastructures and its use in the private sector. (author)

  12. Challenges of Virtual and Open Distance Science Teacher Education in Zimbabwe

    Science.gov (United States)

    Mpofu, Vongai; Samukange, Tendai; Kusure, Lovemore M.; Zinyandu, Tinoidzwa M.; Denhere, Clever; Huggins, Nyakotyo; Wiseman, Chingombe; Ndlovu, Shakespear; Chiveya, Renias; Matavire, Monica; Mukavhi, Leckson; Gwizangwe, Isaac; Magombe, Elliot; Magomelo, Munyaradzi; Sithole, Fungai; Bindura University of Science Education (BUSE),

    2012-01-01

    This paper reports on a study of the implementation of science teacher education through virtual and open distance learning in the Mashonaland Central Province, Zimbabwe. The study provides insight into challenges faced by students and lecturers on inception of the program at four centres. Data was collected from completed evaluation survey forms…

  13. A Survey of Physical Sciences, Engineering and Mathematics Faculty Regarding Author Fees in Open Access Journals

    Science.gov (United States)

    Cusker, Jeremy; Rauh, Anne E.

    2014-01-01

    Discussions of the potential of open access publishing frequently must contend with the skepticism of research authors regarding the need to pay author fees (also known as publication fees). With that in mind, the authors undertook a survey of faculty, postdocs, and graduate students in physical science, mathematics, and engineering fields at two…

  14. Open Educational Resources in Support of Science Learning: Tools for Inquiry and Observation

    Science.gov (United States)

    Scanlon, Eileen

    2012-01-01

    This article focuses on the potential of free tools, particularly inquiry tools for influencing participation in twenty-first-century learning in science, as well as influencing the development of communities around tools. Two examples are presented: one on the development of an open source tool for structured inquiry learning that can bridge the…

  15. Unlocking the full potential of Open Innovation in the Life Sciences through a classification system

    DEFF Research Database (Denmark)

    Nilsson, Niclas; Minssen, Timo

    2018-01-01

    Open Innovation (OI) holds much promise as a new business model for collaborative value creation in life science. From a corporate perspective, benefits include faster access to new relevant technology; the opportunity for Biotechs and Small to Medium-sized Enterprises (SMEs) to explore new marke...

  16. A Bright Spark: Open Teaching of Science Using Faraday's Lectures on Candles

    Science.gov (United States)

    Walker, Mark; Groger, Martin; Schutler, Kirsten; Mosler, Bernd

    2008-01-01

    As well as being a founding father of modern chemistry and physics, Michael Faraday was a skilled lecturer, able to explain scientific principles and ideas simply and concisely to nonscientific audiences. However, science didactics today emphasizes the use of open and student-centered methods of teaching in which students find and develop…

  17. Open Access Research via Collaborative Educational Blogging: A Case Study from Library & Information Science

    Science.gov (United States)

    Rebmann, Kristen Radsliff; Clark, Camden Bernard

    2017-01-01

    This article charts the development of activities for online graduate students in library and information science. Project goals include helping students develop competencies in understanding open access publishing, synthesizing research in the field, and engaging in scholarly communication via collaborative educational blogging. Using a design…

  18. Impact of problem finding on the quality of authentic open inquiry science research projects

    Science.gov (United States)

    Labanca, Frank

    2008-11-01

    Problem finding is a creative process whereby individuals develop original ideas for study. Secondary science students who successfully participate in authentic, novel, open inquiry studies must engage in problem finding to determine viable and suitable topics. This study examined problem finding strategies employed by students who successfully completed and presented the results of their open inquiry research at the 2007 Connecticut Science Fair and the 2007 International Science and Engineering Fair. A multicase qualitative study was framed through the lenses of creativity, inquiry strategies, and situated cognition learning theory. Data were triangulated by methods (interviews, document analysis, surveys) and sources (students, teachers, mentors, fair directors, documents). The data demonstrated that the quality of student projects was directly impacted by the quality of their problem finding. Effective problem finding was a result of students using resources from previous, specialized experiences. They had a positive self-concept and a temperament for both the creative and logical perspectives of science research. Successful problem finding was derived from an idiosyncratic, nonlinear, and flexible use and understanding of inquiry. Finally, problem finding was influenced and assisted by the community of practicing scientists, with whom the students had an exceptional ability to communicate effectively. As a result, there appears to be a juxtaposition of creative and logical/analytical thought for open inquiry that may not be present in other forms of inquiry. Instructional strategies are suggested for teachers of science research students to improve the quality of problem finding for their students and their subsequent research projects.

  19. An open data repository and a data processing software toolset of an equivalent Nordic grid model matched to historical electricity market data.

    Science.gov (United States)

    Vanfretti, Luigi; Olsen, Svein H; Arava, V S Narasimham; Laera, Giuseppe; Bidadfar, Ali; Rabuzin, Tin; Jakobsen, Sigurd H; Lavenius, Jan; Baudette, Maxime; Gómez-López, Francisco J

    2017-04-01

    This article presents an open data repository, the methodology used to generate it, and the associated data processing software developed to consolidate an hourly snapshot historical data set for the year 2015 into an equivalent Nordic power grid model (a.k.a. Nordic 44). The consolidation was achieved by matching the model's physical response with respect to historical power flow records in the bidding regions of the Nordic grid that are available from the Nordic electricity market agent, Nord Pool. The model is made available in the form of CIM v14, Modelica and PSS/E (Siemens PTI) files. The Nordic 44 model in Modelica and PSS/E was first presented in the paper titled "iTesla Power Systems Library (iPSL): A Modelica library for phasor time-domain simulations" (Vanfretti et al., 2016) [1] for a single snapshot. In the digital repository made available with the submission of this paper (SmarTSLab_Nordic44 Repository at Github, 2016) [2], a total of 8760 snapshots (for the year 2015) are provided that can be used to initialize and execute dynamic simulations using tools compatible with CIM v14, the Modelica language and the proprietary PSS/E tool. The Python scripts used to generate the snapshots (processed data) are also available, together with all the data, in the GitHub repository (SmarTSLab_Nordic44 Repository at Github, 2016) [2]. This Nordic 44 equivalent model was also used in the iTesla project (iTesla) [3] to carry out simulations within a dynamic security assessment toolset (iTesla, 2016) [4], and has been further enhanced during the ITEA3 OpenCPS project (iTEA3) [5]. The raw data, processed data and output models utilized within the iTesla platform (iTesla, 2016) [4] are also available in the repository. The CIM and Modelica snapshots of the "Nordic 44" model for the year 2015 are available in a Zenodo repository.
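
    As a small illustration of the scale of such a snapshot set, the 8760 hourly timestamps covering 2015 can be enumerated as follows (a minimal sketch; the file-naming pattern is an assumption for illustration only, not the repository's actual scheme):

```python
from datetime import datetime, timedelta

# Enumerate the 8760 hourly snapshot timestamps covering 2015
# (2015 is not a leap year: 365 days x 24 hours = 8760 snapshots).
start = datetime(2015, 1, 1)
stamps = [start + timedelta(hours=h) for h in range(8760)]

# Hypothetical per-snapshot file names; the repository's real
# naming convention may differ.
names = [t.strftime("N44_%Y%m%d_%H.raw") for t in stamps]

print(names[0])   # N44_20150101_00.raw
print(names[-1])  # N44_20151231_23.raw
```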

  20. Documentation in Otolaryngology. Sharing Otolaryngology research data in an open science ecosystem

    Directory of Open Access Journals (Sweden)

    Fernanda PESET

    2018-01-01

    Full Text Available Introduction and objective: This text addresses the most significant aspects of sharing research data in otolaryngology in the context of open science as an ecosystem. Its aim is to offer a panoramic view that helps researchers manage their data as part of enriched science. Method: A review of the literature and of the authors' own experience in the field of research data was performed. Results: The basic pillars for success are presented: the necessary policies, technical means and capacities. Discussion: The task of making data available should be recognized as part of the researcher's curriculum, because documenting data so that they are reusable is a highly specialized and time-consuming task. Conclusions: We consider that we are at a crucial moment to begin to share data; it is being considered in all scientific policy scenarios, as in the EU through the European Open Science Cloud.

  1. Enabling Open Science for Health Research: Collaborative Informatics Environment for Learning on Health Outcomes (CIELO).

    Science.gov (United States)

    Payne, Philip; Lele, Omkar; Johnson, Beth; Holve, Erin

    2017-07-31

    There is an emergent and intensive dialogue in the United States with regard to the accessibility, reproducibility, and rigor of health research. This discussion is also closely aligned with the need to identify sustainable ways to expand the national research enterprise and to generate actionable results that can be applied to improve the nation's health. The principles and practices of Open Science offer a promising path to address both goals by facilitating (1) increased transparency of data and methods, which promotes research reproducibility and rigor; and (2) cumulative efficiencies wherein research tools and the output of research are combined to accelerate the delivery of new knowledge in proximal domains, thereby resulting in greater productivity and a reduction in redundant research investments. AcademyHealth's Electronic Data Methods (EDM) Forum implemented a proof-of-concept open science platform for health research called the Collaborative Informatics Environment for Learning on Health Outcomes (CIELO). The EDM Forum conducted a user-centered design process to elucidate important and high-level requirements for creating and sustaining an open science paradigm. By implementing CIELO and engaging a variety of potential users in its public beta testing, the EDM Forum has been able to elucidate a broad range of stakeholder needs and requirements related to the use of an open science platform focused on health research in a variety of "real world" settings. Our initial design and development experience over the course of the CIELO project has provided the basis for a vigorous dialogue between stakeholder community members regarding the capabilities that will add the greatest value to an open science platform for the health research community. A number of important questions around user incentives, sustainability, and scalability will require further community dialogue and agreement. ©Philip Payne, Omkar Lele, Beth Johnson, Erin Holve. 

  2. Free and Open Source Software for Geospatial in the field of planetary science

    Science.gov (United States)

    Frigeri, A.

    2012-12-01

    Information technology applied to geospatial analysis has spread quickly in the last ten years. The availability of OpenData and of data from collaborative mapping projects has increased interest in the tools, procedures and methods for handling spatially related information. Free and Open Source Software projects devoted to geospatial data handling are enjoying considerable success, as the use of interoperable formats and protocols allows users to choose the pipeline of tools and libraries needed to solve a particular task, adapting the software scene to their specific problem. In particular, the Free and Open Source model of development mimics the scientific method very well, and researchers should be naturally encouraged to take part in the development process of these software projects, as this represents a very agile way to interact among several institutions. When it comes to planetary sciences, geospatial Free and Open Source Software is gaining a key role in projects that commonly involve different subjects in an international scenario. Very popular software suites for processing scientific mission data (for example, ISIS) and for navigation/planning (SPICE) are distributed along with their source code, and the interaction between user and developer is often very close, creating a continuum between these two figures. A widely used library for handling geospatial data (GDAL) has started to support planetary data from the Planetary Data System, and recent contributions have enabled support for other popular data formats used in planetary science, such as VICAR. The use of Geographic Information Systems in planetary science is now widespread, and Free and Open Source GIS, open GIS formats and network protocols allow existing tools and methods developed to solve Earth-based problems to be extended to the study of solar system bodies. A day in the working life of a researcher using Free and Open Source Software for geospatial work will be presented, as well as benefits and

  3. Book Review: Opening Science, the Evolving Guide on How the Internet is Changing Research, Collaboration, and Scholarly Publishing

    Science.gov (United States)

    The way we get our funding, collaborate, do our research, and get the word out has evolved over hundreds of years but we can imagine a more open science world, largely facilitated by the internet. The movement towards this more open way of doing and presenting science is coming, ...

  4. Harnessing the Use of Open Learning Exchange to Support Basic Education in Science and Mathematics in the Philippines

    Science.gov (United States)

    Feliciano, Josephine S.; Mandapat, Louie Carl R.; Khan, Concepcion L.

    2013-01-01

    This paper presents the open learning initiatives of the Science Education Institute of the Department of Science and Technology to overcome certain barriers, such as enabling access, cost of replication, timely feedback, monitoring and continuous improvement of learning modules. Using an open-education model, like MIT's (Massachusetts Institute…

  5. Physical Science Informatics: Providing Open Science Access to Microheater Array Boiling Experiment Data

    Science.gov (United States)

    McQuillen, John; Green, Robert D.; Henrie, Ben; Miller, Teresa; Chiaramonte, Fran

    2014-01-01

    The Physical Science Informatics (PSI) system is the next step in an effort to make NASA-sponsored flight data available to the scientific and engineering community, along with the general public. The experimental data, drawn from six disciplines, including Combustion Science, Fluid Physics, Complex Fluids, Fundamental Physics, and Materials Science, will present some unique challenges. Besides data in textual or numerical format, large portions of both the raw and analyzed data for many of these experiments are digital images and video, which impose large storage requirements. In addition, the accessible data will include experiment design and engineering data (including applicable drawings), any analytical or numerical models, publications, reports, and patents, and any commercial products developed as a result of the research. The objectives of this paper are to present the preliminary layout (Figure 2) of MABE data within the PSI database, to obtain feedback on that layout, and to present the procedure for obtaining access to the database.

  6. Maintaining the momentum of Open Search in Earth Science Data discovery

    Science.gov (United States)

    Newman, D. J.; Lynnes, C.

    2013-12-01

    Federated Search for Earth Observation data has been a hallmark of EOSDIS (Earth Observing System Data and Information System) for two decades. Originally, the EOSDIS Version 0 system provided both data-collection-level and granule/file-level search in the mid 1990s with EOSDIS-specific socket protocols and message formats. Since that time, the advent of several standards has helped to simplify EOSDIS federated search, beginning with HTTP as the transfer protocol. Most recently, OpenSearch (www.opensearch.org) was employed for the EOS Clearinghouse (ECHO), based on a set of conventions that had been developed within the Earth Science Information Partners (ESIP) Federation. The ECHO OpenSearch API has evolved to encompass the ESIP RFC and the Open Geospatial Consortium (OGC) OpenSearch standard. Uptake of the ECHO OpenSearch API has been significant and has made ECHO accessible to client developers who found the previous ECHO SOAP API and current REST API too complex. Client adoption of the OpenSearch API appears to be largely driven by the simplicity of the OpenSearch convention. This simplicity is thus important to retain as the standard and convention evolve. For example, ECHO metrics indicate that the vast majority of ECHO users favor the following search criteria when using the REST API: spatial (bounding box, polygon, line and point), temporal (start and end time), and keyword (free text). Fewer than 10% of searches use additional constraints, particularly those requiring a controlled vocabulary, such as instrument, sensor, etc. This suggests that ongoing standardization efforts around OpenSearch usage for Earth Observation data may be more productive if oriented toward improving support for the spatial, temporal and keyword search aspects. Areas still requiring improvement include support for concrete requirements for keyword constraints, phrasal search for keyword constraints, temporal constraint relations, and terminological symmetry between search URLs.
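
    To make the dominant usage pattern concrete, a client query combining the three favored criteria might be assembled as below. This is a sketch only: the endpoint URL and parameter names are assumptions for illustration, not the actual ECHO OpenSearch description document, which should be consulted for the real contract.

```python
from urllib.parse import urlencode

# Assemble an OpenSearch-style granule query from the three criteria
# the ECHO metrics identify as dominant: spatial, temporal, keyword.
# Endpoint and parameter names are illustrative placeholders.
BASE = "https://example.gov/opensearch/granules"

params = {
    "boundingBox": "-180,-90,180,90",      # west,south,east,north
    "startTime": "2012-01-01T00:00:00Z",   # temporal range start
    "endTime": "2012-12-31T23:59:59Z",     # temporal range end
    "keyword": "sea surface temperature",  # free-text constraint
}
url = BASE + "?" + urlencode(params)
print(url)
```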

  7. Our path to better science in less time using open data science tools.

    Science.gov (United States)

    Lowndes, Julia S Stewart; Best, Benjamin D; Scarborough, Courtney; Afflerbach, Jamie C; Frazier, Melanie R; O'Hara, Casey C; Jiang, Ning; Halpern, Benjamin S

    2017-05-23

    Reproducibility has long been a tenet of science but has been challenging to achieve-we learned this the hard way when our old approaches proved inadequate to efficiently reproduce our own work. Here we describe how several free software tools have fundamentally upgraded our approach to collaborative research, making our entire workflow more transparent and streamlined. By describing specific tools and how we incrementally began using them for the Ocean Health Index project, we hope to encourage others in the scientific community to do the same-so we can all produce better science in less time.

  8. Open access behaviours and perceptions of health sciences faculty and roles of information professionals.

    Science.gov (United States)

    Lwoga, Edda T; Questier, Frederik

    2015-03-01

    This study sought to investigate the faculty's awareness, attitudes and use of open access, and the role of information professionals in supporting open access (OA) scholarly communication in Tanzanian health sciences universities. A cross-sectional survey was conducted. Semi-structured interviews were conducted with 16 librarians, while questionnaires were physically distributed to 415 faculty members in all eight Tanzanian health sciences universities, with a response rate of 71.1%. The study found that most faculty members were aware of OA issues. However, the high level of OA awareness among faculty members did not translate into actual dissemination of faculty's research outputs through OA web avenues. A small proportion of faculty's research materials was made available as OA. Faculty were more engaged with OA journal publishing than with self-archiving practices. Senior faculty with proficient technical skills were more likely to use open access than junior faculty. Major barriers to OA usage were related to ICT infrastructure, awareness, skills, the author-pay model, and copyright and plagiarism concerns. Interviews with librarians revealed strong support for promoting OA issues on campus; however, this positive support for various open access-related tasks did not translate into actual action. It is thus important for librarians and OA administrators to consider all these factors for effective implementation of OA projects in research and academic institutions. This is the first comprehensive and detailed study focusing on the health sciences faculty's and librarians' behaviours and perceptions of open access initiatives in Tanzania, and it reveals findings that are useful for planning and implementing open access initiatives in other institutions with similar conditions. © 2015 Health Libraries Journal.

  9. Global forces and local currents in Argentina's science policy crossroads: restricted access or open knowledge

    Directory of Open Access Journals (Sweden)

    Horacio Javier Etchichury

    2014-11-01

    Full Text Available The article describes the tensions between two competing approaches to scientific policy in Argentina. The traditional vision favors autonomous research. The neoliberal conception fosters the link between science and markets. In the past few years, a neodevelopmentalist current has also tried to stress the relevance of scientific research. Finally, the article describes how the Open Access movement has entered the debate. The World Bank's intervention and the human rights dimension of the question are discussed in depth. The article introduces the notion of open knowledge as a guiding criterion for designing a human-rights-based scientific policy.

  10. The Open-source Data Inventory for Anthropogenic CO2, version 2016 (ODIAC2016: a global monthly fossil fuel CO2 gridded emissions data product for tracer transport simulations and surface flux inversions

    Directory of Open Access Journals (Sweden)

    T. Oda

    2018-01-01

    Full Text Available The Open-source Data Inventory for Anthropogenic CO2 (ODIAC) is a global high-spatial-resolution gridded emissions data product that distributes carbon dioxide (CO2) emissions from fossil fuel combustion. The emissions spatial distributions are estimated at a 1 × 1 km spatial resolution over land using power plant profiles (emissions intensity and geographical location) and satellite-observed nighttime lights. This paper describes the year 2016 version of the ODIAC emissions data product (ODIAC2016) and presents analyses that help guide data users, especially for atmospheric CO2 tracer transport simulations and flux inversion analysis. Since the original publication in 2011, we have made modifications to our emissions modeling framework in order to deliver a comprehensive global gridded emissions data product. Major changes from the 2011 publication are (1) the use of emissions estimates made by the Carbon Dioxide Information Analysis Center (CDIAC) at the Oak Ridge National Laboratory (ORNL) by fuel type (solid, liquid, gas, cement manufacturing, gas flaring, and international aviation and marine bunkers); (2) the use of multiple spatial emissions proxies by fuel type such as (a) nighttime light data specific to gas flaring and (b) ship/aircraft fleet tracks; and (3) the inclusion of emissions temporal variations. Using global fuel consumption data, we extrapolated the CDIAC emissions estimates for the recent years and produced the ODIAC2016 emissions data product that covers 2000–2015. Our emissions data can be viewed as an extended version of the CDIAC gridded emissions data product, which should allow data users to impose global fossil fuel emissions in a more comprehensive manner than the original CDIAC product. Our new emissions modeling framework allows us to produce future versions of the ODIAC emissions data product with a timely update. Such capability has become more significant given the CDIAC/ORNL's shutdown. The ODIAC data

  11. The Open-source Data Inventory for Anthropogenic CO2, version 2016 (ODIAC2016): a global monthly fossil fuel CO2 gridded emissions data product for tracer transport simulations and surface flux inversions

    Science.gov (United States)

    Oda, Tomohiro; Maksyutov, Shamil; Andres, Robert J.

    2018-01-01

    The Open-source Data Inventory for Anthropogenic CO2 (ODIAC) is a global high-spatial-resolution gridded emissions data product that distributes carbon dioxide (CO2) emissions from fossil fuel combustion. The emissions spatial distributions are estimated at a 1 × 1 km spatial resolution over land using power plant profiles (emissions intensity and geographical location) and satellite-observed nighttime lights. This paper describes the year 2016 version of the ODIAC emissions data product (ODIAC2016) and presents analyses that help guide data users, especially for atmospheric CO2 tracer transport simulations and flux inversion analysis. Since the original publication in 2011, we have made modifications to our emissions modeling framework in order to deliver a comprehensive global gridded emissions data product. Major changes from the 2011 publication are (1) the use of emissions estimates made by the Carbon Dioxide Information Analysis Center (CDIAC) at the Oak Ridge National Laboratory (ORNL) by fuel type (solid, liquid, gas, cement manufacturing, gas flaring, and international aviation and marine bunkers); (2) the use of multiple spatial emissions proxies by fuel type such as (a) nighttime light data specific to gas flaring and (b) ship/aircraft fleet tracks; and (3) the inclusion of emissions temporal variations. Using global fuel consumption data, we extrapolated the CDIAC emissions estimates for the recent years and produced the ODIAC2016 emissions data product that covers 2000-2015. Our emissions data can be viewed as an extended version of CDIAC gridded emissions data product, which should allow data users to impose global fossil fuel emissions in a more comprehensive manner than the original CDIAC product. Our new emissions modeling framework allows us to produce future versions of the ODIAC emissions data product with a timely update. Such capability has become more significant given the CDIAC/ORNL's shutdown. 
The ODIAC data product could play an important
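
    The proxy-based downscaling idea described in this record can be sketched in a few lines. This is a toy illustration only: the function and weights below are hypothetical and not taken from the ODIAC code, which layers per-fuel proxies and temporal profiles on top of this basic step.

```python
def disaggregate(total, proxy_weights):
    """Spread a national emissions total over grid cells in
    proportion to a spatial proxy such as nighttime-light
    intensity. Hypothetical helper for illustration only."""
    s = sum(proxy_weights)
    if s == 0:
        raise ValueError("proxy weights sum to zero")
    return [total * w / s for w in proxy_weights]

# A 4-cell toy grid: brighter cells receive more of the total,
# and the cell values sum back to the national figure.
cells = disaggregate(100.0, [5.0, 3.0, 2.0, 0.0])
print(cells)  # [50.0, 30.0, 20.0, 0.0]
```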

  12. White paper on science and technology, 1997. Striving for an open research community

    International Nuclear Information System (INIS)

    1997-01-01

    This report concerns the policy measures intended to promote science and technology, pursuant to Article 8 of the Science and Technology Basic Law (Law No. 130), enacted in 1995. The report consists of three parts. Parts 1 and 2 discuss trends in a wide range of scientific and technological activities to help the reader understand the policy measures implemented to promote science and technology, which are then discussed in Part 3. Part 1, titled 'Striving for an open research community', attempts an analysis of reform and of current and future issues addressed in the Science and Technology Basic Plan, enacted in July 1996. Part 2 uses various data to compare scientific and technological activities in Japan with those in other selected countries. Part 3 relates to policies implemented for the promotion of science and technology by the Science and Technology Agency of the Japanese Government, describing science and technology policy development, the development of comprehensive and systematic policies, and the promotion of research activities. (G.K.)

  13. Smart grid

    International Nuclear Information System (INIS)

    Choi, Dong Bae

    2001-11-01

    This book describes the smart grid from the basics to recent trends. It is divided into ten chapters, which deal with: the smart grid as a green revolution in energy, with an introduction, history, fields, applications and required technologies; smart grid trends abroad, such as model businesses overseas and smart grid policy in the U.S.A.; domestic smart grid trends, with international smart grid standards, strategy and road map; the smart power grid as infrastructure for smart business, with EMS development, SAS, SCADA, DAS and PQMS; the smart grid for the smart consumer; smart renewables, such as the Desertec project; IT convergence with networks and PLC; applications for the electric car; smart electricity services for real-time electricity pricing; and the arrangement of the smart grid.

  14. Safe Grid

    Science.gov (United States)

    Chow, Edward T.; Stewart, Helen; Korsmeyer, David (Technical Monitor)

    2003-01-01

    The biggest users of GRID technologies come from the science and technology communities: government, industry and academia (national and international). The NASA GRID is moving to a higher technology readiness level (TRL) today, and as a joint effort among these leaders within government, academia, and industry, the NASA GRID plans to extend availability so that scientists and engineers across these geographical boundaries can collaborate to solve important problems facing the world in the 21st century. In order to enable NASA programs and missions to use IPG resources for program and mission design, the IPG capabilities need to be accessible from inside the NASA center networks. However, because different NASA centers maintain different security domains, GRID penetration across different firewalls is a concern for center security personnel. This is the reason why some IPG resources have been separated from the NASA center network. Also, because of center network security and ITAR concerns, the NASA IPG resource owner may not have full control over who can access resources remotely from outside the NASA center. In order to obtain organizational approval for secured remote access, the IPG infrastructure needs to be adapted to work with the NASA business process. Improvements need to be made before the IPG can be used for NASA program and mission development. The Secured Advanced Federated Environment (SAFE) technology is designed to provide federated security across NASA center and NASA partner security domains. Instead of one giant center firewall, which can be difficult to modify for different GRID applications, the SAFE "micro security domain" provides a large number of professionally managed "micro firewalls" that allow NASA centers to accept remote IPG access without the worry of damaging other center resources.
The SAFE policy-driven capability-based federated security mechanism can enable joint organizational and resource owner approved remote

  15. Unlocking the full potential of open innovation in the life sciences through a classification system.

    Science.gov (United States)

    Nilsson, Niclas; Minssen, Timo

    2018-04-01

    A common understanding of expectations and requirements is critical for boosting research-driven business opportunities in open innovation (OI) settings. Transparent communication requires common definitions and standards for OI to align the expectations of both parties. Here, we suggest a five-level classification system for OI models, reflecting the degree of openness. The aim of this classification system is to reduce contract negotiation complexity and times between two parties looking to engage in OI. Systematizing definitions and contractual terms for OI in the life sciences helps to reduce entry barriers and boosts collaborative value generation. By providing a contractual framework with predefined rules, science will be allowed to move more freely, thus maximizing the potential of OI. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  16. Challenges of Virtual and Open Distance Science Teacher Education in Zimbabwe

    Directory of Open Access Journals (Sweden)

    Vongai Mpofu

    2012-01-01

    Full Text Available This paper reports on a study of the implementation of science teacher education through virtual and open distance learning in the Mashonaland Central Province, Zimbabwe. The study provides insight into the challenges faced by students and lecturers at the inception of the programme at four centres. Data were collected from the completed evaluation survey forms of forty-two lecturers who were directly involved at the launch of the programme, and from in-depth interviews. Qualitative data analysis revealed that the programme faces potential threats from centre-, institution-, lecturer-, and student-related factors. These include limited resources, large classes, inadequate expertise in open and distance education, inappropriate science teacher education qualifications, implementers' conflicts of interest in programme participation, students' low self-esteem, lack of awareness of quality parameters of delivery systems among staff, and lack of standard criteria to measure the quality of services. The paper recommends that the issues raised be addressed in order to produce quality teachers.

  17. Communication technologies in smart grid

    Directory of Open Access Journals (Sweden)

    Miladinović Nikola

    2013-01-01

    Full Text Available The role of communication technologies in the Smart Grid lies in the integration of a large number of devices into one telecommunication system. This paper provides an overview of the technologies currently in use in the electric power grid, which are not necessarily in compliance with the Smart Grid concept. Considering that the Smart Grid is open to the flow of information in all directions, it is necessary to ensure the reliability, protection and security of that information.

  18. Improved Management of Water and Natural Resources Requires Open, Cognizant, Adaptive Science and Policy

    Science.gov (United States)

    Glynn, P. D.; Voinov, A. A.; Shapiro, C. D.; Jenni, K. E.

    2017-12-01

    Water issues impact the availability and use of other natural resources as well as environmental conditions. In an increasingly populated hyper-connected world, water issues are increasingly "wicked problems": complex problems with high uncertainties and no independent observers. Water is essential to life, and life affects water quality and availability. Scientists, managers, decision-makers, and the greater public all have a stake in improving the management of water resources. In turn, they are part of the systems that they are studying, deciding on, affecting, or trying to improve. Governance of water issues requires greater accessibility, traceability, and accountability (ATA) in science and policy. Water-related studies and decision-making need transdisciplinary science, inclusive participatory processes, and consideration and acceptance of multiple perspectives. Biases, Beliefs, Heuristics, and Values (BBHV) shape much of our perceptions and knowledge, and inevitably, affect both science and policy. Understanding the role of BBHV is critical to (1) understanding individual and group judgments and choices, (2) recognizing potential differences between societal "wants" and societal "needs", and (3) identifying "winners" and "losers" of policy decisions. Societal acceptance of proposed policies and actions can be fostered by enhancing participatory processes and by providing greater ATA in science, in policy, and in development of the laws, rules, and traditions that constrain decision-making. An adaptive science-infused governance framework is proposed that seeks greater cognizance of the role of BBHV in shaping science and policy choices and decisions, and that also seeks "Open Traceable Accountable Policy" to complement "Open Science". We discuss the limitations of the governance that we suggest, as well as tools and approaches to help implementation.

  19. Polymers – A New Open Access Scientific Journal on Polymer Science

    Directory of Open Access Journals (Sweden)

    Shu-Kun Lin

    2009-12-01

Full Text Available Polymers is a new interdisciplinary, Open Access scientific journal on polymer science, published by Molecular Diversity Preservation International (MDPI). This journal welcomes manuscript submissions on polymer chemistry, macromolecular chemistry, polymer physics, polymer characterization and all related topics. Both synthetic polymers and natural polymers, including biopolymers, are considered. Manuscripts will be thoroughly peer-reviewed in a timely fashion, and accepted papers will be published within 6 to 8 weeks after submission. [...

  20. Challenges of Virtual and Open Distance Science Teacher Education in Zimbabwe

    OpenAIRE

    Vongai Mpofu; Tendai Samukange; Lovemore M Kusure; Tinoidzwa M Zinyandu; Clever Denhere; Nyakotyo Huggins; Chingombe Wiseman; Shakespear Ndlovu; Rennias Chiveya; Monica Matavire; Leckson Mukavhi; Isaac Gwizangwe; Elliot Magombe; Munyaradzi Magomelo; Fungai Sithole

    2012-01-01

This paper reports on a study of the implementation of science teacher education through virtual and open distance learning in Mashonaland Central Province, Zimbabwe. The study provides insight into the challenges faced by students and lecturers at the inception of the program at four centres. Data were collected from the completed evaluation survey forms of forty-two lecturers who were directly involved at the launch of the program, and from in-depth interviews. Qualitative data analysis revealed that the ...

  1. The ICTJA-CSIC Science Week 2016: an open door to Earth Sciences for secondary education students

    Science.gov (United States)

    Cortes-Picas, Jordi; Diaz, Jordi; Fernandez-Turiel, Jose-Luis; Garcia-Castellanos, Daniel; Geyer, Adelina; Jurado, Maria-Jose; Montoya, Encarni; Rejas Alejos, Marta; Sánchez-Pastor, Pilar; Valverde-Perez, Angel

    2017-04-01

The Science Week is one of the main annual scientific outreach events in Spain. The Institute of Earth Sciences Jaume Almera of CSIC (ICTJA-CSIC) has participated in it for many years, opening its doors and offering activities that showcase the multidisciplinary research carried out at the Institute and in the Geosciences. The activities, run as workshops, are designed and conducted by scientific and technical personnel of the centre, who take part in the Science Week voluntarily. The activities proposed by the ICTJA-CSIC staff are designed for a target audience of secondary school students (12-18 years). The ICTJA-CSIC joined Science Week 2016 with the activity entitled "What do we investigate in Earth Sciences?". The aim is to show society what is being investigated at the ICTJA-CSIC. In addition, through contact and interaction between the public and the Institute's researchers, it is intended to increase interest in scientific activity and, if possible, to spark new vocations in the Earth Sciences among secondary school pupils. We present in this communication the experience of Science Week 2016 at the ICTJA-CSIC, carried out thanks to the effort and commitment of the Institute's personnel to the outreach of Earth Sciences research. Between November 14th and 19th 2016, more than 100 students from four secondary schools in the Barcelona area visited the Institute and took part in the Science Week. A total of six interactive workshops were prepared, covering different aspects of seismology, geophysical borehole logging, analog and digital modelling, paleoecology, volcanology and geochemistry. As a novelty, this year a new workshop based on an augmented-reality sandbox was offered to show and simulate the processes that create and shape topographic relief. In addition, within the workshop dedicated to geophysical borehole logging, six exact replicates of

  2. Supporting open collaboration in science through explicit and linked semantic description of processes

    Science.gov (United States)

    Gil, Yolanda; Michel, Felix; Ratnakar, Varun; Read, Jordan S.; Hauder, Matheus; Duffy, Christopher; Hanson, Paul C.; Dugan, Hilary

    2015-01-01

The Web was originally developed to support collaboration in science. Although scientists benefit from many forms of collaboration on the Web (e.g., blogs, wikis, forums, code sharing, etc.), most collaborative projects are coordinated over email, phone calls, and in-person meetings. Our goal is to develop a collaborative infrastructure for scientists to work on complex science questions that require multi-disciplinary contributions to gather and analyze data, that cannot proceed without significant coordination to synthesize findings, and that grow organically to accommodate new contributors as the work evolves over time. Our approach is to develop an organic data science framework that is based on a task-centered organization of the collaboration, incorporates principles from the social sciences for successful on-line communities, and exposes an open science process. Our approach is implemented as an extension of a semantic wiki platform, and captures formal representations of task decomposition structures, relations between tasks and users, and other properties of tasks, data, and other relevant science objects. All these entities are captured through the semantic wiki user interface, represented as semantic web objects, and exported as linked data.
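
The task-centered structures described above are, at bottom, graphs that can be serialized as subject-predicate-object triples. The sketch below is a hypothetical illustration of that export step; the namespace, predicate names, and task fields are invented for illustration and are not the actual Organic Data Science vocabulary.

```python
# Hypothetical sketch: flattening a task-decomposition structure into
# (subject, predicate, object) triples, in the spirit of the linked-data
# export described above. All names here are illustrative only.

def task_triples(task, ns="http://example.org/ods#"):
    """Flatten a nested task dict into (subject, predicate, object) triples."""
    triples = []
    subject = ns + task["id"]
    triples.append((subject, ns + "label", task["label"]))
    for owner in task.get("owners", []):
        triples.append((subject, ns + "ownedBy", ns + owner))
    for sub in task.get("subtasks", []):
        triples.append((subject, ns + "decomposesInto", ns + sub["id"]))
        triples.extend(task_triples(sub, ns))  # recurse into the subtask
    return triples

question = {
    "id": "carbon-flux",
    "label": "Estimate lake carbon flux",
    "owners": ["alice"],
    "subtasks": [
        {"id": "gather-data", "label": "Gather sensor data", "owners": ["bob"]},
        {"id": "fit-model", "label": "Fit flux model", "owners": []},
    ],
}

for s, p, o in task_triples(question):
    print(s, p, o)
```

Once tasks are plain triples, they can be served as linked data and queried or merged like any other semantic web objects.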

  3. The Role of Semantics in Open-World, Integrative, Collaborative Science Data Platforms

    Science.gov (United States)

    Fox, Peter; Chen, Yanning; Wang, Han; West, Patrick; Erickson, John; Ma, Marshall

    2014-05-01

As collaborative science spreads into more and more Earth and space science fields, both participants and funders are expressing stronger needs for highly functional data and information capabilities. Desired characteristics include: a) easy to use, b) highly integrated, c) leverage existing investments, d) accommodate rapid technical change, and e) do not incur undue expense or time to build or maintain. This is not a small set of requirements. Based on our accumulated experience over roughly the last decade and on several key technical approaches, we adapt, extend, and integrate several open source applications and frameworks to provide major portions of the functionality of these platforms. This includes an object-type repository, collaboration tools, and identity management, all within a portal managing diverse content and applications. In this contribution, we present our methods and results for the information models, adaptation, integration and evolution of a networked data science architecture based on several open source technologies: Drupal, VIVO, the Comprehensive Knowledge Archive Network (CKAN), and the Global Handle System (GHS). In particular we present the Deep Carbon Observatory, a platform for international science collaboration. We present and discuss key functional and non-functional attributes, and discuss the general applicability of the platform.

  4. `Opening up' a science task: an exploration of shifting embodied participation of a multilingual primary student

    Science.gov (United States)

    Gómez Fernández, Roberto; Siry, Christina

    2018-05-01

Culturally and linguistically diverse (CLD) students have home languages and cultures that differ from those of many of their peers. In our context, these students suffer from higher school drop-out rates and lag far behind their peers in science. This study investigates the interactions of a nine-year-old child whose home language is Portuguese and who, in this specific case, learns science in a diglossic environment in the Luxembourgish school system, in which his teacher used German for written tasks and Luxembourgish for oral communication. We examine, moment by moment, the interactions around a task on environmental protection. The role of this Lusoburguês (combining Luxembourgish and Portuguese identities and nationalities) student, his embodiment and his participation change when his group is confronted with an activity that requires an increased amount of manipulation. His identity evolves in interaction, as he becomes the leader of his group and, through a playful stance, manages to open up the task so that his peers can explore it further. Implications include the value of including more open-ended investigations in the teaching and learning of science, as well as implications for further study of practice-based approaches in science classrooms with CLD students, particularly in increasingly multilingual/multicultural and/or diglossic or heteroglossic school contexts.

  5. Study of the adaptive refinement on an open source 2D shallow-water flow solver using quadtree grid for flash flood simulations.

    Science.gov (United States)

    Kirstetter, G.; Popinet, S.; Fullana, J. M.; Lagrée, P. Y.; Josserand, C.

    2015-12-01

The full resolution of the shallow-water equations for modeling flash floods can have a high computational cost, so most flood-simulation software used for forecasting relies on simplifications of this model: 1D approximations, diffusive or kinematic wave approximations, or exotic models with non-physical free parameters. These approximations save substantial computational time, but sacrifice precision in unquantified ways. To drastically reduce the cost of full 2D simulations while quantifying the loss of precision, we propose a 2D shallow-water flow solver built with the open source code Basilisk [1], which uses adaptive refinement on a quadtree grid. This solver uses a well-balanced central-upwind scheme that is second-order accurate in time and space, and treats the friction and rain terms implicitly in a finite-volume approach. We demonstrate the validity of our simulation on the flood of Tewkesbury (UK) that occurred in July 2007, as shown in Fig. 1. For this case, a systematic study of the impact of the chosen criterion for adaptive refinement is performed, and the criterion with the best computational-time/precision ratio is proposed. Finally, we present the power law relating computational time to maximum resolution, and show that for our 2D simulation this law is close to that of a 1D simulation, thanks to the fractal dimension of the topography. [1] http://basilisk.fr/
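
The adaptive-refinement idea the abstract relies on (refine quadtree cells only where the solution varies sharply) can be illustrated with a minimal sketch. This is not Basilisk code: the refinement criterion, the corner-sampled depth field, and the crude interpolation below are hypothetical stand-ins for illustration only.

```python
# Toy quadtree refinement: split a cell whenever the water-depth variation
# across its corners exceeds a threshold, down to a maximum level.

class Cell:
    def __init__(self, x, y, size, depth_vals):
        self.x, self.y, self.size = x, y, size
        self.depth_vals = depth_vals        # water depth at the 4 corners
        self.children = []

    def refine_if_needed(self, threshold, max_level, level=0):
        # Refine when corner-to-corner depth variation exceeds the threshold.
        variation = max(self.depth_vals) - min(self.depth_vals)
        if variation > threshold and level < max_level:
            half = self.size / 2
            mid = sum(self.depth_vals) / 4  # crude midpoint interpolation
            for dx, dy, corner in [(0, 0, 0), (half, 0, 1),
                                   (0, half, 2), (half, half, 3)]:
                vals = [mid] * 4
                vals[corner] = self.depth_vals[corner]  # keep the parent corner
                child = Cell(self.x + dx, self.y + dy, half, vals)
                child.refine_if_needed(threshold, max_level, level + 1)
                self.children.append(child)

    def count_leaves(self):
        if not self.children:
            return 1
        return sum(c.count_leaves() for c in self.children)

# A steep depth jump at one corner triggers refinement clustered near it.
coarse = Cell(0.0, 0.0, 1.0, [0.0, 2.0, 0.0, 0.0])
coarse.refine_if_needed(threshold=0.5, max_level=3)
print(coarse.count_leaves())
```

Flat regions stay at one cell while the jump is resolved recursively, which is exactly the trade-off the abstract quantifies: fine resolution only where the flood front demands it.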

  6. CERN readies world's biggest science grid The computing network now encompasses more than 100 sites in 31 countries

    CERN Multimedia

    Niccolai, James

    2005-01-01

If the Large Hadron Collider (LHC) at CERN is to yield miraculous discoveries in particle physics, it may also require a small miracle in grid computing. Owing to a lack of suitable tools from commercial vendors, engineers at the famed Geneva laboratory are hard at work building a giant grid to store and process the vast amount of data the collider is expected to produce when it begins operations in mid-2007 (2 pages)

  7. Open Science Meets Stem Cells: A New Drug Discovery Approach for Neurodegenerative Disorders.

    Science.gov (United States)

    Han, Chanshuai; Chaineau, Mathilde; Chen, Carol X-Q; Beitel, Lenore K; Durcan, Thomas M

    2018-01-01

    Neurodegenerative diseases are a challenge for drug discovery, as the biological mechanisms are complex and poorly understood, with a paucity of models that faithfully recapitulate these disorders. Recent advances in stem cell technology have provided a paradigm shift, providing researchers with tools to generate human induced pluripotent stem cells (iPSCs) from patient cells. With the potential to generate any human cell type, we can now generate human neurons and develop "first-of-their-kind" disease-relevant assays for small molecule screening. Now that the tools are in place, it is imperative that we accelerate discoveries from the bench to the clinic. Using traditional closed-door research systems raises barriers to discovery, by restricting access to cells, data and other research findings. Thus, a new strategy is required, and the Montreal Neurological Institute (MNI) and its partners are piloting an "Open Science" model. One signature initiative will be that the MNI biorepository will curate and disseminate patient samples in a more accessible manner through open transfer agreements. This feeds into the MNI open drug discovery platform, focused on developing industry-standard assays with iPSC-derived neurons. All cell lines, reagents and assay findings developed in this open fashion will be made available to academia and industry. By removing the obstacles many universities and companies face in distributing patient samples and assay results, our goal is to accelerate translational medical research and the development of new therapies for devastating neurodegenerative disorders.

  8. Open science resources for the discovery and analysis of Tara Oceans data.

    Science.gov (United States)

    Pesant, Stéphane; Not, Fabrice; Picheral, Marc; Kandels-Lewis, Stefanie; Le Bescot, Noan; Gorsky, Gabriel; Iudicone, Daniele; Karsenti, Eric; Speich, Sabrina; Troublé, Romain; Dimier, Céline; Searson, Sarah

    2015-01-01

The Tara Oceans expedition (2009-2013) sampled contrasting ecosystems of the world oceans, collecting environmental data and plankton, from viruses to metazoans, for later analysis using modern sequencing and state-of-the-art imaging technologies. It surveyed 210 ecosystems in 20 biogeographic provinces, collecting over 35,000 samples of seawater and plankton. The interpretation of such an extensive collection of samples in their ecological context requires means to explore, assess and access raw and validated data sets. To address this challenge, the Tara Oceans Consortium offers open science resources, including the use of open access archives for nucleotides (ENA) and for environmental, biogeochemical, taxonomic and morphological data (PANGAEA), and the development of on-line discovery tools and collaborative annotation tools for sequences and images. Here, we present an overview of Tara Oceans Data, and we provide detailed registries (data sets) of all campaigns (from port to port), stations and sampling events.

  9. Open data used in water sciences - Review of access, licenses and understandability

    Science.gov (United States)

    Falkenroth, Esa; Lagerbäck Adolphi, Emma; Arheimer, Berit

    2016-04-01

The amount of open data available for hydrology research is continually growing. In the EU-funded project SWITCH-ON (Sharing Water-related Information to Tackle Changes in the Hydrosphere - for Operational Needs: www.water-switch-on.eu), we are addressing water concerns by exploring and exploiting the untapped potential of these new open data. This work is enabled by many ongoing efforts to facilitate the use of open data. For instance, a number of portals provide the means to search for open data sets and open spatial data services (such as the GEOSS Portal, the INSPIRE community geoportal, and various Climate Services and public portals). In general, however, many research groups in the water sciences still hesitate to use these open data, so we examined some of the limiting factors. Factors that limit the usability of a dataset include: (1) accessibility, (2) understandability and (3) licences. In the SWITCH-ON project we have developed a search tool for finding and accessing data relevant to water science in Europe, as existing tools do not address the data needs of the water sciences specifically. The tool is filled with some 9000 sets of metadata, each linked to water-related keywords. The keywords are based on those developed within the CUAHSI community in the USA, extended with non-hydrosphere topics and additional subclasses, and restricted to keywords for which data actually exist. Access to data sets: 78% of the data is directly accessible, while the rest is available either after registration and request, or through a web client for visualisation but without direct download. However, several data sets were found to be inaccessible due to server downtime, incorrect links or problems with the host database management system. One possible explanation is that many datasets were assembled by research projects that are no longer funded; their server infrastructure would therefore be less well maintained than large-scale operational services
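
An accessibility audit of the kind reported above (directly accessible, available after registration, or inaccessible) boils down to classifying each catalogued link by the response its host returns. The sketch below is illustrative: the categories mirror the text, but the classification logic, dataset names, and thresholds are assumptions, not SWITCH-ON's actual implementation.

```python
# Hypothetical sketch: map an HTTP status (plus catalogue metadata) to one
# of the access categories discussed above. A status of None stands for a
# dead host or broken link recorded during the crawl.

def classify(status_code, requires_registration=False):
    """Classify a catalogued dataset link into an access category."""
    if requires_registration:
        return "available after registration"
    if status_code is None:
        return "inaccessible (server down or broken link)"
    if 200 <= status_code < 300:
        return "directly accessible"
    if status_code in (401, 403):
        return "available after registration"
    return "inaccessible (server down or broken link)"

# Invented example catalogue: (name, last observed HTTP status, registration flag)
catalogue = [
    ("discharge-series", 200, False),
    ("soil-moisture-grid", 403, False),
    ("lake-temp-archive", None, False),   # host no longer maintained
]

summary = {}
for name, status, reg in catalogue:
    category = classify(status, reg)
    summary[category] = summary.get(category, 0) + 1
print(summary)
```

Run periodically over a metadata catalogue, such a tally is enough to track the "78% directly accessible" style of statistic over time and to flag datasets whose hosting has silently decayed.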

  10. Pika: A snow science simulation tool built using the open-source framework MOOSE

    Science.gov (United States)

    Slaughter, A.; Johnson, M.

    2017-12-01

The Department of Energy (DOE) is currently investing millions of dollars annually in various modeling and simulation tools for all aspects of nuclear energy. An important part of this effort is developing applications based on the open-source Multiphysics Object Oriented Simulation Environment (MOOSE; mooseframework.org) from Idaho National Laboratory (INL). Thanks to the efforts of the DOE and outside collaborators, MOOSE currently contains a large set of physics modules, including phase-field, level set, heat conduction, tensor mechanics, Navier-Stokes, fracture and crack propagation (via the extended finite-element method), flow in porous media, and others. The heat conduction, tensor mechanics, and phase-field modules, in particular, are well suited to snow science problems. Pika, an open-source MOOSE-based application, is capable of simulating both 3D coupled nonlinear continuum heat-transfer and large-deformation mechanics applications (such as settlement) and phase-field based micro-structure applications. Additionally, these types of problems may be coupled tightly in a single solve, or across length and time scales using a loosely coupled Picard iteration approach. Beyond this wide range of physics capabilities, MOOSE-based applications also inherit an extensible testing framework, a graphical user interface, and a documentation system; tools that allow MOOSE and other applications to adhere to nuclear software quality standards. The snow science community can learn from the nuclear industry and harness this existing effort to build simulation tools that are open, modular, and share a common framework. In particular, MOOSE-based multiphysics solvers are inherently parallel, dimension agnostic, adaptive in time and space, fully coupled, and capable of interacting with other applications. The snow science community should build on existing tools to enable collaboration between researchers and practitioners throughout the world, and advance the
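
The loosely coupled Picard approach mentioned above can be shown with two toy single-variable "solvers" iterated to a joint fixed point: each physics is solved in turn against the other's latest field until the combined update stalls. The linear stand-ins for heat transfer and mechanics below are invented for illustration; MOOSE's actual coupling machinery is far more general.

```python
# Minimal Picard (fixed-point) coupling between two toy "physics" solves.

def solve_heat(displacement):
    # Stand-in: temperature responds weakly to the current displacement.
    return 1.0 + 0.1 * displacement

def solve_mechanics(temperature):
    # Stand-in: displacement driven by thermal expansion.
    return 0.5 * temperature

def picard(tol=1e-10, max_iters=100):
    """Alternate the two solves until the combined update is below tol."""
    temperature, displacement = 0.0, 0.0
    for iteration in range(max_iters):
        new_temperature = solve_heat(displacement)
        new_displacement = solve_mechanics(new_temperature)
        change = (abs(new_temperature - temperature)
                  + abs(new_displacement - displacement))
        temperature, displacement = new_temperature, new_displacement
        if change < tol:
            return temperature, displacement, iteration + 1
    raise RuntimeError("Picard iteration did not converge")

print(picard())
```

Because each sub-solve here is a mild contraction, the loop converges in a handful of iterations; in practice convergence depends on how strongly the physics are coupled, which is why a tightly coupled single solve is sometimes preferred.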

  11. Geosciences: An Open Access Journal on Earth and Planetary Sciences and Their Interdisciplinary Approaches

    Directory of Open Access Journals (Sweden)

    Jesus Martinez-Frias

    2011-05-01

Full Text Available On behalf of the Editorial Board and the editorial management staff of MDPI, it is my great pleasure to introduce this new journal Geosciences. Geosciences is an international, peer-reviewed open access journal, which publishes original papers, rapid communications, technical notes and review articles, and discussions about all interdisciplinary aspects of the earth and planetary sciences. Geosciences may also include papers presented at scientific conferences (proceedings) or articles on a well-defined topic assembled by individual editors or organizations/institutions (special publications).

  12. The integration of open access journals in the scholarly communication system: Three science fields

    DEFF Research Database (Denmark)

    Faber Frandsen, Tove

    2009-01-01

The greatest number of open access journals (OAJs) is found in the sciences, and their influence is growing. However, there are only a few studies on the acceptance, and thereby integration, of these OAJs in the scholarly communication system. Even fewer studies provide insight into the differences across disciplines. This study is an analysis of the citing behaviour in journals within three science fields: biology, mathematics, and pharmacy and pharmacology. It is a statistical analysis of OAJs as well as non-OAJs, including both the citing and cited side of the journal-to-journal citations. The multivariate linear regression reveals many similarities in citing behaviour across fields and media, but it also points to great differences in the integration of OAJs. The integration of OAJs in the scholarly communication system varies considerably across fields. The implications for bibliometric research...

  13. A Template for Open Inquiry: Using Questions to Encourage and Support Inquiry in Earth and Space Science

    Science.gov (United States)

    Hermann, Ronald S.; Miranda, Rommel J.

    2010-01-01

    This article provides an instructional approach to helping students generate open-inquiry research questions, which the authors call the "open-inquiry question template." This template was created based on their experience teaching high school science and preservice university methods courses. To help teachers implement this template, they…

  14. The pilot way to Grid resources using glideinWMS

    CERN Document Server

Sfiligoi, Igor; Holzman, Burt; Mhashilkar, Parag; Padhi, Sanjay; Wurthwein, Frank

Grid computing has become very popular in big and widespread scientific communities with high computing demands, like high energy physics. Computing resources are being distributed over many independent sites with only a thin layer of grid middleware shared between them. This deployment model has proven very convenient for computing resource providers, but has introduced several problems for the users of the system, the three major ones being the complexity of job scheduling, the non-uniformity of compute resources, and the lack of good job monitoring. Pilot jobs address all of the above problems by creating a virtual private computing pool on top of grid resources. This paper presents both the general pilot concept and a concrete implementation, called glideinWMS, deployed in the Open Science Grid.
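
The pilot pattern described above can be sketched in a few lines: a pilot acquires a worker slot, validates the local environment (masking the non-uniformity of grid sites), then pulls real user jobs from a central queue until no work remains. This is a toy illustration of the concept only; glideinWMS's actual protocol (HTCondor glideins) is far richer, and all names below are hypothetical.

```python
# Toy sketch of the pilot-job idea behind systems like glideinWMS.
import queue

def pilot(job_queue, validate_environment):
    """Run queued jobs on one acquired worker slot; return their results."""
    if not validate_environment():
        return []          # a bad slot runs nothing, so users never see it
    results = []
    while True:
        try:
            job = job_queue.get_nowait()   # late binding: pick work only now
        except queue.Empty:
            break          # no more work: the pilot exits, freeing the slot
        results.append((job["name"], job["run"]()))
    return results

# Invented example workload: three small user jobs in a central queue.
jobs = queue.Queue()
for n in (1, 2, 3):
    jobs.put({"name": f"job-{n}", "run": lambda n=n: n * n})

print(pilot(jobs, validate_environment=lambda: True))
```

The key design point is late binding: jobs are matched to a slot only after the pilot has proven the slot usable, which is how pilots hide scheduling complexity and heterogeneous resources from the user.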

  15. Enabling a new Paradigm to Address Big Data and Open Science Challenges

    Science.gov (United States)

    Ramamurthy, Mohan; Fisher, Ward

    2017-04-01

Data are not only the lifeblood of the geosciences but have become the currency of the modern world in science and society. Rapid advances in computing, communications, and observational technologies, along with concomitant advances in high-resolution modeling and in ensemble and coupled-systems predictions of the Earth system, are revolutionizing nearly every aspect of our field. Modern data volumes from high-resolution ensemble prediction/projection/simulation systems and next-generation remote-sensing systems, such as hyper-spectral satellite sensors and phased-array radars, are staggering. For example, CMIP efforts alone will generate many petabytes of climate projection data for use in assessments of climate change, and NOAA's National Climatic Data Center projects that it will archive over 350 petabytes by 2030. For researchers and educators, this deluge and the increasing complexity of data bring challenges along with opportunities for discovery and scientific breakthroughs. The potential for big data to transform the geosciences is enormous, but realizing the next frontier depends on effectively managing, analyzing, and exploiting these heterogeneous data sources, extracting knowledge and useful information in ways that were previously impossible, to enable discoveries and gain new insights. At the same time, there is a growing focus on "Reproducibility or Replicability in Science" that has implications for Open Science. The advent of cloud computing has opened new avenues for addressing both big data and Open Science challenges, and for accelerating scientific discoveries. However, to successfully leverage the enormous potential of cloud technologies, data providers and the scientific communities will need to develop new paradigms that enable next-generation workflows and transform the conduct of science. Making data readily available is a necessary but not a sufficient condition. Data providers

  16. Afraid of Scooping – Case Study on Researcher Strategies against Fear of Scooping in the Context of Open Science

    Directory of Open Access Journals (Sweden)

    Heidi Laine

    2017-06-01

Full Text Available The risk of scooping is often used as an argument against open science, especially open data. In this case study I examine openness strategies, practices and attitudes in two open-collaboration research projects created by Finnish researchers, in order to understand what made them resistant to the fear of scooping. The radically open approach of the projects includes open-by-default funding proposals, co-authorship and community membership. The primary sources used are interviews with the projects' founding members. The analysis indicates that openness requires trust in close peers, but not necessarily in the research community or society at large. Based on the case-study evidence, focusing on intrinsic goals, such as new knowledge and bringing about ethical reform, rather than external goals such as publications, supports openness. Understanding the fundamentals of science, the philosophy of science and research ethics can also have a beneficial effect on willingness to share. Whether there are aspects of open sharing that make it seem riskier from the point of view of certain demographic groups within the research community, such as women, could be worth closer inspection.

  17. The Grid

    CERN Document Server

    Klotz, Wolf-Dieter

    2005-01-01

Grid technology is emerging widely. Grid computing, most simply stated, is distributed computing taken to the next evolutionary level. The goal is to create the illusion of a simple, robust, yet large and powerful self-managing virtual computer out of a large collection of connected heterogeneous systems sharing various combinations of resources. This talk will give a short history of how, out of lessons learned from the Internet, the vision of Grids was born. The extensible anatomy of a Grid architecture will then be discussed. The talk will end by presenting a selection of major Grid projects in Europe and the US and, if time permits, a short on-line demonstration.

  18. Data Science: History repeated? - The heritage of the Free and Open Source GIS community

    Science.gov (United States)

    Löwe, Peter; Neteler, Markus

    2014-05-01

Data Science is described as the process of knowledge extraction from large data sets by means of scientific methods. The discipline draws heavily on techniques and theories from many fields, which are jointly used to further develop information retrieval on structured or unstructured very large datasets. While the term Data Science was coined as early as 1960, the current perception places the field still in the first section of the hype cycle according to Gartner, well en route from the technology-trigger stage to the peak of inflated expectations. In our view, the future development of Data Science could benefit from the analysis of experiences from related evolutionary processes. One predecessor is the area of Geographic Information Systems (GIS). The intrinsic scope of GIS is the integration and storage of spatial information from often heterogeneous sources, data analysis, and the sharing of reconstructed or aggregated results in visual form or via data transfer. GIS is successfully applied to process and analyse spatially referenced content in a wide and still expanding range of science areas, spanning from human and social sciences like archaeology, politics and architecture to environmental and geoscientific applications, even including planetology. This paper presents proven patterns for innovation and organisation derived from the evolution of GIS, which can be ported to Data Science. Within the GIS landscape, three strategic interacting tiers can be denoted: i) standardisation, ii) applications based on closed-source software, without the option of access to and analysis of the implemented algorithms, and iii) Free and Open Source Software (FOSS), based on freely accessible program code enabling analysis, education and improvement by everyone. This paper focuses on patterns gained from the synthesis of three decades of FOSS development. We identified best practices which evolved from long-term FOSS projects, describe the role of community

  19. Citizen Science and Open Data: a model for Invasive Alien Species in Europe

    Directory of Open Access Journals (Sweden)

    Ana Cristina Cardoso

    2017-07-01

Full Text Available Invasive Alien Species (IAS) are a growing threat to Europe's biodiversity. The implementation of the European Union Regulation on IAS can benefit from the involvement of the public in IAS recording and management through Citizen Science (CS) initiatives. To tackle issues related to the use of CS projects on IAS topics, a dedicated workshop titled "Citizen Science and Open Data: a model for Invasive Alien Species in Europe" was organized by the Joint Research Centre (JRC) and the European Cooperation in Science and Technology (COST) Association. Fifty key stakeholders from all over Europe, including two Members of the European Parliament, attended the workshop. With a clear focus on IAS, the workshop addressed the following issues: (a) CS and policy, (b) citizen engagement, and (c) CS data management. Nine short presentations provided input on CS and IAS issues. Participants discussed specific topics in several round tables ("world café" style) and reported their conclusions back to the audience and to moderated discussions of the full assembly. Overall, the workshop enabled the sharing of ideas, approaches and best practices regarding CS and IAS. Specific opportunities and pitfalls of using CS data throughout the policy cycle for IAS were recognized. Concerning the implementation of the IAS Regulation, CS data could complement official surveillance systems and contribute to the early warning of IAS of Union concern, after appropriate validation by the Member States' competent authorities. CS projects can additionally increase awareness and empower citizens. Attendees pointed out the importance of further public engagement in CS projects on IAS that demonstrate specific initiatives and approaches and analyze lessons learned from past experiences. In addition, the workshop noted that the data gathered from different CS projects on IAS are fragmented, and highlighted the need for an open and accessible platform to upload data originating

  20. GSNL 2.0: leveraging on Open Science to promote science-based decision making in Disaster Risk Reduction

    Science.gov (United States)

    Salvi, Stefano; Rubbia, Giuliana; Abruzzese, Luigi

    2017-04-01

In 2010 the GEO Geohazard Supersites and Natural Laboratories initiative (GSNL) launched the concept of a global partnership among the geophysical scientific community and the satellite and in situ data providers, aiming to promote scientific advances in the knowledge of seismic and volcanic phenomena. The initial goal was successfully achieved, and many more new scientific results were obtained than would have been possible had the Supersites not existed (http://www.earthobservations.org/gsnl.php). At the same time, the Supersites have demonstrated that they can effectively support the rapid transfer of useful scientific information to risk managers, exploiting the existing institutional relationships between the Supersite coordinators and the local decision makers. However, a more demanding call for action is given by the Sendai Framework 2015-2030 (the outcome of the 2015 UN World Conference on Disaster Risk Reduction), where for the first time knowledge of the risk components and a science-based decision-making process are defined as top priorities for effective DRR. There are evident possible synergies between the Sendai framework, GEO, CEOS (the Committee on Earth Observation Satellites), and GSNL, but for maximum benefit and effectiveness the latter needs to progress at a faster pace towards a full implementation of the Open Science approach to geohazard science. In this global framework the Supersites can serve as local test beds in which to experiment with coordination, collaboration and communication approaches, and with technological solutions tailored to the local situation, to ensure that the scientific community can contribute the information needed for the best possible decision making. This vision and the new developments of GSNL 2.0 have been approved by the GEO Program Board, and a clear roadmap has been set for the period 2017-2019. We will present the approach and the implementation plan at the conference.

  1. Securing the smart grid information exchange

    Energy Technology Data Exchange (ETDEWEB)

    Fries, Steffen; Falk, Rainer [Siemens AG, Corporate Technology, Muenchen (Germany)

    2012-07-01

    The smart grid is based on information exchange between various stakeholders using open communication technologies, to control the physical electric grid through the information grid. Protection against cyber attacks is essential to ensure reliable operation of the smart grid. This challenge is addressed by various regulatory, standardization, and research activities. After giving an overview of the security demands of a smart grid, existing and emerging standardization activities are described. (orig.)

  2. Running an open experiment: transparency and reproducibility in soil and ecosystem science

    Science.gov (United States)

    Bond-Lamberty, Ben; Peyton Smith, A.; Bailey, Vanessa

    2016-08-01

    Researchers in soil and ecosystem science, and almost every other field, are being pushed—by funders, journals, governments, and their peers—to increase transparency and reproducibility of their work. A key part of this effort is a move towards open data as a way to fight post-publication data loss, improve data and code quality, enable powerful meta- and cross-disciplinary analyses, and increase trust in, and the efficiency of, publicly-funded research. Many scientists however lack experience in, and may be unsure of the benefits of, making their data and fully-reproducible analyses publicly available. Here we describe a recent ‘open experiment’, in which we documented every aspect of a soil incubation online, making all raw data, scripts, diagnostics, final analyses, and manuscripts available in real time. We found that using tools such as version control, issue tracking, and open-source statistical software improved data integrity, accelerated our team’s communication and productivity, and ensured transparency. There are many avenues to improve scientific reproducibility and data availability, of which this is only one example, and it is not an approach suited for every experiment or situation. Nonetheless, we encourage the communities in our respective fields to consider its advantages, and to lead rather than follow with respect to scientific reproducibility, transparency, and data availability.

  3. Grid Enabled Geospatial Catalogue Web Service

    Science.gov (United States)

    Chen, Ai-Jun; Di, Li-Ping; Wei, Ya-Xing; Liu, Yang; Bui, Yu-Qi; Hu, Chau-Min; Mehrotra, Piyush

    2004-01-01

    Geospatial Catalogue Web Service is a vital service for sharing and interoperating volumes of distributed heterogeneous geospatial resources, such as data, services, applications, and their replicas over the web. Based on Grid technology and the Open Geospatial Consortium's (OGC) Catalogue Service - Web Information Model, this paper proposes a new information model for a Geospatial Catalogue Web Service, named GCWS, which can securely provide Grid-based publishing, managing and querying of geospatial data and services, and transparent access to replica data and related services in a Grid environment. This information model integrates the information model of the Grid Replica Location Service (RLS)/Monitoring & Discovery Service (MDS) with the information model of the OGC Catalogue Service (CSW), and draws on the geospatial data metadata standards from ISO 19115, FGDC and the NASA EOS Core System, and on the service metadata standards from ISO 19119, to extend itself for expressing geospatial resources. Using GCWS, any valid geospatial user who belongs to an authorized Virtual Organization (VO) can securely publish and manage geospatial resources, in particular query on-demand data in the virtual community and retrieve it through data-related services which provide functions such as subsetting, reformatting, reprojection, etc. This work facilitates the sharing and interoperation of geospatial resources in a Grid environment, making geospatial resources Grid-enabled and Grid technologies geospatially enabled. It also allows researchers to focus on science, and not on issues with computing capability, data location, processing and management. GCWS is also a key component for workflow-based virtual geospatial data production.
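    The core pattern the abstract describes (an RLS-style mapping from logical resource names to physical replicas, combined with metadata queries in a catalogue) can be sketched in a few lines. All names, records and URLs below are illustrative assumptions, not the GCWS API.

```python
# Minimal sketch of a catalogue that couples metadata queries with
# an RLS-style replica lookup. Entries and URLs are hypothetical.
from dataclasses import dataclass, field

@dataclass
class CatalogueEntry:
    logical_name: str
    metadata: dict                                   # e.g. ISO 19115-style fields
    replicas: list = field(default_factory=list)     # physical URLs

class MiniCatalogue:
    def __init__(self):
        self._entries = {}

    def publish(self, entry):
        """Register a geospatial resource under its logical name."""
        self._entries[entry.logical_name] = entry

    def query(self, **criteria):
        """Return entries whose metadata matches all key/value pairs."""
        return [e for e in self._entries.values()
                if all(e.metadata.get(k) == v for k, v in criteria.items())]

    def locate(self, logical_name):
        """RLS-style lookup: logical name -> physical replica URLs."""
        entry = self._entries.get(logical_name)
        return entry.replicas if entry else []

cat = MiniCatalogue()
cat.publish(CatalogueEntry(
    "modis/ndvi/2004-01",
    {"theme": "vegetation", "format": "HDF-EOS"},
    ["gsiftp://siteA/modis/ndvi_2004_01.hdf",
     "gsiftp://siteB/modis/ndvi_2004_01.hdf"]))

print(cat.locate("modis/ndvi/2004-01"))   # both replicas of the logical name
```

    In a real Grid deployment the `query` step would run against a CSW-conformant catalogue and `locate` against an RLS index, with VO-based authorization in front of both.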

  4. Science education through open and distance learning at Higher Education level

    Directory of Open Access Journals (Sweden)

    Amrita NIGAM

    2007-10-01

    Full Text Available Abstract The changes faced by society in the past few decades have brought revolution in all areas. Job requirements have changed tremendously. The emergence of e-culture, e-education, e-governance, e-training, e-work sites and so on has questioned the capacity of conventional face-to-face education to cater to all, and the relevance of existing job-related skills in the emerging global society. Today, everyone has to update his or her educational and/or professional skills and competencies to cope with emerging work challenges. This is all the more so in the field of science and technology. At the same time, it is impossible to provide educational and training opportunities to all who aspire to them through the conventional set-up. Open and distance learning (ODL) seems to be one of the viable alternatives. Today, the success and viability of ODL is accepted globally. Coulter (1989), through a study, demonstrated that ODL is a cost-effective medium for providing educational opportunities. Similarly, Holmberg (1981) described ODL as a systematic teaching-learning medium using a variety of media for imparting learning. The present study is an attempt to examine the experiences of the open science learners of IGNOU on different aspects of science higher education. A questionnaire was used to collect the data, and responses from 81 students enrolled for the B.Sc. at IGNOU were collected. The findings of the study report that society has undergone drastic changes in the last few decades. The revolution led by Information and Communication Technologies (ICTs) has widely affected all aspects of society. The emerging jobs require entirely new skills and competencies, e.g., employment in BPOs or switching over to e-governance, e-banking and other e-based sectors. Even e-learning has created numerous expectations of teachers and other personnel. The use of ICTs in almost every field needs adequately trained

  5. HP advances Grid Strategy for the adaptive enterprise

    CERN Multimedia

    2003-01-01

    "HP today announced plans to further enable its enterprise infrastructure technologies for grid computing. By leveraging open grid standards, HP plans to help customers simplify the use and management of distributed IT resources. The initiative will integrate industry grid standards, including the Globus Toolkit and Open Grid Services Architecture (OGSA), across HP's enterprise product lines" (1 page).

  6. Toward Open Science at the European Scale: Geospatial Semantic Array Programming for Integrated Environmental Modelling

    Science.gov (United States)

    de Rigo, Daniele; Corti, Paolo; Caudullo, Giovanni; McInerney, Daniel; Di Leo, Margherita; San-Miguel-Ayanz, Jesús

    2013-04-01

    Interfacing science and policy raises challenging issues when large spatial-scale (regional, continental, global) environmental problems need transdisciplinary integration within a context of modelling complexity and multiple sources of uncertainty [1]. This is characteristic of science-based support for environmental policy at the European scale [1], and key aspects have also long been investigated by European Commission transnational research [2-5]. Parameters of the needed data-transformations T = {T1, …, Tm} (a.5): wide-scale transdisciplinary modelling for environment. Approaches (either of computational science or of policy-making) suitable at a given domain-specific scale may not be appropriate for wide-scale transdisciplinary modelling for environment (WSTMe) and the corresponding policy-making [6-10]. In WSTMe, the characteristic heterogeneity of the available spatial information (a) and the complexity of the required data-transformation modelling (D-TM) call for a paradigm shift in how computational science supports such peculiarly extensive integration processes. In particular, emerging wide-scale integration requirements of typical currently available domain-specific modelling strategies may include increased robustness and scalability, along with enhanced transparency and reproducibility [11-15]. This challenging shift toward open data [16] and reproducible research [11] (open science) is also strongly suggested by the potential, sometimes neglected, huge impact of cascading effects of errors [1,14,17-19] within the impressively growing interconnection among domain-specific computational models and frameworks. From a computational science perspective, transdisciplinary approaches to integrated natural resources modelling and management (INRMM) [20] can exploit advanced geospatial modelling techniques with an awesome battery of free scientific software [21,22] for generating new information and knowledge from the plethora of composite data [23-26].
From the perspective

  7. Opening science the evolving guide on how the Internet is changing research, collaboration and scholarly publishing

    CERN Document Server

    Friesike, Sascha

    2014-01-01

    Modern information and communication technologies, together with a cultural upheaval within the research community, have profoundly changed research in nearly every aspect. Ranging from sharing and discussing ideas in social networks for scientists to new collaborative environments and novel publication formats, knowledge creation and dissemination as we know it is experiencing a vigorous shift towards increased transparency, collaboration and accessibility. Many assume that research workflows will change more in the next 20 years than they have in the last 200. This book provides researchers, decision makers, and other scientific stakeholders with a snapshot of the basics, the tools, and the underlying visions that drive the current scientific (r)evolution, often called ‘Open Science.’

  8. Welcome to Systems — A New Interdisciplinary Open Access Journal for Systems Science and Engineering

    Directory of Open Access Journals (Sweden)

    Thomas Huynh

    2012-04-01

    Full Text Available Natural and human-made systems abound around us. Our solar system, the human body, the food chain, and ecosystems are some examples of natural systems. Some human-made systems are transportation systems, weapon systems, computer systems, software systems, satellite communications systems, ships, missile defense systems, health care systems, the internet, financial systems, and regional economies. Understanding of natural systems is essential to the survival of the human species, which is intertwined with the survival of other species on earth. Having the knowledge and ability to build human-made systems is critical to the employment of systems that effectively serve the needs of their users. To gain such understanding and to acquire such knowledge and ability, it is necessary that cutting-edge research in systems science, systems engineering, and systems-related fields continue. This open access journal aims to achieve quick and global dissemination of results of such research. [...

  9. Listmania. How lists can open up fresh possibilities for research in the history of science.

    Science.gov (United States)

    Delbourgo, James; Müller-Wille, Staffan

    2012-12-01

    Anthropologists, linguists, cultural historians, and literary scholars have long emphasized the value of examining writing as a material practice and have often invoked the list as a paradigmatic example thereof. This Focus section explores how lists can open up fresh possibilities for research in the history of science. Drawing on examples from the early modern period, the contributors argue that attention to practices of list making reveals important relations between mercantile, administrative, and scientific attempts to organize the contents of the world. Early modern lists projected both spatial and temporal visions of nature: they inventoried objects in the process of exchange and collection; they projected possible trajectories for future endeavor; they publicized the social identities of scientific practitioners; and they became research tools that transformed understandings of the natural order.

  10. Supporting the advancement of science: Open access publishing and the role of mandates

    Directory of Open Access Journals (Sweden)

    Phelps Lisa

    2012-01-01

    Full Text Available Abstract In December 2011 the United States House of Representatives introduced a new bill, the Research Works Act (H.R. 3699), which if passed could threaten the public's access to US government funded research. In a digital age when professional and lay parties alike look more and more to the online environment to keep up to date with developments in their fields, does this bill serve the best interests of the community? Those in support of the Research Works Act argue that government open access mandates undermine peer review and take intellectual property from publishers without compensation; however, journals like Journal of Translational Medicine show that this is not the case. Journal of Translational Medicine, in affiliation with the Society for Immunotherapy of Cancer, demonstrates how private and public organisations can work together for the advancement of science.

  11. Formatting Open Science: agilely creating multiple document formats for academic manuscripts with Pandoc Scholar

    Directory of Open Access Journals (Sweden)

    Albert Krewinkel

    2017-05-01

    Full Text Available The timely publication of scientific results is essential for dynamic advances in science. The ubiquitous availability of computers connected to a global network has made the rapid and low-cost distribution of information through electronic channels possible. New concepts, such as Open Access publishing and preprint servers, are currently changing the traditional print media business towards community-driven peer production. However, the cost of scientific literature generation, which is charged to readers, authors or sponsors, is still high. The main active participants in the authoring and evaluation of scientific manuscripts are volunteers, and the cost of online publishing infrastructure is close to negligible. A major time and cost factor is the formatting of manuscripts in the production stage. In this article we demonstrate the feasibility of writing scientific manuscripts in plain Markdown (MD) text files, which can easily be converted into common publication formats, such as PDF, HTML or EPUB, using Pandoc. The simple syntax of Markdown assures the long-term readability of raw files and eases the development of software and workflows. We show the implementation of typical elements of scientific manuscripts (formulas, tables, code blocks and citations) and present tools for editing, collaborative writing and version control. We give an example of how to prepare a manuscript with distinct output formats: a DOCX file for submission to a journal, and a LaTeX/PDF version for deposition as a PeerJ preprint. Further, we implemented new features for supporting 'semantic web' applications, such as the 'journal article tag suite' (JATS) and the 'citation typing ontology' (CiTO) standard. Reducing the work spent on manuscript formatting translates directly into time and cost savings for writers, publishers, readers and sponsors. Therefore, the adoption of the MD format contributes to the agile production of open science
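    The one-source, many-formats workflow the abstract describes can be sketched by building Pandoc command lines for each target format. The flags shown are standard Pandoc options; the file names are illustrative assumptions, and this is not the Pandoc Scholar code itself.

```python
# Sketch: generate Pandoc invocations that turn one Markdown manuscript
# into several publication formats. File names are hypothetical.

def pandoc_cmd(source, target, bibliography=None):
    """Build a Pandoc command line producing the given output file."""
    cmd = ["pandoc", source, "-o", target, "--standalone"]
    if bibliography:
        # --citeproc resolves citations against a bibliography file
        cmd += ["--citeproc", "--bibliography", bibliography]
    return cmd

targets = ["manuscript.pdf", "manuscript.html",
           "manuscript.epub", "manuscript.docx"]
commands = [pandoc_cmd("manuscript.md", t, bibliography="refs.bib")
            for t in targets]

for cmd in commands:
    print(" ".join(cmd))
# subprocess.run(commands[0], check=True) would invoke Pandoc if installed
```

    Keeping the manuscript in one Markdown file and varying only the `-o` target is what makes the submission DOCX and the preprint PDF stay in sync.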

  12. Current Grid operation and future role of the Grid

    Science.gov (United States)

    Smirnova, O.

    2012-12-01

    Grid-like technologies and approaches have become an integral part of HEP experiments. Some other scientific communities also use similar technologies for data-intensive computations. The distinct feature of Grid computing is the ability to federate heterogeneous resources of different ownership into a seamless infrastructure, accessible via a single log-on. Like other infrastructures of a similar nature, Grid functioning requires not only a technologically sound basis, but also reliable operation procedures, monitoring and accounting. The two aspects, technological and operational, are closely related: the weaker the technology, the more burden falls on operations, and vice versa. As of today, Grid technologies are still evolving: at CERN alone, every LHC experiment uses its own Grid-like system. This inevitably creates a heavy load on operations. Infrastructure maintenance, monitoring and incident response are done on several levels, from local system administrators to large international organisations, involving massive human effort worldwide. The necessity to commit substantial resources is one of the obstacles faced by smaller research communities when moving computing to the Grid. Moreover, most current Grid solutions were developed under significant influence of HEP use cases, and thus need additional effort to adapt them to other applications. The reluctance of many non-HEP researchers to use the Grid negatively affects the outlook for national Grid organisations, which strive to provide multi-science services. We started from a situation where Grid organisations were fused with HEP laboratories and national HEP research programmes; we hope to move towards a world where the Grid will ultimately reach the status of a generic public computing and storage service provider, and permanent national and international Grid infrastructures will be established. How far we will be able to advance along this path depends on us.
If no standardisation and convergence efforts take place

  13. Current Grid operation and future role of the Grid

    International Nuclear Information System (INIS)

    Smirnova, O

    2012-01-01

    Grid-like technologies and approaches have become an integral part of HEP experiments. Some other scientific communities also use similar technologies for data-intensive computations. The distinct feature of Grid computing is the ability to federate heterogeneous resources of different ownership into a seamless infrastructure, accessible via a single log-on. Like other infrastructures of a similar nature, Grid functioning requires not only a technologically sound basis, but also reliable operation procedures, monitoring and accounting. The two aspects, technological and operational, are closely related: the weaker the technology, the more burden falls on operations, and vice versa. As of today, Grid technologies are still evolving: at CERN alone, every LHC experiment uses its own Grid-like system. This inevitably creates a heavy load on operations. Infrastructure maintenance, monitoring and incident response are done on several levels, from local system administrators to large international organisations, involving massive human effort worldwide. The necessity to commit substantial resources is one of the obstacles faced by smaller research communities when moving computing to the Grid. Moreover, most current Grid solutions were developed under significant influence of HEP use cases, and thus need additional effort to adapt them to other applications. The reluctance of many non-HEP researchers to use the Grid negatively affects the outlook for national Grid organisations, which strive to provide multi-science services. We started from a situation where Grid organisations were fused with HEP laboratories and national HEP research programmes; we hope to move towards a world where the Grid will ultimately reach the status of a generic public computing and storage service provider, and permanent national and international Grid infrastructures will be established. How far we will be able to advance along this path depends on us.
If no standardisation and convergence efforts take place

  14. Social.Water--Open Source Citizen Science Software for CrowdHydrology

    Science.gov (United States)

    Fienen, M. N.; Lowry, C.

    2013-12-01

    CrowdHydrology is a crowd-sourced citizen science project in which passersby near streams are encouraged to read a gage and send an SMS (text) message with the water level to a number indicated on a sign. The project was initially started using free services such as Google Voice, Gmail, and Google Maps to acquire and present the data on the internet. Social.Water is open-source software, using Python and JavaScript, that automates the acquisition, categorization, and presentation of the data. Open-source objectives pervade both the project and the software, as the code is hosted on GitHub, only free scripting tools are used, and any person or organization can install a gage and join the CrowdHydrology network. In the first year, 10 sites were deployed in upstate New York, USA. In the second year, the network expanded to 44 sites throughout the upper Midwest USA. Comparison with official USGS and academic measurements has shown low error rates. Citizen participation varies greatly from site to site, so surveys and other social information are being sought for insight into why some sites experience higher rates of participation than others.
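    The categorization step Social.Water automates boils down to pulling a station identifier and a water level out of a free-form text message. The message format and station codes below are hypothetical, purely to illustrate the parsing idea, and do not reflect the actual CrowdHydrology conventions.

```python
# Sketch: extract (station, level) from an SMS reading. The pattern
# assumes a made-up station code of two letters plus four digits.
import re

MSG_PATTERN = re.compile(
    r"(?P<station>[A-Z]{2}\d{4})\D+(?P<level>\d+(?:\.\d+)?)")

def parse_reading(sms_text):
    """Return (station_id, water_level) or None if unparseable."""
    m = MSG_PATTERN.search(sms_text.upper())
    if not m:
        return None
    return m.group("station"), float(m.group("level"))

print(parse_reading("Gage NY1001 reads 3.25 ft"))   # ('NY1001', 3.25)
print(parse_reading("hello world"))                 # None
```

    Messages that fail to parse would be queued for manual review rather than discarded, since citizen-submitted text is noisy by nature.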

  15. When data sharing gets close to 100%: what human paleogenetics can teach the open science movement.

    Science.gov (United States)

    Anagnostou, Paolo; Capocasa, Marco; Milia, Nicola; Sanna, Emanuele; Battaggia, Cinzia; Luzi, Daniela; Destro Bisol, Giovanni

    2015-01-01

    This study analyzes data sharing regarding mitochondrial, Y chromosomal and autosomal polymorphisms in a total of 162 papers on ancient human DNA published between 1988 and 2013. The estimated sharing rate was not far from totality (97.6% ± 2.1%) and substantially higher than observed in other fields of genetic research (evolutionary, medical and forensic genetics). Both a questionnaire-based survey and the examination of journals' editorial policies suggest that this high sharing rate cannot be simply explained by the need to comply with stakeholders' requests. Most data were made available through body text, but the use of primary databases increased coinciding with the introduction of complete mitochondrial and next-generation sequencing methods. Our study highlights three important aspects. First, our results imply that researchers' awareness of the importance of openness and transparency for scientific progress may complement stakeholders' policies in achieving very high sharing rates. Second, widespread data sharing does not necessarily coincide with a prevalent use of practices which maximize data findability, accessibility, usability and preservation. A detailed look at the different ways in which data are released can be very useful to detect failures to adopt the best sharing modalities and understand how to correct them. Third and finally, the case of human paleogenetics tells us that a widespread awareness of the importance of Open Science may be important to build reliable scientific practices even in the presence of complex experimental challenges.

  16. BioFed: federated query processing over life sciences linked open data.

    Science.gov (United States)

    Hasnain, Ali; Mehmood, Qaiser; Sana E Zainab, Syeda; Saleem, Muhammad; Warren, Claude; Zehra, Durre; Decker, Stefan; Rebholz-Schuhmann, Dietrich

    2017-03-15

    Biomedical data, e.g. from knowledge bases and ontologies, is increasingly made available following open linked data principles, at best as RDF triple data. This is a necessary step towards unified access to biological data sets, but it still requires solutions for querying multiple endpoints for their heterogeneous data to eventually retrieve all the meaningful information. Suggested solutions are based on query federation approaches, which require the submission of SPARQL queries to endpoints. Due to the size and complexity of available data, these solutions have to be optimised for efficient retrieval times and for users in life sciences research. Last but not least, over time, the reliability of data resources in terms of access and quality has to be monitored. Our solution (BioFed) federates data over 130 SPARQL endpoints in life sciences and tailors query submission according to the provenance information. BioFed has been evaluated against the state-of-the-art solution FedX and forms an important benchmark for the life science domain. The efficient cataloguing approach of the federated query processing system 'BioFed', the triple-pattern-wise source selection and the semantic source normalisation form the core of our solution. It gathers and integrates data from newly identified public endpoints for federated access. Basic provenance information is linked to the retrieved data. Last but not least, BioFed makes use of the latest SPARQL standard (i.e., 1.1) to leverage the full benefits of query federation. The evaluation is based on 10 simple and 10 complex queries, which address data in 10 major and very popular data sources (e.g., DrugBank, SIDER). BioFed is a solution for a single point of access to a large number of SPARQL endpoints providing life science data. It facilitates efficient query generation for data access and provides basic provenance information in combination with the retrieved data.
BioFed fully supports SPARQL 1.1 and gives access to the
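    Triple-pattern-wise source selection, which the abstract names as BioFed's core, means matching each triple pattern of a query against a catalogue of which predicates each endpoint serves, so only relevant endpoints receive sub-queries. The endpoint URLs, prefixes and catalogue below are illustrative assumptions, not BioFed's actual index.

```python
# Sketch: map each triple pattern to the endpoints able to answer it.
# Catalogue contents and URLs are hypothetical.
CATALOGUE = {
    "http://example.org/sparql/drugbank": {"db:interactsWith", "db:target"},
    "http://example.org/sparql/sider":    {"sider:sideEffect"},
}

def select_sources(triple_patterns):
    """Return a plan mapping each triple pattern to candidate endpoints."""
    plan = {}
    for s, p, o in triple_patterns:
        # An endpoint is a candidate if it serves the predicate; a
        # variable predicate ("?p") must be sent to every endpoint.
        plan[(s, p, o)] = [ep for ep, preds in CATALOGUE.items()
                           if p in preds or p.startswith("?")]
    return plan

query = [("?drug", "db:target", "?protein"),
         ("?drug", "sider:sideEffect", "?effect")]
plan = select_sources(query)
for tp, endpoints in plan.items():
    print(tp, "->", endpoints)
```

    In a real federation engine this plan then drives SPARQL 1.1 `SERVICE` clauses, and the catalogue itself is refreshed by probing endpoints for the predicates they expose.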

  17. Opening up animal research and science-society relations? A thematic analysis of transparency discourses in the United Kingdom.

    Science.gov (United States)

    McLeod, Carmen; Hobson-West, Pru

    2016-10-01

    The use of animals in scientific research represents an interesting case to consider in the context of the contemporary preoccupation with transparency and openness in science and governance. In the United Kingdom, organisations critical of animal research have long called for more openness. More recently, organisations involved in animal research also seem to be embracing transparency discourses. This article provides a detailed analysis of publicly available documents from animal protection groups, the animal research community and government/research funders. Our aim is to explore the similarities and differences in the way transparency is constructed and to identify what more openness is expected to achieve. In contrast to the existing literature, we conclude that the slipperiness of transparency discourses may ultimately have transformative implications for the relationship between science and society and that contemporary openness initiatives might be sowing the seeds for change to the status quo. © The Author(s) 2015.

  18. Towards a global participatory platform. Democratising open data, complexity science and collective intelligence

    Science.gov (United States)

    Buckingham Shum, S.; Aberer, K.; Schmidt, A.; Bishop, S.; Lukowicz, P.; Anderson, S.; Charalabidis, Y.; Domingue, J.; de Freitas, S.; Dunwell, I.; Edmonds, B.; Grey, F.; Haklay, M.; Jelasity, M.; Karpištšenko, A.; Kohlhammer, J.; Lewis, J.; Pitt, J.; Sumner, R.; Helbing, D.

    2012-11-01

    The FuturICT project seeks to use the power of big data, analytic models grounded in complexity science, and the collective intelligence they yield for societal benefit. Accordingly, this paper argues that these new tools should not remain the preserve of restricted government, scientific or corporate élites, but be opened up for societal engagement and critique. To democratise such assets as a public good requires a sustainable ecosystem enabling different kinds of stakeholders in society, including but not limited to citizens and advocacy groups, school and university students, policy analysts, scientists, software developers, journalists and politicians. Our working name for envisioning a sociotechnical infrastructure capable of engaging such a wide constituency is the Global Participatory Platform (GPP). We consider what it means to develop a GPP at the different levels of data, models and deliberation, motivating a framework for different stakeholders to find their ecological niches at different levels within the system, serving the functions of (i) sensing the environment in order to pool data, (ii) mining the resulting data for patterns in order to model the past/present/future, and (iii) sharing and contesting possible interpretations of what those models might mean, and in a policy context, possible decisions. A research objective is also to apply the concepts and tools of complexity science and social science to the project's own work. We therefore conceive the global participatory platform as a resilient, epistemic ecosystem, whose design will make it capable of self-organization and adaptation to a dynamic environment, and whose structure and contributions are themselves networks of stakeholders, challenges, issues, ideas and arguments whose structure and dynamics can be modelled and analysed.

  19. New Tasks for Academic Libraries: The Example of the Open Science Lab (Neue Aufgaben für wissenschaftliche Bibliotheken: Das Beispiel Open Science Lab)

    Directory of Open Access Journals (Sweden)

    Lambert Heller

    2015-10-01

    Full Text Available Against the background of the emergence of many new digital tools and methods for supporting scientific work, innovation management has been discussed with increasing frequency among academic librarians in Germany for about five years. How can relevant trends and challenges be recognised in time and taken up adequately with the limited resources of a public-sector institution, up to and including a change in library strategy? This paper discusses the model of the Open Science Lab, established at the German National Library of Science and Technology (TIB) Hannover in 2013. Under the direction of the author, and in close collaboration with researchers, trends are observed and taken up in order to try out new digital tools and methods, to cultivate a new information practice, and to derive innovations for the library's range of services. This is illustrated and discussed using the two focus topics of collaborative writing and linked-data-based research information systems (FIS).

  20. Opening up Openness to Experience: A Four-Factor Model and Relations to Creative Achievement in the Arts and Sciences

    Science.gov (United States)

    Kaufman, Scott Barry

    2013-01-01

    Openness to experience is the broadest personality domain of the Big Five, including a mix of traits relating to intellectual curiosity, intellectual interests, perceived intelligence, imagination, creativity, artistic and aesthetic interests, emotional and fantasy richness, and unconventionality. Likewise, creative achievement is a broad…

  1. Experimental and numerical investigation of water flow through spacer grids of nuclear fuel elements using the OpenFOAM code; Investigação numérica e experimental do escoamento de água através de grades espaçadoras de elementos combustíveis nucleares utilizando o código OpenFOAM

    Energy Technology Data Exchange (ETDEWEB)

    Vidal, Guilherme A.M.; Vieira, Tiago A.S.; Castro, Higor F.P., E-mail: gvidal.ufmg@gmail.com, E-mail: tiago.vieira.eng@gmail.com, E-mail: higorfabiano@hotmail.com [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil). Dept. de Engenharia Mecânica; Santos, André A.C. dos; Silva, Vitor V. A.; Barros Filho, José A., E-mail: aacs@cdtn.br, E-mail: vitors@cdtn.br, E-mail: jabf@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2017-07-01

    With the advancement of computational tools, studies of the thermo-fluid dynamic behavior of nuclear fuel elements have developed considerably in recent years. Among the devices present in these elements, the spacer grids have received the most attention. They keep the fuel rods equally spaced and have fins that improve the heat transfer between the water and the fuel element; the grids therefore serve an important structural and thermal function. This work was carried out to verify and validate simulations of spacer grids using the OpenFOAM (2017) Computational Fluid Dynamics (CFD) software. The simulations were validated against results obtained with the commercial CFD program Ansys CFX, and against experiments available in the literature and obtained in test sections assembled on the Water-Air Circuit (CCA) of the CDTN thermo-hydraulic laboratory.

  2. Power grid control in retreat

    International Nuclear Information System (INIS)

    Morch, Stein

    2000-01-01

    Bilateral grid control that obstructs free trade of electricity is in retreat. Negotiations on opening the Skagerrak cables are in progress. The EU, national authorities, network companies with system responsibility, market actors and electricity exchanges all push for a quick opening of the grid. At present, free trade of electricity is hindered not so much by physical bottlenecks in the grid as by market actors possessing control and bilateral agreements. The article discusses current bilateral agreements and how they might affect the possibility of free trade of electricity in Europe.

  3. Modelling noise propagation using Grid Resources. Progress within GDI-Grid

    Science.gov (United States)

    Kiehle, Christian; Mayer, Christian; Padberg, Alexander; Stapelfeld, Hartmut

    2010-05-01

    GDI-Grid (English: SDI-Grid) is a research project funded by the German Ministry for Science and Education (BMBF). It aims at bridging the gaps between OGC Web Services (OWS) and Grid infrastructures, and at identifying the potential of utilizing the superior storage capacity and computational power of Grid infrastructures for geospatial applications while keeping the well-known service interfaces specified by the OGC. The project considers all major OGC web service interfaces: Web Mapping (WMS), feature access (Web Feature Service), coverage access (Web Coverage Service) and processing (Web Processing Service). The major challenge within GDI-Grid is the harmonization of diverging standards as defined by the standardization bodies for Grid computing and for spatial information exchange. The project started in 2007 and will continue until June 2010. The concept for the gridification of OWS developed by lat/lon GmbH and the Department of Geography of the University of Bonn is applied to three real-world scenarios in order to check its practicability: a flood simulation, a scenario for emergency routing, and a noise propagation simulation. The latter scenario is addressed by Stapelfeldt Ingenieurgesellschaft mbH, located in Dortmund, which is adapting its LimA software to utilize grid resources. Noise mapping of, e.g., traffic noise in urban agglomerations and along major trunk roads is a recurring demand of the EU Noise Directive. The input data required comprise the road network and traffic, terrain, buildings and noise protection screens, as well as the population distribution. Noise impact levels are generally calculated on a 10 m grid and along relevant building facades. For each receiver position, sources within a typical range of 2000 m are split into small segments, depending on the local geometry. For each segment, the propagation analysis includes diffraction effects caused by all obstacles on the path of sound propagation.
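The receiver-level calculation described above (splitting each source into segments and summing their contributions at every receiver position) can be illustrated with a toy energetic summation. This is a sketch only, assuming a bare geometric-divergence term and invented sound power levels and distances; it is not the LimA implementation, which also models diffraction and screening.

```python
import math

def segment_level(lw_db, distance_m):
    """Free-field level from one point-like source segment, keeping only
    the geometric divergence term (a strong simplification)."""
    return lw_db - 20 * math.log10(distance_m) - 11

def receiver_level(segments):
    """Energetically sum the contributions of all source segments
    within range at a single receiver position."""
    total_energy = sum(10 ** (segment_level(lw, d) / 10) for lw, d in segments)
    return 10 * math.log10(total_energy)

# Hypothetical road split into three segments: (sound power dB, distance m)
segments = [(95.0, 50.0), (95.0, 80.0), (95.0, 120.0)]
print(round(receiver_level(segments), 1))
```

Real noise mapping repeats this summation for every cell of the 10 m receiver grid, with per-segment corrections for screening and ground effects.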

  4. Estimating the Technical Potential of Grid-Connected PV Systems in Indonesia : A Comparison of a Method Based on Open Access Data with a Method Based on GIS

    NARCIS (Netherlands)

    Kunaifi, Kunaifi; Reinders, Angelina H.M.E.; Smets, Arno

    2017-01-01

    In this paper, we compare two methods for estimating the technical potential of grid-connected PV systems in Indonesia. One was the method developed by Veldhuis and Reinders [1]; the other is a new method using a Geographic Information System (GIS) and multi-criteria decision making (MCDM). The first

  5. OPEN INNOVATION PROJECT: THE SYSTEM OF ONLINE INDICATORS IN SCIENCE, TECHNOLOGY AND INNOVATION OF AMAZONAS (SiON

    Directory of Open Access Journals (Sweden)

    Moises Andrade Coelho

    2016-05-01

    Full Text Available This study aims to evaluate the implementation of an open innovation project in a public institution in the state of Amazonas. The theoretical and empirical background deals with science, technology and innovation indicators and with open innovation. The study is characterized as qualitative and descriptive research, with the case study as the methodological procedure. The universe was delimited to a public institution in the area of science, technology and innovation (ST&I). In the case study, an approach was used as a tool to assess the implementation of open innovation projects. The results show the several stages of the open innovation project analyzed. The study demonstrates the implications of adopting an open innovation project for the strengthening of external networks and the maturing of the internal environment. The relevance of the study lies in the evaluation of an open innovation project in a public institution, with the aim of fostering the transition from traditional innovation processes to open innovation processes.

  6. CDF GlideinWMS usage in Grid computing of high energy physics

    International Nuclear Information System (INIS)

    Zvada, Marian; Sfiligoi, Igor; Benjamin, Doug

    2010-01-01

    Many members of large science collaborations already have specialized grids available to advance their research, but the need for more computing resources for data analysis has forced the Collider Detector at Fermilab (CDF) collaboration to move beyond dedicated resources and start exploiting Grid resources. Nowadays, the CDF experiment relies increasingly on glidein-based computing pools for data reconstruction, Monte Carlo production and user data analysis, serving over 400 users through the central analysis farm middleware (CAF) on top of the Condor batch system and the CDF Grid infrastructure. Condor has a distributed architecture, and its glidein mechanism of pilot jobs is ideal for abstracting the Grid by creating a virtual private computing pool. We present the first production use of the generic pilot-based Workload Management System (glideinWMS), an implementation of the pilot mechanism based on the Condor distributed infrastructure. CDF Grid computing uses glideinWMS for data reconstruction on the FNAL campus Grid, and for user analysis and Monte Carlo production across the Open Science Grid (OSG). We review this computing model and setup, including the CDF-specific configuration within the glideinWMS system, which provides powerful scalability and makes Grid computing work like a local batch environment, with the ability to handle more than 10000 running jobs at a time.
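The pilot mechanism described above can be sketched as a late-binding loop: a pilot lands on a grid slot, validates the environment, and only then pulls real user jobs from a central queue, so a bad slot never wastes a user job. All names here (`JobQueue`, `run_pilot`, the site dictionary) are hypothetical illustrations of the idea, not glideinWMS APIs.

```python
from collections import deque

class JobQueue:
    """Central queue from which pilots pull work (late binding)."""
    def __init__(self, jobs):
        self.jobs = deque(jobs)

    def next_job(self):
        return self.jobs.popleft() if self.jobs else None

def run_pilot(site, queue, results):
    """A pilot job: lands on a grid slot, checks the environment,
    then keeps pulling real user jobs until the queue is drained."""
    if not site.get("worker_ok", True):      # environment validation step
        return                               # a bad slot consumes no user job
    while (job := queue.next_job()) is not None:
        results.append((site["name"], job))  # job runs inside the pilot's slot

queue = JobQueue(["mc_prod_1", "mc_prod_2", "user_ana_7"])
results = []
run_pilot({"name": "fnal_slot_42", "worker_ok": True}, queue, results)
print(results)
```

The late binding is the key design choice: the decision of which user job runs where is deferred until a validated slot is actually available.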

  7. Open source software and low cost sensors for teaching UAV science

    Science.gov (United States)

    Kefauver, S. C.; Sanchez-Bragado, R.; El-Haddad, G.; Araus, J. L.

    2016-12-01

    Drones, also known as UASs (unmanned aerial systems), UAVs (unmanned aerial vehicles) or RPAS (remotely piloted aircraft systems), are both useful advanced scientific platforms and recreational toys that appeal to younger generations. As such, they can make excellent education tools as well as low-cost alternatives for scientific research projects. However, the step from taking pretty pictures to doing remote sensing science can be daunting if one is presented only with expensive software and sensor options. A number of open-source tools and low-cost platform and sensor options are available that can provide excellent scientific results and, by often requiring more user involvement than commercial software and sensors, provide even greater educational benefits. Scale-invariant feature transform (SIFT) algorithm implementations include the Microsoft Image Composite Editor (ICE), which can create quality 2D image mosaics with some motion and terrain adjustments, and VisualSFM (Structure from Motion), which provides full image mosaicking with movement and orthorectification capabilities. RGB image quantification using alternate color space transforms, such as the BreedPix indices, can be calculated via plugins in the open-source software Fiji (http://fiji.sc/Fiji; http://github.com/george-haddad/CIMMYT). Recent analyses of aerial images from UAVs over different vegetation types and environments have shown that RGB metrics can outperform more costly commercial sensors. Specifically, hue-based pixel counts, the Triangle Greenness Index (TGI) and the Normalized Green Red Difference Index (NGRDI) consistently outperformed NDVI in estimating abiotic and biotic stress impacts on crop health. Simple kits are also available for NDVI camera conversions. Furthermore, multivariate analyses of the different RGB indices in the "R program for statistical computing", such as classification and regression trees, can allow for a more approachable
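The RGB indices named above can be computed directly from per-pixel channel values. A minimal sketch follows, using the commonly cited form NGRDI = (G - R) / (G + R) and a simplified TGI approximation G - 0.39*R - 0.61*B; the exact coefficients used by the BreedPix plugins may differ, and the sample pixel values are invented.

```python
def ngrdi(r, g, b):
    """Normalized Green Red Difference Index: (G - R) / (G + R).
    The blue channel is unused but kept for a uniform signature."""
    return (g - r) / (g + r) if (g + r) else 0.0

def tgi(r, g, b):
    """Triangle Greenness Index, in the simplified RGB-only form
    G - 0.39*R - 0.61*B (an approximation; coefficients vary by source)."""
    return g - 0.39 * r - 0.61 * b

# Hypothetical mean channel values for a healthy-canopy pixel (0..1 scale)
r, g, b = 0.10, 0.30, 0.05
print(round(ngrdi(r, g, b), 2), round(tgi(r, g, b), 4))
```

In practice these functions would be mapped over whole image arrays; higher index values generally indicate greener, healthier vegetation.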

  8. 78 FR 45992 - National Science and Technology Council; Notice of Meeting: Open Meeting of the National Science...

    Science.gov (United States)

    2013-07-30

    ..., Engineering, and Technology Subcommittee National Nanotechnology Coordination Office ACTION: Notice of public meeting. SUMMARY: The National Nanotechnology Coordination Office (NNCO), on behalf of the Nanoscale Science, Engineering, and Technology (NSET) Subcommittee of the Committee on Technology, National Science...

  9. An open source approach to enable the reproducibility of scientific workflows in the ocean sciences

    Science.gov (United States)

    Di Stefano, M.; Fox, P. A.; West, P.; Hare, J. A.; Maffei, A. R.

    2013-12-01

    Every scientist should be able to rerun data analyses conducted by his or her team and regenerate the figures in a paper. However, all too often the correct version of a script goes missing, or the original raw data is filtered by hand and the filtering process is undocumented, or there is a lack of collaboration and communication among scientists working in a team. Here we present three different use cases in the ocean sciences in which end-to-end workflows are tracked. The main tool deployed to address these use cases is based on a web application (IPython Notebook) that provides the ability to work with very diverse and heterogeneous data and information sources, providing an effective way to share and track changes to the source code used to generate data products and associated metadata, as well as to track the overall workflow provenance to allow versioned reproducibility of a data product. The use cases selected for this work are: 1) A partial reproduction of the Ecosystem Status Report (ESR) for the Northeast U.S. Continental Shelf Large Marine Ecosystem. Our goal with this use case is to enable not just the traceability but also the reproducibility of this biannual report, keeping track of all the processes behind the generation and validation of time-series and spatial data and information products. An end-to-end workflow with code versioning is developed so that indicators in the report may be traced back to the source datasets. 2) Real-time generation of web pages to visualize one of the environmental indicators from the Ecosystem Advisory for the Northeast Shelf Large Marine Ecosystem web site. 3) Data and visualization integration for ocean climate forecasting. In this use case, we focus on a workflow that describes how to provide access to online data sources in the NetCDF format and other model data, and we make use of multicore processing to generate video animation from time series of gridded data. For each use case we show how complete workflows
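The traceability goal described above (so a figure can be traced back to its raw inputs and the code version that produced it) can be sketched as a minimal provenance record: content hashes of the inputs plus an identifier for the producing code. The function and file names here are hypothetical illustrations, not part of the IPython Notebook tooling.

```python
import hashlib
import json

def provenance_record(step_name, inputs, code_version):
    """Minimal provenance entry for one workflow step: content hashes
    of the raw inputs plus the version of the code that ran."""
    digests = {name: hashlib.sha256(data).hexdigest()[:12]
               for name, data in inputs.items()}
    return {"step": step_name, "inputs": digests, "code_version": code_version}

# Hypothetical raw input for one indicator figure in the report
raw = {"sst_timeseries.csv": b"year,sst\n2001,12.3\n2002,12.5\n"}
record = provenance_record("esr_figure_3", raw, "git:ab12cd3")
print(json.dumps(record, indent=2))
```

Because the hashes are content-derived, rerunning the step on byte-identical inputs yields the same record, which is what makes a published figure verifiably reproducible.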

  10. Biochemistry - Open TG-GATEs | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available tests. Data file File name: open_tggates_biochemistry.zip File URL: ftp://ftp.biosciencedbc.jp/archive/open-tggates/LATEST/open_tggates_biochemistry.zip File size: 666 KB Simple search URL http://togodb.biosciencedbc.jp/togodb/view/open_tggates_biochemistry#en Data acquisition method - Data analy

  11. Pathological items - Open TG-GATEs | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available . Data file File name: open_tggates_pathology.zip File URL: ftp://ftp.biosciencedbc.jp/archive/open-tggates/LATEST/open_tggates_pathology.zip File size: 89 KB Simple search URL http://togodb.biosciencedbc.jp/togodb/view/open_tggates_pathology#en Data acquisition method We prepared hematoxylin-eosin...

  12. Open-Ended Science Inquiry in Lower Secondary School: Are Students' Learning Needs Being Met?

    Science.gov (United States)

    Whannell, Robert; Quinn, Fran; Taylor, Subhashni; Harris, Katherine; Cornish, Scott; Sharma, Manjula

    2018-01-01

    Australian science curricula have promoted the use of investigations that allow secondary students to engage deeply with the methods of scientific inquiry, through student-directed, open-ended investigations over an extended duration. This study presents the analysis of data relating to the frequency of completion and attitudes towards long…

  13. gCube Grid services

    CERN Document Server

    Andrade, Pedro

    2008-01-01

    gCube is a service-based framework for eScience applications requiring collaborative, on-demand and intensive information processing. It provides these communities with Virtual Research Environments (VREs) to support their activities. gCube is built on top of standard technologies for computational Grids, namely the gLite middleware. The software was produced by the DILIGENT project and will continue to be supported and further developed by the D4Science project. gCube reflects within its name a three-sided interpretation of the Grid vision of resource sharing: sharing of computational resources, sharing of structured data, and sharing of application services. As such, gCube embodies the defining characteristics of computational Grids, data Grids, and virtual data Grids. Specifically, it builds on the gLite middleware for managing distributed computations and unstructured data, includes dedicated services for managing data and metadata, provides services for distributed information retrieval, and allows the orchestration...

  14. Software Uncertainty in Integrated Environmental Modelling: the role of Semantics and Open Science

    Science.gov (United States)

    de Rigo, Daniele

    2013-04-01

    growing debate on open science and scientific knowledge freedom [2,56-59]. In particular, the role of free software has been underlined within the paradigm of reproducible research [50,58-60]. In the spectrum of reproducibility, the free availability of the source code is emphasized [58] as the first step from non-reproducible research (based only on classic peer-reviewed publications) toward reproducibility. Applying this paradigm to WSTMe, an alternative strategy to black boxes would suggest exposing not only final outputs but also key intermediate layers of data and information, along with the corresponding free software D-TM modules. A concise, semantically-enhanced modularization [14,15] may help not only to see the code (a very basic prerequisite for semantic transparency) but also to understand - and correct - it [61]. Semantically-enhanced, concise modularization is supported, e.g., by semantic array programming (SemAP) [14,15] and its extension to geospatial problems [8,10]. Some WSTMe may surely be classified in the subset of software systems which "are growing well past the ability of a small group of people to completely understand the content", while "data from these systems are often used for critical decision making" [52]. In this context, the further uncertainty arising from the unpredicted "(not to say unpredictable)" [53] behaviour of software error propagation in WSTMe should be explicitly considered as software uncertainty [62,63]. The data and information flow of a black-box D-TM is often a (hidden) composition of D-TM modules: semantics and design diversity. Silent faults [64] are a critical class of software errors that alter computation output without evident symptoms - such as premature interruption of the computation (exceptions, error messages, ...), obviously unrealistic results, or anomalous computation patterns (e.g. noticeably shorter/longer or endless computations). As it has been underlined, "many scientific results are corrupted, perhaps fatally so, by
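In the spirit of the semantically-enhanced modularization discussed above, one defense against silent faults is to state semantic preconditions explicitly, so that a violated assumption fails loudly instead of propagating through the data flow. A minimal sketch follows, with invented constraint names and data; it illustrates the general idea of semantic checks, not the SemAP library itself.

```python
def check(constraint, value, name):
    """Enforce a semantic precondition: fail loudly instead of letting
    a silent fault propagate into downstream modules."""
    if not constraint(value):
        raise ValueError(f"semantic check failed for {name!r}")
    return value

# Hypothetical semantic constraints on array-like inputs
nonnegative = lambda xs: all(x >= 0 for x in xs)
same_length = lambda pair: len(pair[0]) == len(pair[1])

rainfall_mm = check(nonnegative, [0.0, 3.2, 1.1], "rainfall_mm")
weights     = check(nonnegative, [0.2, 0.5, 0.3], "weights")
check(same_length, (rainfall_mm, weights), "rainfall_mm/weights")

# Only after all preconditions hold does the actual computation run
weighted_mean = sum(r * w for r, w in zip(rainfall_mm, weights))
print(weighted_mean)
```

Had the checks been omitted, a negative rainfall value or mismatched array lengths would have yielded a plausible-looking but wrong number, which is exactly the silent-fault pattern the abstract warns about.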

  15. Grid technologies and applications: architecture and achievements

    International Nuclear Information System (INIS)

    Ian Foster

    2001-01-01

    The 18 months since CHEP'2000 have seen significant advances in Grid computing, both within and outside high energy physics. While in early 2000 Grid computing was a novel concept to which most CHEP attendees were being exposed for the first time, there is now considerable consensus on Grid architecture, a solid and widely adopted technology base, major funding initiatives, a wide variety of projects developing applications and technologies, and major deployment projects aimed at creating robust Grid infrastructures. The author provides a summary of major developments and trends, focusing on the Globus open source Grid software project and the GriPhyN data grid project

  16. Southampton uni's computer whizzes develop "mini" grid

    CERN Multimedia

    Sherriff, Lucy

    2006-01-01

    "In a bid to help its students explore the potential of grid computing, the University of Southampton's Computer Science department has developed what it calls a "lightweight grid". The system has been designed to allow students to experiment with grid technology without the complexity and inherent security concerns of the real thing. (1 page)

  17. Grid Computing

    Indian Academy of Sciences (India)

    A computing grid interconnects resources such as high performance computers, scientific databases, and computer-controlled scientific instruments of cooperating organizations, each of which is autonomous. It precedes and is quite different from cloud computing, which provides computing resources by vendors to customers ...

  18. Grid Computing

    Indian Academy of Sciences (India)

    IAS Admin

    emergence of supercomputers led to the use of computer simulation as an .... Scientific and engineering applications (e.g., Tera grid secure gateway). Collaborative ... Encryption, privacy, protection from malicious software. Physical Layer.

  19. International Symposium on Grids and Clouds (ISGC) 2017

    Science.gov (United States)

    2017-03-01

    The International Symposium on Grids and Clouds (ISGC) 2017 will be held at Academia Sinica in Taipei, Taiwan from 5-10 March 2017, with co-located events and workshops. The main theme of ISGC 2017 is "Global Challenges: From Open Data to Open Science". The unprecedented progress in ICT has transformed the way education is conducted and research is carried out. The emerging global e-Infrastructure, championed by global science communities such as High Energy Physics, Astronomy, and Biomedicine, must permeate into other sciences. Many areas, such as climate change, disaster mitigation, and human sustainability and well-being, represent global challenges where collaboration over e-Infrastructure will presumably help resolve the common problems of the people who are impacted. Access to global e-Infrastructure also helps the less globally organized, long-tail sciences, with their own collaboration challenges. Open data are not only a political phenomenon serving government transparency; they also create an opportunity to eliminate access barriers to all scientific data, specifically data from the global sciences and regional data that concern natural phenomena and people. In this regard, the purpose of open data is to improve the sciences, accelerating specifically those that may benefit people. Nevertheless, eliminating barriers to open data is itself a daunting task, and the barriers to individuals, institutions and big collaborations are manifold. Open science is a step beyond open data, where the tools and understanding of scientific data must be made available to whoever is interested in participating in such scientific research. The promotion of open science may change the academic tradition practiced over the past few hundred years. This change of dynamics may contribute to the resolution of common challenges of human sustainability, where the current pace of scientific progress is not sufficiently fast. ISGC 2017 created a face-to-face venue where individual

  20. ReSS: Resource Selection Service for National and Campus Grid Infrastructure

    International Nuclear Information System (INIS)

    Mhashilkar, Parag; Garzoglio, Gabriele; Levshina, Tanya; Timm, Steve

    2010-01-01

    The Open Science Grid (OSG) offers access to around one hundred compute elements (CEs) and storage elements (SEs) via standard Grid interfaces. The Resource Selection Service (ReSS) is a push-based workload management system that is integrated with the OSG information systems and resources. ReSS integrates standard Grid tools such as Condor, as a brokering service, and the gLite CEMon, for gathering and publishing resource information in GLUE Schema format. ReSS is used in OSG by Virtual Organizations (VOs) such as the Dark Energy Survey (DES), DZero and the Engagement VO. ReSS is also used as a resource selection service for campus Grids, such as FermiGrid. VOs use ReSS to automate resource selection in their workload management systems to run jobs over the grid. In the past year, the system has been enhanced to enable publication and selection of storage resources and of any special software or software libraries (such as MPI libraries) installed at computing resources. In this paper, we discuss the Resource Selection Service and its typical usage at the two scales of a national cyberinfrastructure Grid, such as OSG, and of a campus Grid, such as FermiGrid.
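The brokering step that ReSS delegates to Condor can be illustrated as ClassAd-style matchmaking: each job ad states requirements that are evaluated against the attributes published in resource ads, and only matching resources are candidates for the job. The attribute names and values below are hypothetical illustrations, not the GLUE Schema vocabulary or Condor's ClassAd language.

```python
def matching_resources(job, resources):
    """Return the names of resource ads whose published attributes
    satisfy every requirement expressed by the job ad."""
    def satisfies(ad):
        return (ad.get("free_slots", 0) >= job["min_slots"]
                and ad.get("memory_mb", 0) >= job["min_memory_mb"]
                and job["required_software"] <= set(ad.get("software", [])))
    return [ad["name"] for ad in resources if satisfies(ad)]

# Hypothetical resource ads, e.g. as published via an information system
resources = [
    {"name": "fermigrid_ce1", "free_slots": 120, "memory_mb": 4096,
     "software": ["mpi", "root"]},
    {"name": "campus_ce2", "free_slots": 4, "memory_mb": 2048,
     "software": ["root"]},
]

# A job ad requiring MPI, e.g. for a parallel workload
job = {"min_slots": 8, "min_memory_mb": 2048, "required_software": {"mpi"}}
print(matching_resources(job, resources))
```

A real matchmaker also ranks the matches (e.g. by free slots or locality) before dispatching; this sketch stops at the requirements filter.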

  1. ReSS: Resource Selection Service for National and Campus Grid Infrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Mhashilkar, Parag; Garzoglio, Gabriele; Levshina, Tanya; Timm, Steve, E-mail: parag@fnal.go, E-mail: garzogli@fnal.go, E-mail: tlevshin@fnal.go, E-mail: timm@fnal.go [Fermi National Accelerator Laboratory, P O Box 500, Batavia, IL - 60510 (United States)

    2010-04-01

    The Open Science Grid (OSG) offers access to around one hundred compute elements (CEs) and storage elements (SEs) via standard Grid interfaces. The Resource Selection Service (ReSS) is a push-based workload management system that is integrated with the OSG information systems and resources. ReSS integrates standard Grid tools such as Condor, as a brokering service, and the gLite CEMon, for gathering and publishing resource information in GLUE Schema format. ReSS is used in OSG by Virtual Organizations (VOs) such as the Dark Energy Survey (DES), DZero and the Engagement VO. ReSS is also used as a resource selection service for campus Grids, such as FermiGrid. VOs use ReSS to automate resource selection in their workload management systems to run jobs over the grid. In the past year, the system has been enhanced to enable publication and selection of storage resources and of any special software or software libraries (such as MPI libraries) installed at computing resources. In this paper, we discuss the Resource Selection Service and its typical usage at the two scales of a national cyberinfrastructure Grid, such as OSG, and of a campus Grid, such as FermiGrid.

  2. ReSS: Resource Selection Service for National and Campus Grid Infrastructure

    International Nuclear Information System (INIS)

    Mhashilkar, Parag; Garzoglio, Gabriele; Levshina, Tanya; Timm, Steve

    2009-01-01

    The Open Science Grid (OSG) offers access to around one hundred compute elements (CEs) and storage elements (SEs) via standard Grid interfaces. The Resource Selection Service (ReSS) is a push-based workload management system that is integrated with the OSG information systems and resources. ReSS integrates standard Grid tools such as Condor, as a brokering service, and the gLite CEMon, for gathering and publishing resource information in GLUE Schema format. ReSS is used in OSG by Virtual Organizations (VOs) such as the Dark Energy Survey (DES), DZero and the Engagement VO. ReSS is also used as a resource selection service for campus Grids, such as FermiGrid. VOs use ReSS to automate resource selection in their workload management systems to run jobs over the grid. In the past year, the system has been enhanced to enable publication and selection of storage resources and of any special software or software libraries (such as MPI libraries) installed at computing resources. In this paper, we discuss the Resource Selection Service and its typical usage at the two scales of a national cyberinfrastructure Grid, such as OSG, and of a campus Grid, such as FermiGrid.

  3. Data privacy for the smart grid

    CERN Document Server

    Herold, Rebecca

    2015-01-01

    The Smart Grid and Privacy. What Is the Smart Grid? Changes from Traditional Energy Delivery. Smart Grid Possibilities. Business Model Transformations. Emerging Privacy Risks. The Need for Privacy Policies. Privacy Laws, Regulations, and Standards. Privacy-Enhancing Technologies. New Privacy Challenges. IoT. Big Data. What Is the Smart Grid? Market and Regulatory Overview. Traditional Electricity Business Sector. The Electricity Open Market. Classifications of Utilities. Rate-Making Processes. Electricity Consumer

  4. Fostering Data Openness by Enabling Science: A Proposal for Micro-Funding

    Directory of Open Access Journals (Sweden)

    Brian Rappert

    2017-09-01

    Full Text Available In recent years, the promotion of data sharing has come with the recognition that not all scientists around the world are equally placed to partake in such activities. Notably, those within developing countries are sometimes regarded as experiencing hardware infrastructure challenges and data management skill shortages. Proposed remedies often focus on the provision of information and communication technology as well as enhanced data management training. Building on prior empirical social research undertaken in sub-Saharan Africa, this article provides a complementary but alternative proposal: fostering data openness by enabling research. Towards this end, the underlying rationale is outlined for a 'bottom-up' system of research support that addresses the day-to-day demands of low-resourced environments. This approach draws on lessons from development financial assistance programs of recent decades. In doing so, the article provides an initial framework for science funding that calls for holding together concerns about ensuring that research can be undertaken in low-resourced laboratory environments with concerns about whether the data generated in such settings can be shared.

  5. Grid Interoperation with ARC middleware for the CMS experiment

    International Nuclear Information System (INIS)

    Edelmann, Erik; Groenager, Michael; Johansson, Daniel; Kleist, Josva; Field, Laurence; Qing, Di; Frey, Jaime; Happonen, Kalle; Klem, Jukka; Koivumaeki, Jesper; Linden, Tomas; Pirinen, Antti

    2010-01-01

    The Compact Muon Solenoid (CMS) is one of the general purpose experiments at the CERN Large Hadron Collider (LHC). CMS computing relies on different grid infrastructures to provide computational and storage resources. The major grid middleware stacks used for CMS computing are gLite, Open Science Grid (OSG) and ARC (Advanced Resource Connector). Helsinki Institute of Physics (HIP) hosts one of the Tier-2 centers for CMS computing. CMS Tier-2 centers operate software systems for data transfers (PhEDEx), Monte Carlo production (ProdAgent) and data analysis (CRAB). In order to provide the Tier-2 services for CMS, HIP uses tools and components from both the ARC and gLite grid middleware stacks. Interoperation between grid systems is a challenging problem, and HIP uses two different solutions to provide the needed services. The first solution is based on gLite-ARC grid-level interoperability. This allows ARC resources to be used in CMS without modifying the CMS application software. The second solution is based on developing specific ARC plugins in CMS software.

  6. Grid Interoperation with ARC middleware for the CMS experiment

    Energy Technology Data Exchange (ETDEWEB)

    Edelmann, Erik; Groenager, Michael; Johansson, Daniel; Kleist, Josva [Nordic DataGrid Facility, Kastruplundgade 22, 1., DK-2770 Kastrup (Denmark); Field, Laurence; Qing, Di [CERN, CH-1211 Geneve 23 (Switzerland); Frey, Jaime [University of Wisconsin-Madison, 1210 W. Dayton St., Madison, WI (United States); Happonen, Kalle; Klem, Jukka; Koivumaeki, Jesper; Linden, Tomas; Pirinen, Antti, E-mail: Jukka.Klem@cern.c [Helsinki Institute of Physics, PO Box 64, FIN-00014 University of Helsinki (Finland)

    2010-04-01

    The Compact Muon Solenoid (CMS) is one of the general purpose experiments at the CERN Large Hadron Collider (LHC). CMS computing relies on different grid infrastructures to provide computational and storage resources. The major grid middleware stacks used for CMS computing are gLite, Open Science Grid (OSG) and ARC (Advanced Resource Connector). Helsinki Institute of Physics (HIP) hosts one of the Tier-2 centers for CMS computing. CMS Tier-2 centers operate software systems for data transfers (PhEDEx), Monte Carlo production (ProdAgent) and data analysis (CRAB). In order to provide the Tier-2 services for CMS, HIP uses tools and components from both the ARC and gLite grid middleware stacks. Interoperation between grid systems is a challenging problem, and HIP uses two different solutions to provide the needed services. The first solution is based on gLite-ARC grid-level interoperability. This allows ARC resources to be used in CMS without modifying the CMS application software. The second solution is based on developing specific ARC plugins in CMS software.

  7. Grid Interoperation with ARC Middleware for the CMS Experiment

    CERN Document Server

    Edelmann, Erik; Frey, Jaime; Gronager, Michael; Happonen, Kalle; Johansson, Daniel; Kleist, Josva; Klem, Jukka; Koivumaki, Jesper; Linden, Tomas; Pirinen, Antti; Qing, Di

    2010-01-01

    The Compact Muon Solenoid (CMS) is one of the general purpose experiments at the CERN Large Hadron Collider (LHC). CMS computing relies on different grid infrastructures to provide computational and storage resources. The major grid middleware stacks used for CMS computing are gLite, Open Science Grid (OSG) and ARC (Advanced Resource Connector). Helsinki Institute of Physics (HIP) hosts one of the Tier-2 centers for CMS computing. CMS Tier-2 centers operate software systems for data transfers (PhEDEx), Monte Carlo production (ProdAgent) and data analysis (CRAB). In order to provide the Tier-2 services for CMS, HIP uses tools and components from both the ARC and gLite grid middleware stacks. Interoperation between grid systems is a challenging problem, and HIP uses two different solutions to provide the needed services. The first solution is based on gLite-ARC grid-level interoperability. This allows ARC resources to be used in CMS without modifying the CMS application software. The second solution is based on developing specific ARC plugins in CMS software.

  8. How partnership accelerates Open Science: High Energy Physics and INSPIRE, a case study of a complex repository ecosystem

    CERN Document Server

    AUTHOR|(CDS)2079501; Hecker, Bernard Louis; Holtkamp, Annette; Mele, Salvatore; O'Connell, Heath; Sachs, Kirsten; Simko, Tibor; Schwander, Thorsten

    2013-01-01

    Public calls, agency mandates and scientist demand for Open Science are by now a reality with different nuances across diverse research communities. A complex “ecosystem” of services and tools, mostly community-driven, will underpin this revolution in science. Repositories stand to accelerate this process, as “openness” evolves beyond text, in lockstep with scholarly communication. We present a case study of a global discipline, High-Energy Physics (HEP), where most of these transitions have already taken place in a “social laboratory” of multiple global information services interlinked in a complex, but successful, ecosystem at the service of scientists. We discuss our first-hand experience, at a technical and organizational level, of leveraging partnership across repositories and with the user community in support of Open Science, along threads relevant to the OR2013 community.

  9. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    Science.gov (United States)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill

    2000-01-01

    We use the term "Grid" to refer to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. This infrastructure includes: (1) Tools for constructing collaborative, application oriented Problem Solving Environments / Frameworks (the primary user interfaces for Grids); (2) Programming environments, tools, and services providing various approaches for building applications that use aggregated computing and storage resources, and federated data sources; (3) Comprehensive and consistent set of location-independent tools and services for accessing and managing dynamic collections of widely distributed resources: heterogeneous computing systems, storage systems, real-time data sources and instruments, human collaborators, and communications systems; (4) Operational infrastructure including management tools for distributed systems and distributed resources, user services, accounting and auditing, strong and location-independent user authentication and authorization, and overall system security services. The vision for NASA's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information-based problem solving environments / frameworks. Such Grids will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems. Examples of these problems include: (1) Coupled, multidisciplinary simulations too large for single systems (e.g., multi-component NPSS turbomachine simulation); (2) Use of widely distributed, federated data archives (e.g., simultaneous access to meteorological, topological, aircraft performance, and flight path scheduling databases supporting a National Air Space Simulation system); (3

  10. Edaq530: a transparent, open-end and open-source measurement solution in natural science education

    Energy Technology Data Exchange (ETDEWEB)

    Kopasz, Katalin; Makra, Peter; Gingl, Zoltan, E-mail: phil@titan.physx.u-szeged.hu [Department of Experimental Physics, University of Szeged, Dom ter 9, Szeged, H6720 (Hungary)

    2011-03-15

    We present Edaq530, a low-cost, compact and easy-to-use digital measurement solution consisting of a thumb-sized USB-to-sensor interface and measurement software. The solution is fully open-source, our aim being to provide a viable alternative to professional solutions. Our main focus in designing Edaq530 has been versatility and transparency. In this paper, we shall introduce the capabilities of Edaq530, complement it by showing a few sample experiments, and discuss the feedback we have received in the course of a teacher training workshop in which the participants received personal copies of Edaq530 and later made reports on how they could utilize Edaq530 in their teaching.

  11. Building Repository Networks with DRIVER: Joint Explorations with the IR Grid in China

    OpenAIRE

    Horstmann, Wolfram; Rosemann, Uwe

    2009-01-01

    Scenarios of collaboration for supporting Open Access to research results through institutional repository networks have been explored between the IR Grid of the National Science Library of the Chinese Academy of Sciences and the European DRIVER-Initiative through its German partners Bielefeld University Library as well as the State and University Library Göttingen. The activities included a joint analysis of the DRIVER infrastructure software D-NET and also resulted in the registration of ch...

  12. Grid and Entrepreneurship Workshop

    CERN Multimedia

    2006-01-01

    The CERN openlab is organising a special workshop about Grid opportunities for entrepreneurship. This one-day event will provide an overview of what is involved in spin-off technology, with a special reference to the context of computing and data Grids. Lectures by experienced entrepreneurs will introduce the key concepts of entrepreneurship and review, in particular, the industrial potential of EGEE (the EU co-funded Enabling Grids for E-sciencE project, led by CERN). Case studies will be given by CEOs of European start-ups already active in the Grid and computing cluster area, and regional experts will provide an overview of efforts in several European regions to stimulate entrepreneurship. This workshop is designed to encourage students and researchers involved or interested in Grid technology to consider the entrepreneurial opportunities that this technology may create in the coming years. This workshop is organized as part of the CERN openlab student programme, which is co-sponsored by CERN, HP, ...

  13. Hematology - Open TG-GATEs | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available ...in vivo tests. Data file File name: open_tggates_hematology.zip File URL: ftp://ftp.biosciencedbc.jp/archive/open-tggates/LATEST/open_tggates_hematology.zip File size: 636 KB Simple search URL http://togodb.biosciencedbc.jp/togodb/view/open_tggates_hematology#en Data acquisition method - Data analysi...

  14. Body weight - Open TG-GATEs | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Data file File URL: ftp://ftp.biosciencedbc.jp/archive/open-tggates/LATEST/open_tggates_body_... Body weight - Open TG-GATEs | LSDB Archive ...

  15. Grid attacks avian flu

    CERN Multimedia

    2006-01-01

    During April, a collaboration of Asian and European laboratories analysed 300,000 possible drug components against the avian flu virus H5N1 using the EGEE Grid infrastructure. Schematic presentation of the avian flu virus.The distribution of the EGEE sites in the world on which the avian flu scan was performed. The goal was to find potential compounds that can inhibit the activities of an enzyme on the surface of the influenza virus, the so-called neuraminidase, subtype N1. Using the Grid to identify the most promising leads for biological tests could speed up the development process for drugs against the influenza virus. Co-ordinated by CERN and funded by the European Commission, the EGEE project (Enabling Grids for E-sciencE) aims to set up a worldwide grid infrastructure for science. The challenge of the in silico drug discovery application is to identify those molecules which can dock on the active sites of the virus in order to inhibit its action. To study the impact of small scale mutations on drug r...

  16. Tenure-Track Science Faculty and the 'Open Access Citation Effect'

    Directory of Open Access Journals (Sweden)

    R. Christopher Doty

    2013-02-01

    Full Text Available INTRODUCTION The observation that open access (OA) articles receive more citations than subscription-based articles is known as the OA citation effect (OACE). Implicit in many OACE studies is the belief that authors are heavily invested in the number of citations their articles receive. This study seeks to determine what influence the OACE has on the decision-making process of tenure-track science faculty when they consider where to submit a manuscript for publication. METHODS Fifteen tenure-track faculty members in the Departments of Biology and Chemistry at the University of North Carolina at Chapel Hill participated in semi-structured interviews employing a variation of the critical incident technique. RESULTS Seven of the fifteen faculty members said they would consider making a future article freely available based on the OACE. Due to dramatically different expectations with respect to the size of the OACE, however, only one of them is likely to seriously consider the OACE when deciding where to submit their next manuscript for publication. DISCUSSION Journal reputation and audience, and the quality of the editorial and review process are the most important factors in deciding where to submit a manuscript for publication. Once a subset of journals has satisfied these criteria, financial and access issues compete with the OACE in making a final decision. CONCLUSION In order to increase the number of OA materials, librarians should continue to emphasize depositing pre- and post-prints in disciplinary and institutional repositories and retaining author rights prior to publication in order to make it possible to do so.

  17. The Martian Goes To College: Open Inquiry with Science Fiction in the Classroom.

    Science.gov (United States)

    Beatty, L.; Patterson, J. D.

    2015-12-01

    Storytelling is an ancient art; one that can get lost in the reams of data available in a typical geology or astronomy classroom. But storytelling draws us to a magical place. Our students, with prior experience in either a geology or astronomy course, were invited to explore Mars in a special topics course at Johnson County Community College through reading The Martian by Andy Weir. As they traveled with astronaut Mark Watney, the students used Google Mars, Java Mission-planning and Analysis for Remote Sensing (JMARS), and learning modules from the Mars for Earthlings web site to investigate the terrain and the processes at work in the past and present on Mars. Our goal was to apply their understanding of processes on Earth in order to explain and predict what they observed on Mars courtesy of the remote sensing opportunities available from Viking, Pathfinder, the Mars Exploration Rovers, and Maven missions; sort of an inter-planetary uniformitarianism. Astronaut Mark Watney's fictional journey from Acidalia Planitia to Schiaparelli Crater was analyzed using learning modules in Mars for Earthlings and exercises that we developed based on Google Mars, JMARS, Rotating Sky Explorer, and Science Friday podcasts. Each student also completed an individual project that either focused on a particular region that Astronaut Mark Watney traveled through or a problem that he faced. Through this open-inquiry learning style, they determined some processes that shaped Mars such as crater impacts, volcanism, fluid flow, mass movement, and groundwater sapping and also investigated the efficacy of solar energy as a power source based on location and the likelihood of regolith potential as a mineral matter source for soil.

  18. Accelerating target discovery using pre-competitive open science-patients need faster innovation more than anyone else.

    Science.gov (United States)

    Low, Eric; Bountra, Chas; Lee, Wen Hwa

    2016-01-01

    We are experiencing a new era enabled by unencumbered access to high quality data through the emergence of open science initiatives in the historically challenging area of early stage drug discovery. At the same time, many patient-centric organisations are taking matters into their own hands by participating in, enabling and funding research. Here we present the rationale behind the innovative partnership between the Structural Genomics Consortium (SGC)-an open, pre-competitive pre-clinical research consortium and the research-focused patient organisation Myeloma UK to create a new, comprehensive platform to accelerate the discovery and development of new treatments for multiple myeloma.

  19. The Grid2003 Production Grid Principles and Practice

    CERN Document Server

    Foster, I; Gose, S; Maltsev, N; May, E; Rodríguez, A; Sulakhe, D; Vaniachine, A; Shank, J; Youssef, S; Adams, D; Baker, R; Deng, W; Smith, J; Yu, D; Legrand, I; Singh, S; Steenberg, C; Xia, Y; Afaq, A; Berman, E; Annis, J; Bauerdick, L A T; Ernst, M; Fisk, I; Giacchetti, L; Graham, G; Heavey, A; Kaiser, J; Kuropatkin, N; Pordes, R; Sekhri, V; Weigand, J; Wu, Y; Baker, K; Sorrillo, L; Huth, J; Allen, M; Grundhoefer, L; Hicks, J; Luehring, F C; Peck, S; Quick, R; Simms, S; Fekete, G; Van den Berg, J; Cho, K; Kwon, K; Son, D; Park, H; Canon, S; Jackson, K; Konerding, D E; Lee, J; Olson, D; Sakrejda, I; Tierney, B; Green, M; Miller, R; Letts, J; Martin, T; Bury, D; Dumitrescu, C; Engh, D; Gardner, R; Mambelli, M; Smirnov, Y; Voeckler, J; Wilde, M; Zhao, Y; Zhao, X; Avery, P; Cavanaugh, R J; Kim, B; Prescott, C; Rodríguez, J; Zahn, A; McKee, S; Jordan, C; Prewett, J; Thomas, T; Severini, H; Clifford, B; Deelman, E; Flon, L; Kesselman, C; Mehta, G; Olomu, N; Vahi, K; De, K; McGuigan, P; Sosebee, M; Bradley, D; Couvares, P; De Smet, A; Kireyev, C; Paulson, E; Roy, A; Koranda, S; Moe, B; Brown, B; Sheldon, P

    2004-01-01

    The Grid2003 Project has deployed a multi-virtual organization, application-driven grid laboratory ("Grid3") that has sustained for several months the production-level services required by physics experiments of the Large Hadron Collider at CERN (ATLAS and CMS), the Sloan Digital Sky Survey project, the gravitational wave search experiment LIGO, the BTeV experiment at Fermilab, as well as applications in molecular structure analysis and genome analysis, and computer science research projects in such areas as job and data scheduling. The deployed infrastructure has been operating since November 2003 with 27 sites, a peak of 2800 processors, workloads from 10 different applications exceeding 1300 simultaneous jobs, and data transfers among sites of greater than 2 TB/day. We describe the principles that have guided the development of this unique infrastructure and the practical experiences that have resulted from its creation and use. We discuss application requirements for grid services deployment and configur...

  20. Smart grid security innovative solutions for a modernized grid

    CERN Document Server

    Skopik, Florian

    2015-01-01

    The Smart Grid security ecosystem is complex and multi-disciplinary, and relatively under-researched compared to the traditional information and network security disciplines. While the Smart Grid has provided increased efficiencies in monitoring power usage, directing power supplies to serve peak power needs and improving efficiency of power delivery, the Smart Grid has also opened the way for information security breaches and other types of security breaches. Potential threats range from meter manipulation to directed, high-impact attacks on critical infrastructure that could bring down regi

  1. 15 MW HArdware-in-the-loop Grid Simulation Project

    Energy Technology Data Exchange (ETDEWEB)

    Rigas, Nikolaos [Clemson Univ., SC (United States); Fox, John Curtiss [Clemson Univ., SC (United States); Collins, Randy [Clemson Univ., SC (United States); Tuten, James [Clemson Univ., SC (United States); Salem, Thomas [Clemson Univ., SC (United States); McKinney, Mark [Clemson Univ., SC (United States); Hadidi, Ramtin [Clemson Univ., SC (United States); Gislason, Benjamin [Clemson Univ., SC (United States); Boessneck, Eric [Clemson Univ., SC (United States); Leonard, Jesse [Clemson Univ., SC (United States)

    2014-10-31

    The goal of the 15 MW Hardware-in-the-Loop (HIL) Grid Simulator project was to (1) design, (2) construct and (3) commission a state-of-the-art grid integration testing facility for testing of multi-megawatt devices through a ‘shared facility’ model open to all innovators, promoting the rapid introduction of new technology in the energy market to lower the cost of energy delivered. The 15 MW HIL Grid Simulator project now serves as the cornerstone of the Duke Energy Electric Grid Research, Innovation and Development (eGRID) Center. This project leveraged the 24 kV utility interconnection and electrical infrastructure of the US DOE EERE funded WTDTF project at the Clemson University Restoration Institute in North Charleston, SC. Additionally, the project has spurred interest from other technology sectors, including large PV inverter and energy storage testing and several leading-edge research proposals dealing with smart grid technologies, grid modernization and grid cyber security. The key components of the project are the power amplifier units, capable of providing up to 20 MW of defined power to the research grid. The project has also developed a one-of-a-kind solution to performing fault ride-through testing by combining a reactive divider network and a large power converter into a hybrid method. This unique hybrid method of performing fault ride-through analysis will allow the research team at the eGRID Center to investigate the complex differences between the alternative methods of performing fault ride-through evaluations and will ultimately further the science behind this testing. With the final goal of being able to perform HIL experiments and demonstration projects, the eGRID team undertook a significant challenge with respect to developing a control system that is capable of communicating with several different pieces of equipment using different communication protocols in real time. The eGRID team developed a custom fiber optical network that is based upon FPGA

  2. Shining examples of grid applications

    CERN Multimedia

    Hammerle, Hannelore

    2006-01-01

    Users in more than 150 virtual organisations from fields as diverse as biomedicine, earth sciences and high-energy physics are now using the distributed computing infrastructure of the Enabling Grids for E-sciencE (EGEE) project, which shows the wide adoption and versatility of this new technology (1 page)

  3. Food consumption - Open TG-GATEs | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available c00954-01-011 Description of data contents The list regarding results of food consumption measurement acquired from rats used in the in vivo tests. Data file File name: open_tggates_food_consumption.zip File URL: ftp://ftp.biosciencedbc.jp/archive/open-tggates/LATEST/open_tggates_food_consumption.zip File size: 108 KB Simple search URL http://togodb.biosciencedbc.jp/togodb/view/open_tggates_food_consumption#en Data acquisition method The amount of daily food intake of the first day is calculated as the amount of food...

  4. Ecosystem Based Business Model of Smart Grid

    DEFF Research Database (Denmark)

    Lundgaard, Morten Raahauge; Ma, Zheng; Jørgensen, Bo Nørregaard

    2015-01-01

    This paper investigates the ecosystem-based business model in a smart grid infrastructure and the potential of value capture in a highly complex macro infrastructure such as the smart grid. This paper proposes an alternative perspective for studying the smart grid business ecosystem to support the infrastructural challenges, such as the interoperability of business components for the smart grid. So far little research has explored the business ecosystem in the smart grid concept. Studying the smart grid with the theory of business ecosystems may open opportunities to understand market catalysts. This study contributes an understanding of business ecosystems applicable to the smart grid. The smart grid infrastructure is an intricate business ecosystem with several intentions regarding the value proposition it should deliver. The findings help to identify and capture value from markets.

  5. Coalescent: an open-science framework for importance sampling in coalescent theory.

    Science.gov (United States)

    Tewari, Susanta; Spouge, John L

    2015-01-01

    Background. In coalescent theory, computer programs often use importance sampling to calculate likelihoods and other statistical quantities. An importance sampling scheme can exploit human intuition to improve statistical efficiency of computations, but unfortunately, in the absence of general computer frameworks on importance sampling, researchers often struggle to translate new sampling schemes computationally or benchmark against different schemes, in a manner that is reliable and maintainable. Moreover, most studies use computer programs lacking a convenient user interface or the flexibility to meet the current demands of open science. In particular, current computer frameworks can only evaluate the efficiency of a single importance sampling scheme or compare the efficiencies of different schemes in an ad hoc manner. Results. We have designed a general framework (http://coalescent.sourceforge.net; language: Java; License: GPLv3) for importance sampling that computes likelihoods under the standard neutral coalescent model of a single, well-mixed population of constant size over time following infinite sites model of mutation. The framework models the necessary core concepts, comes integrated with several data sets of varying size, implements the standard competing proposals, and integrates tightly with our previous framework for calculating exact probabilities. For a given dataset, it computes the likelihood and provides the maximum likelihood estimate of the mutation parameter. Well-known benchmarks in the coalescent literature validate the accuracy of the framework. The framework provides an intuitive user interface with minimal clutter. For performance, the framework switches automatically to modern multicore hardware, if available. It runs on three major platforms (Windows, Mac and Linux). Extensive tests and coverage make the framework reliable and maintainable. Conclusions. In coalescent theory, many studies of computational efficiency consider only

  6. Coalescent: an open-science framework for importance sampling in coalescent theory

    Directory of Open Access Journals (Sweden)

    Susanta Tewari

    2015-08-01

    Full Text Available Background. In coalescent theory, computer programs often use importance sampling to calculate likelihoods and other statistical quantities. An importance sampling scheme can exploit human intuition to improve statistical efficiency of computations, but unfortunately, in the absence of general computer frameworks on importance sampling, researchers often struggle to translate new sampling schemes computationally or benchmark against different schemes, in a manner that is reliable and maintainable. Moreover, most studies use computer programs lacking a convenient user interface or the flexibility to meet the current demands of open science. In particular, current computer frameworks can only evaluate the efficiency of a single importance sampling scheme or compare the efficiencies of different schemes in an ad hoc manner. Results. We have designed a general framework (http://coalescent.sourceforge.net; language: Java; License: GPLv3) for importance sampling that computes likelihoods under the standard neutral coalescent model of a single, well-mixed population of constant size over time following the infinite sites model of mutation. The framework models the necessary core concepts, comes integrated with several data sets of varying size, implements the standard competing proposals, and integrates tightly with our previous framework for calculating exact probabilities. For a given dataset, it computes the likelihood and provides the maximum likelihood estimate of the mutation parameter. Well-known benchmarks in the coalescent literature validate the accuracy of the framework. The framework provides an intuitive user interface with minimal clutter. For performance, the framework switches automatically to modern multicore hardware, if available. It runs on three major platforms (Windows, Mac and Linux). Extensive tests and coverage make the framework reliable and maintainable. Conclusions. In coalescent theory, many studies of computational efficiency

  7. Auscope: Australian Earth Science Information Infrastructure using Free and Open Source Software

    Science.gov (United States)

    Woodcock, R.; Cox, S. J.; Fraser, R.; Wyborn, L. A.

    2013-12-01

    Since 2005 the Australian Government has supported a series of initiatives providing researchers with access to major research facilities and information networks necessary for world-class research. Starting with the National Collaborative Research Infrastructure Strategy (NCRIS) the Australian earth science community established an integrated national geoscience infrastructure system called AuScope. AuScope is now in operation, providing a number of components to assist in understanding the structure and evolution of the Australian continent. These include the acquisition of subsurface imaging, earth composition and age analysis, a virtual drill core library, geological process simulation, and a high resolution geospatial reference framework. To draw together information from across the earth science community in academia, industry and government, AuScope includes a nationally distributed information infrastructure. Free and Open Source Software (FOSS) has been a significant enabler in building the AuScope community and providing a range of interoperable services for accessing data and scientific software. A number of FOSS components have been created, adopted or upgraded to create a coherent, OGC compliant Spatial Information Services Stack (SISS). SISS is now deployed at all Australian Geological Surveys, many Universities and the CSIRO. Comprising a set of OGC catalogue and data services, and augmented with new vocabulary and identifier services, the SISS provides a comprehensive package for organisations to contribute their data to the AuScope network. This packaging and a variety of software testing and documentation activities enabled greater trust and notably reduced barriers to adoption. FOSS selection was important, not only for technical capability and robustness, but also for appropriate licensing and community models to ensure sustainability of the infrastructure in the long term. Government agencies were sensitive to these issues and Au

  8. Managing open innovation projects with science-based and market-based partners

    NARCIS (Netherlands)

    Du, J.; Leten, B.; Vanhaverbeke, W.

    2014-01-01

    This paper examines the relationship between (outside-in) open innovation and the financial performance of R&D projects, drawing on a unique dataset that contains information on the open innovation practices, management and performance of 489 R&D projects of a large European multinational firm. We

  9. Diagnosis method of an open-switch fault for a grid-connected T-type three-level inverter system

    DEFF Research Database (Denmark)

    Choi, U. M.; Lee, K. B.; Blaabjerg, Frede

    2012-01-01

    ...fault-tolerant control algorithm can be used when the open-switch fault occurs in the middle switches. It is achieved by simply modifying the conventional SVM method. The proposed methods are advantageous as they do not require additional sensors and they do not involve complex calculations. Therefore, this method...

  10. Earth System Grid and EGI interoperability

    Science.gov (United States)

    Raciazek, J.; Petitdidier, M.; Gemuend, A.; Schwichtenberg, H.

    2012-04-01

    The Earth Science data centers have developed a data grid called the Earth Science Grid Federation (ESGF) to give the scientific community worldwide access to CMIP5 (Coupled Model Inter-comparison Project 5) climate data. The CMIP5 data will permit evaluation of the impact of climate change in various environmental and societal areas, such as regional climate, extreme events, agriculture, insurance… The ESGF grid provides services like searching, browsing and downloading of datasets. At the security level, ESGF data access is protected by an authentication mechanism. An ESGF-trusted X509 Short-Lived EEC certificate with the correct roles/attributes is required to get access to the data in a non-interactive way (e.g. from a worker node). To access ESGF from EGI (i.e. by earth science applications running on EGI infrastructure), the security incompatibility between the two grids is the challenge: the EGI proxy certificate is not ESGF-trusted, nor does it contain the correct roles/attributes. To solve this problem, we decided to use a Credential Translation Service (CTS) to translate the EGI X509 proxy certificate into the ESGF Short-Lived EEC certificate (the CTS will issue ESGF certificates based on EGI certificate authentication). From the end user perspective, the main steps to use the CTS are: the user binds his two identities (EGI and ESGF) together in the CTS using the CTS web interface (this step has to be done only once) and then requests an ESGF Short-Lived EEC certificate every time it is needed, using a command-line tool. The implementation of the CTS is ongoing. It is based on the open source MyProxy software stack, which is used in many grid infrastructures. On the client side, the "myproxy-logon" command-line tool is used to request the certificate translation. A new option has been added to "myproxy-logon" to select the original certificate (in our case, the EGI one). On the server side, MyProxy server operates in Certificate Authority mode, with a new module

  11. Flexible and Inflexible Energy Engagements – a Study of the Danish Smart Grid Strategy

    DEFF Research Database (Denmark)

    Schick, Lea; Gad, Christopher

    2015-01-01

    According to many visions for smart grids, consumers will come to play a more ‘active’ role in the energy systems of tomorrow. In this paper, we examine how the future ‘flexible electricity consumer’ is imagined in the Danish National Smart Grid Strategy. Our analysis of reports produced by the national Smart Grid Network shows that this vision relies on a techno-centric and rather ‘inflexible’ consumer figuration. However, rather than adopting a conventional social science approach in order to criticize this narrow imaginary, we show that potentials for critique and alternatives can be found internally in the Smart Grid Network. Paying attention to different stories, we thus aim to characterize particular forms of ‘infra-critique’ and ‘infra-reflexivity’ emerging from within the field. This mode of reflexivity, we argue, opens up more flexible and reflexive conceptions of the ‘flexible...

  12. Kids Enjoy Grids

    CERN Multimedia

    2007-01-01

    'I want to come back and work here when I'm older,' was the spontaneous reaction of one of the children invited to CERN by the Enabling Grids for E-sciencE project for a 'Grids for Kids' day at the end of January. The EGEE project is led by CERN, and the EGEE gender action team organized the day to introduce children to grid technology at an early age. The school group included both boys and girls, aged 9 to 11. All of the presenters were women. 'In general, before this visit, the children thought that scientists always wore white coats and were usually male, with wild Einstein-like hair,' said Jackie Beaver, the class's teacher at the Institut International de Lancy, a school near Geneva. 'They were surprised and pleased to see that women became scientists, and that scientists were quite 'normal'.' The half-day event included presentations about why Grids are needed, a visit of the computer centre, some online games, and plenty of time for questions. In the end, everyone agreed that it was a big success a...

  13. Can Clouds replace Grids? Will Clouds replace Grids?

    International Nuclear Information System (INIS)

    Shiers, J D

    2010-01-01

    The world's largest scientific machine - comprising dual 27 km circular proton accelerators cooled to 1.9 K and located some 100 m underground - currently relies on major production Grid infrastructures for the offline computing needs of the 4 main experiments that will take data at this facility. After many years of sometimes difficult preparation the computing service has been declared 'open' and ready to meet the challenges that will come shortly when the machine restarts in 2009. But the service is not without its problems: reliability - as seen by the experiments, as opposed to that measured by the official tools - still needs to be significantly improved. Prolonged downtimes or degradations of major services or even complete sites are still too common and the operational and coordination effort to keep the overall service running is probably not sustainable at this level. Recently 'Cloud Computing' - in terms of pay-per-use fabric provisioning - has emerged as a potentially viable alternative but with rather different strengths and no doubt weaknesses too. Based on the concrete needs of the LHC experiments - where the total data volume that will be acquired over the full lifetime of the project, including the additional data copies that are required by the Computing Models of the experiments, approaches 1 Exabyte - we analyze the pros and cons of Grids versus Clouds. This analysis covers not only technical issues - such as those related to demanding database and data management needs - but also sociological aspects, which cannot be ignored, neither in terms of funding nor in the wider context of the essential but often overlooked role of science in society, education and economy.

  14. Can Clouds replace Grids? Will Clouds replace Grids?

    Energy Technology Data Exchange (ETDEWEB)

    Shiers, J D, E-mail: Jamie.Shiers@cern.c [CERN, 1211 Geneva 23 (Switzerland)

    2010-04-01

    The world's largest scientific machine - comprising dual 27 km circular proton accelerators cooled to 1.9 K and located some 100 m underground - currently relies on major production Grid infrastructures for the offline computing needs of the 4 main experiments that will take data at this facility. After many years of sometimes difficult preparation the computing service has been declared 'open' and ready to meet the challenges that will come shortly when the machine restarts in 2009. But the service is not without its problems: reliability - as seen by the experiments, as opposed to that measured by the official tools - still needs to be significantly improved. Prolonged downtimes or degradations of major services or even complete sites are still too common and the operational and coordination effort to keep the overall service running is probably not sustainable at this level. Recently 'Cloud Computing' - in terms of pay-per-use fabric provisioning - has emerged as a potentially viable alternative but with rather different strengths and no doubt weaknesses too. Based on the concrete needs of the LHC experiments - where the total data volume that will be acquired over the full lifetime of the project, including the additional data copies that are required by the Computing Models of the experiments, approaches 1 Exabyte - we analyze the pros and cons of Grids versus Clouds. This analysis covers not only technical issues - such as those related to demanding database and data management needs - but also sociological aspects, which cannot be ignored, neither in terms of funding nor in the wider context of the essential but often overlooked role of science in society, education and economy.

  15. Can Clouds replace Grids? Will Clouds replace Grids?

    Science.gov (United States)

    Shiers, J. D.

    2010-04-01

    The world's largest scientific machine - comprising dual 27 km circular proton accelerators cooled to 1.9 K and located some 100 m underground - currently relies on major production Grid infrastructures for the offline computing needs of the 4 main experiments that will take data at this facility. After many years of sometimes difficult preparation the computing service has been declared "open" and ready to meet the challenges that will come shortly when the machine restarts in 2009. But the service is not without its problems: reliability - as seen by the experiments, as opposed to that measured by the official tools - still needs to be significantly improved. Prolonged downtimes or degradations of major services or even complete sites are still too common and the operational and coordination effort to keep the overall service running is probably not sustainable at this level. Recently "Cloud Computing" - in terms of pay-per-use fabric provisioning - has emerged as a potentially viable alternative but with rather different strengths and no doubt weaknesses too. Based on the concrete needs of the LHC experiments - where the total data volume that will be acquired over the full lifetime of the project, including the additional data copies that are required by the Computing Models of the experiments, approaches 1 Exabyte - we analyze the pros and cons of Grids versus Clouds. This analysis covers not only technical issues - such as those related to demanding database and data management needs - but also sociological aspects, which cannot be ignored, neither in terms of funding nor in the wider context of the essential but often overlooked role of science in society, education and economy.

  16. Grid pulser

    International Nuclear Information System (INIS)

    Jansweijer, P.P.M.; Es, J.T. van.

    1990-01-01

    This report describes a fast pulse generator. This generator delivers a high-voltage pulse of at most 6000 V with a rise time of less than 50 ns, which corresponds to a slew rate of more than 120,000 volts per μs. The pulse generator is used to control the grid of the injector of the electron accelerator MEA. The capacitance of this grid is about 60 pF. In order to charge this capacitance to 6000 volts in 50 ns, a current of 8 amperes is needed. The maximal pulse length is 50 μs with a repetition frequency of 500 Hz. During this 50 μs the stability of the pulse amplitude is better than 0.1%. (author). 20 figs
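    The figures quoted in this record can be checked with the basic capacitor relations I = C·dV/dt and slew rate = dV/dt. A minimal back-of-envelope sketch, using only the values stated in the abstract:

    ```python
    # Back-of-envelope check of the grid-pulser figures from the abstract:
    # grid capacitance ~60 pF, pulse amplitude 6000 V, rise time 50 ns.

    C = 60e-12   # grid capacitance in farads (60 pF)
    V = 6000.0   # pulse amplitude in volts
    t = 50e-9    # rise time in seconds (50 ns)

    # Average charging current I = C * dV/dt
    I = C * V / t                 # -> 7.2 A, consistent with the ~8 A quoted
    # Slew rate expressed in volts per microsecond
    slew_rate = (V / t) * 1e-6    # -> 120000 V/us, i.e. "more than 120,000 V/us"

    print(f"charging current = {I:.1f} A")
    print(f"slew rate = {slew_rate:.0f} V/us")
    ```

    The computed 7.2 A is the ideal average current; the 8 A quoted in the report presumably includes a design margin.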

  17. The grid

    OpenAIRE

    Morrad, Annie; McArthur, Ian

    2018-01-01

    Project Anywhere. Project title: The Grid. Artists: Annie Morrad, Artist/Senior Lecturer, University of Lincoln, School of Film and Media, Lincoln, UK; Dr Ian McArthur, Hybrid Practitioner/Senior Lecturer, UNSW Art & Design, UNSW Australia, Sydney, Australia. Annie Morrad is a London-based artist and musician and senior lecturer at the University of Lincoln, UK. Dr Ian McArthur is a Sydney-based hybrid practitione...

  18. The effects of blogs versus dialogue journals on open-response writing scores and attitudes of grade eight science students

    Science.gov (United States)

    Erickson, Diane K.

    Today's students have grown up surrounded by technology. They use cell phones, word processors, and the Internet with ease, talking with peers in their community and around the world through e-mails, chatrooms, instant messaging, online discussions, and weblogs ("blogs"). In the midst of this technological explosion, adolescents face a growing need for strong literacy skills in all subject areas for achievement in school and on mandated state and national high stakes tests. The purpose of this study was to examine the use of blogs as a tool for improving open-response writing in the secondary science classroom in comparison to the use of handwritten dialogue journals. The study used a mixed-method approach, gathering both quantitative and qualitative data from 94 students in four eighth-grade science classes. Two classes participated in online class blogs where they posted ideas about science and responded to the ideas of other classmates. Two classes participated in handwritten dialogue journals, writing ideas about science and exchanging journals to respond to the ideas of classmates. The study explored these research questions: Does the use of blogs, as compared to the use of handwritten dialogue journals, improve the open-response writing scores of eighth grade science students? How do students describe their experience using blogs to study science as compared to students using handwritten dialogue journals? and How do motivation, self-efficacy, and community manifest themselves in students who use blogs as compared to students who use handwritten dialogue journals? The quantitative aspect of the study used data from pre- and post-tests and from a Likert-scale post-survey. The pre- and post-writing on open-response science questions were scored using the Massachusetts Comprehensive Assessment System (MCAS) open-response scoring rubric. The study found no statistically significant difference in the writing scores between the blog group and the dialogue journal

  19. Download - Open TG-GATEs | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available

  20. Anatomy of BioJS, an open source community for the life sciences.

    Science.gov (United States)

    Yachdav, Guy; Goldberg, Tatyana; Wilzbach, Sebastian; Dao, David; Shih, Iris; Choudhary, Saket; Crouch, Steve; Franz, Max; García, Alexander; García, Leyla J; Grüning, Björn A; Inupakutika, Devasena; Sillitoe, Ian; Thanki, Anil S; Vieira, Bruno; Villaveces, José M; Schneider, Maria V; Lewis, Suzanna; Pettifer, Steve; Rost, Burkhard; Corpas, Manuel

    2015-07-08

    BioJS is an open source software project that develops visualization tools for different types of biological data. Here we report on the factors that influenced the growth of the BioJS user and developer community, and outline our strategy for building on this growth. The lessons we have learned on BioJS may also be relevant to other open source software projects.

  1. Medgrid: a major industrial initiative for Mediterranean power grids - Opening new electricity paths for a sustainable energy development of the Union for the Mediterranean area countries

    International Nuclear Information System (INIS)

    Merlin, Andre; Pouliquen, Herve

    2011-01-01

    The development of new electricity networks in the Mediterranean region has given rise to a lot of new hope. Electricity is truly at the heart of economic and social development, but also represents an appropriate solution for improving the energy efficiency of many end use applications and for increasing the environmental protection where it is produced from renewable forms of energy. Launched under the framework of Union for the Mediterranean, a key project of the Mediterranean Solar Plan, Medgrid was set up at the instigation of the French government; it now gathers twenty industrial shareholders from different economic sectors: power generation, transmission, distribution and supply, financing and the sustainable development service industry, who are joining together to study the feasibility of a large transmission grid between the south and north rims of the Mediterranean Sea. Medgrid forms part of a new dynamic - set in motion over ten years ago - in Euro-Mediterranean relations, in environmental initiatives and in energy infrastructure development policies. It is important to be reminded of the main features of these initiatives and of their results in order to fully understand the huge diversity of the context in which Medgrid is going to operate, be it in the political, institutional, economic, industrial and technological spheres. (authors)

  2. Gridded ionization chamber

    International Nuclear Information System (INIS)

    Houston, J.M.

    1977-01-01

    An improved ionization chamber type x-ray detector comprises a heavy gas at high pressure disposed between an anode and a cathode. An open grid structure is disposed adjacent the anode and is maintained at a voltage intermediate between the cathode and anode potentials. The electric field which is produced by positive ions drifting toward the cathode is thus shielded from the anode. Current measuring circuits connected to the anode are, therefore, responsive only to electron current flow within the chamber and the recovery time of the chamber is shortened. The grid structure also serves to shield the anode from electrical currents which might otherwise be induced by mechanical vibrations in the ionization chamber structure

  3. Grid-Enabled Measures

    Science.gov (United States)

    Moser, Richard P.; Hesse, Bradford W.; Shaikh, Abdul R.; Courtney, Paul; Morgan, Glen; Augustson, Erik; Kobrin, Sarah; Levin, Kerry; Helba, Cynthia; Garner, David; Dunn, Marsha; Coa, Kisha

    2011-01-01

    Scientists are taking advantage of the Internet and collaborative web technology to accelerate discovery in a massively connected, participative environment, a phenomenon referred to by some as Science 2.0. As a new way of doing science, this phenomenon has the potential to push science forward in a more efficient manner than was previously possible. The Grid-Enabled Measures (GEM) database has been conceptualized as an instantiation of Science 2.0 principles by the National Cancer Institute with two overarching goals: (1) Promote the use of standardized measures, which are tied to theoretically based constructs; and (2) Facilitate the ability to share harmonized data resulting from the use of standardized measures. This is done by creating an online venue connected to the Cancer Biomedical Informatics Grid (caBIG®) where a virtual community of researchers can collaborate together and come to consensus on measures by rating, commenting and viewing meta-data about the measures and associated constructs. This paper will describe the web 2.0 principles on which the GEM database is based, describe its functionality, and discuss some of the important issues involved with creating the GEM database, such as the role of mutually agreed-on ontologies (i.e., knowledge categories and the relationships among these categories) for data sharing. PMID:21521586

  4. A grid-enabled web service for low-resolution crystal structure refinement.

    Science.gov (United States)

    O'Donovan, Daniel J; Stokes-Rees, Ian; Nam, Yunsun; Blacklow, Stephen C; Schröder, Gunnar F; Brunger, Axel T; Sliz, Piotr

    2012-03-01

    Deformable elastic network (DEN) restraints have proved to be a powerful tool for refining structures from low-resolution X-ray crystallographic data sets. Unfortunately, optimal refinement using DEN restraints requires extensive calculations and is often hindered by a lack of access to sufficient computational resources. The DEN web service presented here intends to provide structural biologists with access to resources for running computationally intensive DEN refinements in parallel on the Open Science Grid, the US cyberinfrastructure. Access to the grid is provided through a simple and intuitive web interface integrated into the SBGrid Science Portal. Using this portal, refinements combined with full parameter optimization that would take many thousands of hours on standard computational resources can now be completed in several hours. An example of the successful application of DEN restraints to the human Notch1 transcriptional complex using the grid resource, and summaries of all submitted refinements, are presented as justification.
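    The speed-up described here comes from the fact that DEN refinements with different parameter combinations are independent jobs that can run concurrently across grid resources. A purely illustrative local sketch of that pattern, where the parameter names (gamma, DEN weight) follow the DEN literature but the scoring function is a hypothetical stand-in for a real refinement run:

    ```python
    # Illustrative sketch only: the actual DEN service is accessed via the SBGrid
    # Science Portal and runs on the Open Science Grid. This mimics the parallel
    # parameter-optimization idea locally; run_refinement() is a hypothetical
    # placeholder for one independent refinement job returning a quality score
    # (e.g. a free R-factor).
    from itertools import product
    from multiprocessing import Pool

    def run_refinement(params):
        gamma, weight = params
        # Stand-in score with a minimum at gamma=0.5, weight=10.0; a real job
        # would launch a DEN refinement and report its free R-factor.
        score = (gamma - 0.5) ** 2 + (weight - 10.0) ** 2 / 100.0
        return params, score

    if __name__ == "__main__":
        # Full grid of parameter combinations, each an independent job.
        grid = list(product([0.0, 0.25, 0.5, 0.75, 1.0], [3.0, 10.0, 30.0]))
        with Pool() as pool:
            results = pool.map(run_refinement, grid)
        best_params, best_score = min(results, key=lambda r: r[1])
        print("best parameters:", best_params)
    ```

    On a grid, each `run_refinement` call would instead be a separately scheduled job, which is why thousands of CPU-hours of sweep can complete in a few wall-clock hours.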

  5. Open Content in Open Context

    Science.gov (United States)

    Kansa, Sarah Whitcher; Kansa, Eric C.

    2007-01-01

    This article presents the challenges and rewards of sharing research content through a discussion of Open Context, a new open access data publication system for field sciences and museum collections. Open Context is the first data repository of its kind, allowing self-publication of research data, community commentary through tagging, and clear…

  6. An open data repository and a data processing software toolset of an equivalent Nordic grid model matched to historical electricity market data

    Directory of Open Access Journals (Sweden)

    Luigi Vanfretti

    2017-04-01

    This Nordic 44 equivalent model was also used in the iTesla project (iTesla [3]) to carry out simulations within a dynamic security assessment toolset (iTesla, 2016 [4]), and has been further enhanced during the ITEA3 OpenCPS project (iTEA3 [5]). The raw data, processed data and output models utilized within the iTesla platform (iTesla, 2016 [4]) are also available in the repository. The CIM and Modelica snapshots of the “Nordic 44” model for the year 2015 are available in a Zenodo repository.

  7. Automatic Nephelometer of an Open Type for Atmospheric Science Research and Practical Applications

    International Nuclear Information System (INIS)

    Razenkov, I. A.; Rostov, A. P.; Park, W. K.; Burkov, V. V.

    1997-01-01

    An intelligent open-type nephelometer designed by the authors for in situ studies of the atmosphere is described. The nephelometer operates in the near-IR wavelength range. The construction and concept of the device allow it either to work independently for several hours or to be operated remotely at a distance of up to 500 m from the central host computer. The results of a two-week test at Kumkang Hu-Tech Co., together with their analysis, are presented. This nephelometer belongs to a new class of intelligent instruments intended for long-term open-air operation, providing qualitative and quantitative information about the scattering coefficient in situ.

  8. Database Description - Open TG-GATEs Pathological Image Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Database name: Open TG-GATEs Pathological Image Database. Alternative name: -. DOI: 10.18908/lsdba.nbdc00954-0... Contact: ...iomedical Innovation, 7-6-8 Saito-asagi, Ibaraki-city, Osaka 567-0085, Japan. TEL: 81-72-641-9826. Database classification: Toxicogenomics. Organism taxonomy name: Rattus norvegi...

  9. An Opening to Society: Chances and Prospects of Science, Research and Services.

    Science.gov (United States)

    Heckelmann, Dieter

    1989-01-01

    Describes how the Free University of Berlin (West Germany) has become more open to the public. The Free University has strengthened its cooperation with private companies in research and funding, promoted internal research, and increased its accelerated and continuing education programs in order to improve its image. (LS)

  10. NCI Core Open House Shines Spotlight on Supportive Science and Basic Research | Poster

    Science.gov (United States)

    The lobby of Building 549 at NCI at Frederick bustled with activity for two hours on Tuesday, May 1, as several dozen scientists and staff gathered for the NCI Core Open House. The event aimed to encourage discussion and educate visitors about the capabilities of the cores, laboratories, and facilities that offer support to NCI’s Center for Cancer Research.

  11. Open Science Meets Stem Cells: A New Drug Discovery Approach for Neurodegenerative Disorders

    Directory of Open Access Journals (Sweden)

    Chanshuai Han

    2018-02-01

    Full Text Available Neurodegenerative diseases are a challenge for drug discovery, as the biological mechanisms are complex and poorly understood, with a paucity of models that faithfully recapitulate these disorders. Recent advances in stem cell technology have provided a paradigm shift, providing researchers with tools to generate human induced pluripotent stem cells (iPSCs from patient cells. With the potential to generate any human cell type, we can now generate human neurons and develop “first-of-their-kind” disease-relevant assays for small molecule screening. Now that the tools are in place, it is imperative that we accelerate discoveries from the bench to the clinic. Using traditional closed-door research systems raises barriers to discovery, by restricting access to cells, data and other research findings. Thus, a new strategy is required, and the Montreal Neurological Institute (MNI and its partners are piloting an “Open Science” model. One signature initiative will be that the MNI biorepository will curate and disseminate patient samples in a more accessible manner through open transfer agreements. This feeds into the MNI open drug discovery platform, focused on developing industry-standard assays with iPSC-derived neurons. All cell lines, reagents and assay findings developed in this open fashion will be made available to academia and industry. By removing the obstacles many universities and companies face in distributing patient samples and assay results, our goal is to accelerate translational medical research and the development of new therapies for devastating neurodegenerative disorders.

  12. E-GRANTHALAYA: LIBRARY INFORMATION SCIENCE OPEN SOURCE AUTOMATION SOFTWARE: AN OVERVIEW

    OpenAIRE

    Umaiyorubagam, R.; JohnAnish, R; Jeyapragash, B

    2015-01-01

    The paper describes free library software available online. The open source software falls into three categories: library automation software, digital library software, and integrated library packages. The paper discusses these aspects in detail.

  13. Promoting Continuing Computer Science Education through a Massively Open Online Course

    Science.gov (United States)

    Oliver, Kevin

    2016-01-01

    This paper presents the results of a comparison study between graduate students taking a software security course at an American university and international working professionals taking a version of the same course online through a free massive open online course (MOOC) created in the Google CourseBuilder learning environment. A goal of the study…

  14. Mixing vane grid spacer

    International Nuclear Information System (INIS)

    Patterson, J.F.; Galbraith, K.P.

    1978-01-01

    An improved mixing vane grid spacer with enhanced flow mixing capability is described. The mixing vanes are positioned at the welded intersecting joints of the spacer, and each vane has an opening or window formed substantially directly over the welded joint to provide improved flow mixing capability. Some of the vanes are slotted, depending on their particular location in the spacer. The intersecting joints are welded by initially providing consumable tabs at and within each window, which are consumed during the welding of the spacer joints

  15. To have your citizen science cake and eat it? Delivering research and outreach through Open Air Laboratories (OPAL).

    Science.gov (United States)

    Lakeman-Fraser, Poppy; Gosling, Laura; Moffat, Andy J; West, Sarah E; Fradera, Roger; Davies, Linda; Ayamba, Maxwell A; van der Wal, René

    2016-07-22

    The vast array of citizen science projects which have blossomed over the last decade span a spectrum of objectives from research to outreach. While some focus primarily on the collection of rigorous scientific data and others are positioned towards the public engagement end of the gradient, the majority of initiatives attempt to balance the two. Although meeting multiple aims can be seen as a 'win-win' situation, it can also yield significant challenges as allocating resources to one element means that they may be diverted away from the other. Here we analyse one such programme which set out to find an effective equilibrium between these arguably polarised goals. Through the lens of the Open Air Laboratories (OPAL) programme we explore the inherent trade-offs encountered under four indicators derived from an independent citizen science evaluation framework. Assimilating experience from the OPAL network we investigate practical approaches taken to tackle arising tensions. Working backwards from project delivery to design, we found the following elements to be important: ensuring outputs are fit for purpose, developing strong internal and external collaborations, building a sufficiently diverse partnership and considering target audiences. We combine these 'operational indicators' with four pre-existing 'outcome indicators' to create a model which can be used to shape the planning and delivery of a citizen science project. Our findings suggest that whether the proverb in the title rings true will largely depend on the identification of challenges along the way and the ability to address these conflicts throughout the citizen science project.

  16. Analysis of the World Experience of Smart Grid Deployment: Economic Effectiveness Issues

    Science.gov (United States)

    Ratner, S. V.; Nizhegorodtsev, R. M.

    2018-06-01

    Despite the positive dynamics in the growth of RES-based power production in electric power systems of many countries, the further development of commercially mature technologies of wind and solar generation is often constrained by the existing grid infrastructure and conventional energy supply practices. The integration of large wind and solar power plants into a single power grid and the development of microgeneration require the widespread introduction of a new smart grid technology cluster (smart power grids), whose technical advantages over the conventional ones have been fairly well studied, while issues of their economic effectiveness remain open. Estimation and forecasting potential economic effects from the introduction of innovative technologies in the power sector during the stage preceding commercial development is a methodologically difficult task that requires the use of knowledge from different sciences. This paper contains the analysis of smart grid project implementation in Europe and the United States. Interval estimates are obtained for their basic economic parameters. It was revealed that the majority of smart grid implemented projects are not yet commercially effective, since their positive externalities are usually not recognized on the revenue side due to the lack of universal methods for public benefits monetization. The results of the research can be used in modernization and development planning for the existing grid infrastructure both at the federal level and at the level of certain regions and territories.

  17. Roadmap for the ARC Grid Middleware

    DEFF Research Database (Denmark)

    Kleist, Josva; Eerola, Paula; Ekelöf, Tord

    2006-01-01

    The Advanced Resource Connector (ARC) or the NorduGrid middleware is an open source software solution enabling production quality computational and data Grids, with special emphasis on scalability, stability, reliability and performance. Since its first release in May 2002, the middleware is depl...

  18. Opening the Big Black Box: European study reveals visitors' impressions of science laboratories

    CERN Multimedia

    2004-01-01

    "On 29 - 30 March the findings of 'Inside the Big Black Box'- a Europe-wide science and society project - will be revealed during a two-day seminar hosted by CERN*. The principal aim of Inside the Big Black Box (IN3B) is to determine whether a working scientific laboratory can capture the curiosity of the general public through visits" (1 page)

  19. SMS: a linked open data infrastructure for science and innovation studies

    Energy Technology Data Exchange (ETDEWEB)

    Van den Besselaar, P.; Khalili, A.; Idrissou, A.; Loizou, A.; Schlobach, S.; Van Harmelen, F.

    2016-07-01

    In this paper we describe a data integration infrastructure for Science Technology and Innovation (STI) studies developed within the context of the RISIS project. We outline its architecture and functionalities. In the full paper, we will show the use of the infrastructure in a complex research project. At the conference we will give a demonstration. (Author)

  20. Teaching Botanical Identification to Adults: Experiences of the UK Participatory Science Project "Open Air Laboratories"

    Science.gov (United States)

    Stagg, Bethan C.; Donkin, Maria

    2013-01-01

    Taxonomic education and botany are increasingly neglected in schools and universities, leading to a "missed generation" of adults that cannot identify organisms, especially plants. This study pilots three methods for teaching identification of native plant species to forty-three adults engaged in the participatory science project…

  1. Opening the Classroom Door: Professional Learning Communities in the Math and Science Partnership Program

    Science.gov (United States)

    Hamos, James E.; Bergin, Kathleen B.; Maki, Daniel P.; Perez, Lance C.; Prival, Joan T.; Rainey, Daphne Y.; Rowell, Ginger H.; VanderPutten, Elizabeth

    2009-01-01

    This article looks at how professional learning communities (PLCs) have become an operational approach for professional development with potential to de-isolate the teaching experience in the fields of science, technology, engineering, and mathematics (STEM). The authors offer a short synopsis of the intellectual origins of PLCs, provide multiple…

  2. 76 FR 4645 - Fusion Energy Sciences Advisory Committee; Notice of Open Meeting

    Science.gov (United States)

    2011-01-26

    ..., 9 a.m. to 5 p.m.; Tuesday, March 8, 2011, 8:30 a.m. to 12 p.m. ADDRESSES: Doubletree Bethesda Hotel... year (FY) 2012 budget submission to Congress and to conduct other committee business. Tentative Agenda Items: Office of Science FY 2012 Congressional Budget Request FES Program FY 2012 Congressional Budget...

  3. 77 FR 15996 - Science Advisory Board (SAB); Notice of Open Meeting

    Science.gov (United States)

    2012-03-19

    ... limited to a total time of five (5) minutes. Individuals or groups planning to make a verbal presentation... Science Advisory Board (SAB) was established by a Decision Memorandum dated September 25, 1997, and is the... resource management. Time and Date: The meeting will be held Thursday, April 5, 2012 from 9 a.m. to 5:15 p...

  4. 76 FR 10888 - Science Advisory Board (SAB); Notice of Open Meeting

    Science.gov (United States)

    2011-02-28

    ... general, each individual or group making a verbal presentation will be limited to a total time of five (5... meeting. SUMMARY: The Science Advisory Board (SAB) was established by a Decision Memorandum dated... quality and provide optimal support to resource management. TIME AND DATE: The meeting will be held...

  5. Using a Web Site in an Elementary Science Methods Class: Are We Opening a Pandora's Box?

    Science.gov (United States)

    Lewis, Scott P.; O'Brien, George E.

    This paper describes the introduction and use of the World Wide Web (WWW) in an elementary science methods course at Florida International University (FIU). The goals of creating a web site include engaging conversations among educators, providing access to local resources for students, and examining student use of web sites and the Internet. The…

  6. Google Classroom and Open Clusters: An Authentic Science Research Project for High School Students

    Science.gov (United States)

    Johnson, Chelen H.; Linahan, Marcella; Cuba, Allison Frances; Dickmann, Samantha Rose; Hogan, Eleanor B.; Karos, Demetra N.; Kozikowski, Kendall G.; Kozikowski, Lauren Paige; Nelson, Samantha Brooks; O'Hara, Kevin Thomas; Ropinski, Brandi Lucia; Scarpa, Gabriella; Garmany, Catharine D.

    2016-01-01

    STEM education is about offering unique opportunities to our students. For the past three years, students from two high schools (Breck School in Minneapolis, MN, and Carmel Catholic High School in Mundelein, IL) have collaborated on authentic astronomy research projects. This past year they surveyed archival data of open clusters to determine if a clear turnoff point could be unequivocally determined. Age and distance to each open cluster were calculated. Additionally, students requested time on several telescopes to obtain original data to compare to the archival data. Students from each school worked in collaborative teams, sharing and verifying results through regular online hangouts and chats. Work papers were stored in a shared drive and on a student-designed Google site to facilitate dissemination of documents between the two schools.

  7. Cultures of Science and Technology in the Trading Zone: Biodiversity and Open Source Development

    OpenAIRE

    Heaton , Lorna; Dias da Silva , Patrícia

    2016-01-01

    International audience; This paper explores the work of building open source biodiversity information infrastructure. We analyse collaboration between a Canadian team and a Brazilian one. In particular we focus on the use of WingLongitude, a GitHub space, as a trading zone within which the two teams co-developed solutions. We show how the choice to work in a neutral space, belonging to everyone, and the use of display, representation and assemblage practices enabled sharing of some infrastruc...

  8. Surveying the citizen science landscape: an exploration of the design, delivery and impact of citizen science through the lens of the Open Air Laboratories (OPAL) programme.

    Science.gov (United States)

    Davies, Linda; Fradera, Roger; Riesch, Hauke; Lakeman-Fraser, Poppy

    2016-07-22

    This paper provides a short introduction to the topic of citizen science (CS) identifying the shift from the knowledge deficit model to more inclusive, participatory science. It acknowledges the benefits of new technology and the opportunities it brings for mass participation and data manipulation. It focuses on the increase in interest in CS in recent years and draws on experience gained from the Open Air Laboratories (OPAL) programme launched in England in 2007. The drivers and objectives for OPAL are presented together with background information on the partnership, methods and scales. The approaches used by researchers ranged from direct public participation in mass data collection through field surveys to research with minimal public engagement. The supporting services focused on education, particularly to support participants new to science, a media strategy and data services. Examples from OPAL are used to illustrate the different approaches to the design and delivery of CS that have emerged over recent years and the breadth of opportunities for public participation the current landscape provides. Qualitative and quantitative data from OPAL are used as evidence of the impact of CS. While OPAL was conceived ahead of the more recent formalisation of approaches to the design, delivery and analysis of CS projects and their impact, it nevertheless provides a range of examples against which to assess the various benefits and challenges emerging in this fast developing field.

  9. PIV and PTV measurements in hydro-sciences with focus on turbulent open-channel flows

    OpenAIRE

    Nezu, Iehisa; Sanjou, Michio

    2011-01-01

    PIV is one of the most popular measurement techniques in hydraulic engineering as well as in fluid sciences. It has been applied to study various turbulent phenomena in laboratory experiments related to natural rivers, e.g., bursting phenomena near the bed, mixing layers observed at confluences, wake turbulence around dikes and piers, and so on. In these studies, PIV plays important roles in revealing the space-time structure of velocity fluctuations and coherent vortices. This review article...

  10. Power Grid:Connecting the world

    Institute of Scientific and Technical Information of China (English)

    Liu Liang; Zhu Li

    2012-01-01

With the acceleration of global economic integration, the trend has been towards opening up markets. Large enterprises in countries around the world, the developed countries in particular, attach great importance to going abroad, with the aim to optimally allocate energy resources in a wider range. Actively responding to the "going out" strategy of the State, the two giant power grid enterprises in China, State Grid Corporation of China (SGCC) and China Southern Power Grid (CSG), have made plans for grid development in line with the State's energy strategy and global resources allocation.

  11. Big data, open science and the brain: lessons learned from genomics

    Directory of Open Access Journals (Sweden)

    Suparna eChoudhury

    2014-05-01

Full Text Available The BRAIN Initiative aims to break new ground in the scale and speed of data collection in neuroscience, requiring tools to handle data on the order of yottabytes (10^24 bytes). The scale, investment and organization of it are being compared to the Human Genome Project (HGP), which has exemplified ‘big science’ for biology. In line with the trend towards Big Data in genomic research, the promise of the BRAIN Initiative, as well as the European Human Brain Project, rests on the possibility of amassing vast quantities of data to model the complex interactions between the brain and behaviour and to inform the diagnosis and prevention of neurological disorders and psychiatric disease. Advocates of this ‘data-driven’ paradigm in neuroscience argue that harnessing the large quantities of data generated across laboratories worldwide has numerous methodological, ethical and economic advantages, but it requires the neuroscience community to adopt a culture of data sharing and open access to benefit from them. In this article, we examine the rationale for data sharing among advocates and briefly exemplify these in terms of new ‘open neuroscience’ projects. Then, drawing on the frequently invoked model of data sharing in genomics, we go on to demonstrate the complexities of data sharing, shedding light on the sociological and ethical challenges within the realms of institutions, researchers and participants, namely dilemmas around public/private interests in data, (lack of) motivation to share in the academic community, and potential loss of participant anonymity. Our paper serves to highlight some foreseeable tensions around data sharing relevant to the emergent ‘open neuroscience’ movement.

  12. License - Open TG-GATEs | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

Full Text Available Open TG-GATEs License. License to Use This Database. Last updated: 2012/05/24. You may use this database as described below. The Standard License specifies the license terms regarding the use of this database and the requirements you must follow in using this database. The Additional License specifies … Standard License: The Standard License for this database is the license specified in the Creative Commons Attribution-Share Alike 2.1 Japan. If you use data from this database, plea…

  13. An open annotation ontology for science on web 3.0.

    Science.gov (United States)

    Ciccarese, Paolo; Ocana, Marco; Garcia Castro, Leyla Jael; Das, Sudeshna; Clark, Tim

    2011-05-17

There is currently a gap between the rich and expressive collection of published biomedical ontologies, and the natural language expression of biomedical papers consumed on a daily basis by scientific researchers. The purpose of this paper is to provide an open, shareable structure for dynamic integration of biomedical domain ontologies with the scientific document, in the form of an Annotation Ontology (AO), thus closing this gap and enabling application of formal biomedical ontologies directly to the literature as it emerges. Initial requirements for AO were elicited by analysis of integration needs between biomedical web communities, and of needs for representing and integrating results of biomedical text mining. Analysis of strengths and weaknesses of previous efforts in this area was also performed. A series of increasingly refined annotation tools were then developed along with a metadata model in OWL, and deployed to users at a major pharmaceutical company and a major academic center for feedback and additional requirements. Further requirements and critiques of the model were also elicited through discussions with many colleagues and incorporated into the work. This paper presents Annotation Ontology (AO), an open ontology in OWL-DL for annotating scientific documents on the web. AO supports both human and algorithmic content annotation. It enables "stand-off" or independent metadata anchored to specific positions in a web document by any one of several methods. In AO, the document may be annotated but is not required to be under update control of the annotator. AO contains a provenance model to support versioning, and a set model for specifying groups and containers of annotation. AO is freely available under open source license at http://purl.org/ao/, and extensive documentation including screencasts is available on AO's Google Code page: http://code.google.com/p/annotation-ontology/.
The Annotation Ontology meets critical requirements for…
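The "stand-off" idea described in this record, metadata anchored to positions in a document that is itself never modified, can be sketched independently of AO's actual OWL vocabulary. The class and field names below are illustrative inventions, not AO terms.

```python
from dataclasses import dataclass

@dataclass
class StandoffAnnotation:
    """Annotation stored separately from the document it targets:
    the document text is only referenced by position, never edited."""
    doc_uri: str    # the annotated web document
    start: int      # character offset where the anchored span begins
    end: int        # character offset where the anchored span ends
    term_uri: str   # ontology term applied to the anchored span
    annotator: str  # human or algorithmic agent (provenance)

doc = "BRCA1 mutations increase breast cancer risk."
ann = StandoffAnnotation(
    doc_uri="http://example.org/paper123",        # hypothetical URI
    start=0, end=5,
    term_uri="http://example.org/gene/BRCA1",     # hypothetical term
    annotator="curator:alice",
)

# The anchor resolves against the unmodified document text:
print(doc[ann.start:ann.end])  # -> BRCA1
```

Because the annotation lives outside the document, anyone can annotate a web page they do not control, which is the property the abstract emphasizes.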

  14. Comparative analysis of existing models for power-grid synchronization

    International Nuclear Information System (INIS)

    Nishikawa, Takashi; Motter, Adilson E

    2015-01-01

    The dynamics of power-grid networks is becoming an increasingly active area of research within the physics and network science communities. The results from such studies are typically insightful and illustrative, but are often based on simplifying assumptions that can be either difficult to assess or not fully justified for realistic applications. Here we perform a comprehensive comparative analysis of three leading models recently used to study synchronization dynamics in power-grid networks—a fundamental problem of practical significance given that frequency synchronization of all power generators in the same interconnection is a necessary condition for a power grid to operate. We show that each of these models can be derived from first principles within a common framework based on the classical model of a generator, thereby clarifying all assumptions involved. This framework allows us to view power grids as complex networks of coupled second-order phase oscillators with both forcing and damping terms. Using simple illustrative examples, test systems, and real power-grid datasets, we study the inherent frequencies of the oscillators as well as their coupling structure, comparing across the different models. We demonstrate, in particular, that if the network structure is not homogeneous, generators with identical parameters need to be modeled as non-identical oscillators in general. We also discuss an approach to estimate the required (dynamical) system parameters that are unavailable in typical power-grid datasets, their use for computing the constants of each of the three models, and an open-source MATLAB toolbox that we provide for these computations. (paper)
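The common framework named above, power grids as networks of coupled second-order phase oscillators with forcing and damping terms, can be sketched numerically. This is the generic swing-equation form, not any one of the paper's three specific models, and all network parameters below are illustrative.

```python
import numpy as np

def swing_step(theta, omega, P, D, K, A, dt):
    """One Euler step of coupled second-order phase oscillators:
         dtheta_i/dt = omega_i
         domega_i/dt = P_i - D_i*omega_i - K*sum_j A_ij*sin(theta_i - theta_j)
    i.e. forcing P, damping D, and sinusoidal network coupling."""
    coupling = K * (A * np.sin(theta[:, None] - theta[None, :])).sum(axis=1)
    return theta + dt * omega, omega + dt * (P - D * omega - coupling)

# Illustrative 3-node network: one generator (P > 0) feeding two loads (P < 0).
A = np.array([[0., 1., 1.],
              [1., 0., 1.],
              [1., 1., 0.]])            # symmetric coupling structure
P = np.array([0.2, -0.1, -0.1])        # net power injections, summing to zero
D = np.full(3, 0.5)                    # damping coefficients
theta = np.zeros(3)                    # phase angles
omega = np.array([0.1, -0.05, -0.05])  # initial frequency deviations

for _ in range(20000):                 # integrate to t = 200
    theta, omega = swing_step(theta, omega, P, D, K=1.0, A=A, dt=0.01)

# With balanced injections and damping, the frequency deviations decay
# toward a common synchronous value (here zero), i.e. the grid synchronizes.
print(np.ptp(omega))
```

Summing the oscillator equations shows why: the antisymmetric coupling terms cancel, so the mean frequency relaxes at the rate set by the damping, while the coupling pulls the individual frequencies together.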

  15. Off Grid

    OpenAIRE

Watson, V.A.; Pandya, S.; Weng, J.; de Alberto Castro; Justine; Dimitriou, E.; Tengtrirat, N.; Mansoor, Z.; Yurtbulmus, Y.; Montina, J.; Klak, N.; Vasilev, Martin; de Castro, A.

    2015-01-01

Doctor Watson Architects joined with Samir Pandya and a select group of architecture students to work with line, colour and thread as a means of achieving a psycho-dynamic suspension of matter, form and space in the newly opened studios on the 4th & 5th floors of the University of Westminster's Marylebone Block.

  16. CMS Monte Carlo production in the WLCG computing grid

    International Nuclear Information System (INIS)

    Hernandez, J M; Kreuzer, P; Hof, C; Khomitch, A; Mohapatra, A; Filippis, N D; Pompili, A; My, S; Abbrescia, M; Maggi, G; Donvito, G; Weirdt, S D; Maes, J; Mulders, P v; Villella, I; Wakefield, S; Guan, W; Fanfani, A; Evans, D; Flossdorf, A

    2008-01-01

Monte Carlo production in CMS has received a major boost in performance and scale since the past CHEP06 conference. The production system has been re-engineered in order to incorporate the experience gained in running the previous system and to integrate production with the new CMS event data model, data management system and data processing framework. The system is interfaced to the two major computing Grids used by CMS, the LHC Computing Grid (LCG) and the Open Science Grid (OSG). Operational experience and integration aspects of the new CMS Monte Carlo production system are presented together with an analysis of production statistics. The new system automatically handles job submission, resource monitoring, job queuing, job distribution according to the available resources, data merging, registration of data into the data bookkeeping, data location, data transfer and placement systems. Compared to the previous production system, automation, reliability and performance have been considerably improved. A more efficient use of computing resources and a better handling of the inherent Grid unreliability have resulted in an increase of production scale by about an order of magnitude, capable of running in parallel on the order of ten thousand jobs and yielding more than two million events per day

  17. Impact of Considering 110 kV Grid Structures on the Congestion Management in the German Transmission Grid

    Science.gov (United States)

    Hoffrichter, André; Barrios, Hans; Massmann, Janek; Venkataramanachar, Bhavasagar; Schnettler, Armin

    2018-02-01

    The structural changes in the European energy system lead to an increase of renewable energy sources that are primarily connected to the distribution grid. Hence the stationary analysis of the transmission grid and the regionalization of generation capacities are strongly influenced by subordinate grid structures. To quantify the impact on the congestion management in the German transmission grid, a 110 kV grid model is derived using publicly available data delivered by Open Street Map and integrated into an existing model of the European transmission grid. Power flow and redispatch simulations are performed for three different regionalization methods and grid configurations. The results show a significant impact of the 110 kV system and prove an overestimation of power flows in the transmission grid when neglecting subordinate grids. Thus, the redispatch volume in Germany to dissolve bottlenecks in case of N-1 contingencies decreases by 38 % when considering the 110 kV grid.

  18. Establishment of the Slovenian Universities' Repositories and of the National Open Science Portal

    Directory of Open Access Journals (Sweden)

    Milan Ojsteršek

    2014-12-01

Full Text Available The paper presents the legal, organisational and technical perspectives regarding the implementation of the Slovenian national open access infrastructure for electronic theses and dissertations as well as for research publications. The infrastructure consists of four institutional repositories and a national portal that aggregates content from the university repositories and other Slovenian archives in order to provide a common search engine, recommendation of similar publications, and similar text detection. We have developed the software, which is integrated with the universities' information and authentication systems and with COBISS.SI. During the project the necessary legal background was defined, and processes for mandatory submission of electronic theses and dissertations as well as of research publications were designed. The processes for data exchange between the institutional repositories and the national portal, and the processes for similar text detection and the recommendation system, were established. Bilingual web and mobile applications, a recommendation system and an interface suitable for persons with disabilities are provided to users from around the world. The repositories are an effective promotion tool for universities and their researchers. It is expected that they will improve the recognition of Slovenian universities in the world. The complex national open access infrastructure with similar text detection support and integration with other systems will enable the storage of almost eighty percent of peer-reviewed scientific papers annually published by Slovenian researchers. The majority of electronic theses and dissertations yearly produced at the Slovenian higher education institutions will also be accessible.

  19. Do-it-yourself biology: challenges and promises for an open science and technology movement.

    Science.gov (United States)

    Landrain, Thomas; Meyer, Morgan; Perez, Ariel Martin; Sussan, Remi

    2013-09-01

The do-it-yourself biology (DIYbio) community is emerging as a movement that fosters open access to resources permitting modern molecular biology and synthetic biology, among others. It promises in particular to be a source of cheaper and simpler solutions for environmental monitoring, personal diagnostics and the use of biomaterials. The successful growth of a global community of DIYbio practitioners will depend largely on enabling safe access to state-of-the-art molecular biology tools and resources. In this paper we analyze the rise of DIYbio, its community, its material resources and its applications. We look at the current projects developed for the international genetically engineered machine competition in order to get a sense of what amateur biologists can potentially create in their community laboratories over the coming years. We also show why and how the DIYbio community, in the context of a global governance development, is putting in place a safety/ethical framework for guaranteeing the pursuit of its activity. And finally we argue that the global spread of DIY biology potentially reconfigures and opens up access to biological information and laboratory equipment and that, therefore, it can foster new practices and transversal collaborations between professional scientists and amateurs.

  20. Nuclear reactor spring strip grid spacer

    International Nuclear Information System (INIS)

    Patterson, J.F.; Flora, B.S.

    1978-01-01

    A bimetallic grid spacer is described comprising a grid structure of zircaloy formed by intersecting striplike members which define fuel element openings for receiving fuel elements and spring strips made of Inconel positioned within the grid structure for cooperating with the fuel elements to maintain them in their desired position. A plurality of these spring strips extend longitudinally between sides of the grid structure, being locked in position by the grid retaining strips. The fuel rods, which are disposed in the fuel openings formed in the grid structure, are positioned by means of the springs associated with the spring strips and a plurality of dimples which extend from the zircaloy grid structure into the openings. In one embodiment the strips are disposed in a plurality of arrays with those spring strip arrays situated in opposing diagonal quadrants of the grid structure extending in the same direction and adjacent spring strip arrays in each half of the spacer extending in relatively perpendicular directions. Other variations of the spring strip arrangements for a particular fuel design are disclosed herein

  1. The MammoGrid Project Grids Architecture

    CERN Document Server

    McClatchey, Richard; Hauer, Tamas; Estrella, Florida; Saiz, Pablo; Rogulin, Dmitri; Buncic, Predrag; Clatchey, Richard Mc; Buncic, Predrag; Manset, David; Hauer, Tamas; Estrella, Florida; Saiz, Pablo; Rogulin, Dmitri

    2003-01-01

    The aim of the recently EU-funded MammoGrid project is, in the light of emerging Grid technology, to develop a European-wide database of mammograms that will be used to develop a set of important healthcare applications and investigate the potential of this Grid to support effective co-working between healthcare professionals throughout the EU. The MammoGrid consortium intends to use a Grid model to enable distributed computing that spans national borders. This Grid infrastructure will be used for deploying novel algorithms as software directly developed or enhanced within the project. Using the MammoGrid clinicians will be able to harness the use of massive amounts of medical image data to perform epidemiological studies, advanced image processing, radiographic education and ultimately, tele-diagnosis over communities of medical "virtual organisations". This is achieved through the use of Grid-compliant services [1] for managing (versions of) massively distributed files of mammograms, for handling the distri...

  2. ASCR Science Network Requirements

    Energy Technology Data Exchange (ETDEWEB)

    Dart, Eli; Tierney, Brian

    2009-08-24

The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the US Department of Energy Office of Science, the single largest supporter of basic research in the physical sciences in the United States. In support of the Office of Science programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 20 years. In April 2009 ESnet and the Office of Advanced Scientific Computing Research (ASCR), of the DOE Office of Science, organized a workshop to characterize the networking requirements of the programs funded by ASCR. The ASCR facilities anticipate significant increases in wide area bandwidth utilization, driven largely by the increased capabilities of computational resources and the wide scope of collaboration that is a hallmark of modern science. Many scientists move data sets between facilities for analysis, and in some cases (for example the Earth System Grid and the Open Science Grid), data distribution is an essential component of the use of ASCR facilities by scientists. Due to the projected growth in wide area data transfer needs, the ASCR supercomputer centers all expect to deploy and use 100 Gigabit per second networking technology for wide area connectivity as soon as that deployment is financially feasible. In addition to the network connectivity that ESnet provides, the ESnet Collaboration Services (ECS) are critical to several science communities. ESnet identity and trust services, such as the DOEGrids certificate authority, are widely used both by the supercomputer centers and by collaborations such as Open Science Grid (OSG) and the Earth System Grid (ESG). Ease of use is a key determinant of the scientific utility of network-based services. Therefore, a key enabling aspect for scientists' beneficial use of high…

  3. Grid computing in large pharmaceutical molecular modeling.

    Science.gov (United States)

    Claus, Brian L; Johnson, Stephen R

    2008-07-01

    Most major pharmaceutical companies have employed grid computing to expand their compute resources with the intention of minimizing additional financial expenditure. Historically, one of the issues restricting widespread utilization of the grid resources in molecular modeling is the limited set of suitable applications amenable to coarse-grained parallelization. Recent advances in grid infrastructure technology coupled with advances in application research and redesign will enable fine-grained parallel problems, such as quantum mechanics and molecular dynamics, which were previously inaccessible to the grid environment. This will enable new science as well as increase resource flexibility to load balance and schedule existing workloads.
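The coarse-grained parallelization this abstract refers to treats each unit of work (here, each compound) as an independent task with no inter-task communication, which is what makes it amenable to grid scheduling. A minimal sketch with Python's standard library; the scoring function is a stand-in for an expensive modeling calculation, not a real docking or QM code.

```python
from concurrent.futures import ThreadPoolExecutor

def score(molecule_id: int) -> tuple[int, float]:
    """Stand-in for an expensive, independent modeling task
    (e.g. scoring one compound against a target)."""
    return molecule_id, (molecule_id * 31 % 97) / 97.0

# Coarse-grained parallelism: every task is independent, so the workload
# farms out across local workers (or, analogously, grid nodes) with no
# messaging between tasks and trivially scales with added resources.
with ThreadPoolExecutor(max_workers=8) as pool:
    results = dict(pool.map(score, range(100)))

best = max(results, key=results.get)
print(best, results[best])
```

Fine-grained problems such as molecular dynamics differ precisely in that their tasks must exchange data at every step, which is why they need the tighter coupling the abstract says newer grid infrastructure is beginning to support.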

  4. A multi VO Grid infrastructure at DESY

    International Nuclear Information System (INIS)

    Gellrich, Andreas

    2010-01-01

    As a centre for research with particle accelerators and synchrotron light, DESY operates a Grid infrastructure in the context of the EU-project EGEE and the national Grid initiative D-GRID. All computing and storage resources are located in one Grid infrastructure which supports a number of Virtual Organizations of different disciplines, including non-HEP groups such as the Photon Science community. Resource distribution is based on fair share methods without dedicating hardware to user groups. Production quality of the infrastructure is guaranteed by embedding it into the DESY computer centre.

  5. Special issue on enabling open and interoperable access to Planetary Science and Heliophysics databases and tools

    Science.gov (United States)

    2018-01-01

The large amount of data generated by modern space missions calls for a change of organization of data distribution and access procedures. Although long term archives exist for telescopic and space-borne observations, high-level functions need to be developed on top of these repositories to make Planetary Science and Heliophysics data more accessible and to favor interoperability. Results of simulations and reference laboratory data also need to be integrated to support and interpret the observations. Interoperable software and interfaces have recently been developed in many scientific domains. The Virtual Observatory (VO) interoperable standards developed for Astronomy by the International Virtual Observatory Alliance (IVOA) can be adapted to Planetary Sciences, as demonstrated by the VESPA (Virtual European Solar and Planetary Access) team within the Europlanet-H2020-RI project. Other communities have developed their own standards: GIS (Geographic Information System) for Earth and planetary surfaces tools, SPASE (Space Physics Archive Search and Extract) for space plasma, PDS4 (NASA Planetary Data System, version 4) and IPDA (International Planetary Data Alliance) for planetary mission archives, etc., and an effort to make them interoperable altogether is starting, including automated workflows to process related data from different sources.

  6. Triple-layer smart grid business model

    DEFF Research Database (Denmark)

    Ma, Zheng; Lundgaard, Morten; Jørgensen, Bo Nørregaard

    2016-01-01

Viewing the smart grid with the theory of business models may open opportunities in understanding and capturing values in new markets. This study tries to discover and map the smart grid ecosystem-based business model framework with two different environments (sub-Saharan Africa and Denmark), and identifies the parameters for the smart grid solutions to the emerging markets. This study develops a triple-layer business model including the organizational (Niche), environmental (Intermediate), and global (Dominators) factors. The result uncovers an interface of market factors and stakeholders in a generic smart grid constellation. The findings contribute to the transferability potential of the smart grid solutions between countries, and indicate the potential to export and import smart grid solutions based on business modeling.

  7. Automatic sleep spindle detection: Benchmarking with fine temporal resolution using open science tools

    Directory of Open Access Journals (Sweden)

    Christian eO'Reilly

    2015-06-01

Full Text Available Sleep spindle properties index cognitive faculties such as memory consolidation and diseases such as major depression. For this reason, scoring sleep spindle properties in polysomnographic recordings has become an important activity in both research and clinical settings. The tediousness of this manual task has motivated efforts for its automation. Although some progress has been made, increasing the temporal accuracy of spindle scoring and improving the performance assessment methodology are two aspects needing more attention. In this paper, four open-access automated spindle detectors with fine temporal resolution are proposed and tested against expert scoring of two proprietary and two open-access databases. Results highlight several findings: (1) expert scoring and polysomnographic databases are important confounders when comparing the performance of spindle detectors tested using different databases or scorings; (2) because spindles are sparse events, specificity estimates are potentially misleading for assessing automated detector performance; (3) reporting the performance of spindle detectors exclusively with sensitivity and specificity estimates, as is often seen in the literature, is insufficient; including sensitivity, precision and a more comprehensive statistic such as the Matthews correlation coefficient, F1-score, or Cohen's κ is necessary for adequate evaluation; (4) reporting statistics for some reasonable range of decision thresholds provides a much more complete and useful benchmarking; (5) performance differences between tested automated detectors were found to be similar to those between available expert scorings; (6) much more development is needed to effectively compare the performance of spindle detectors developed by different research teams. Finally, this work clarifies a long-standing but only seldom posed question regarding whether expert scoring truly is a reliable gold standard for sleep spindle assessment.
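The point about specificity being misleading for sparse events can be made concrete: when true spindles are rare among many non-spindle segments, specificity stays high even for a weak detector, while precision, F1 and the Matthews correlation coefficient reveal the problem. The counts below are invented for illustration, not taken from the paper's databases.

```python
import math

def metrics(tp, fp, fn, tn):
    """Standard detection metrics from confusion-matrix counts."""
    sens = tp / (tp + fn)                  # sensitivity (recall)
    spec = tn / (tn + fp)                  # specificity
    prec = tp / (tp + fp)                  # precision
    f1 = 2 * prec * sens / (prec + sens)   # harmonic mean of prec/sens
    mcc = (tp * tn - fp * fn) / math.sqrt(
        (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return sens, spec, prec, f1, mcc

# Sparse events: 50 true spindles among 10,000 non-spindle segments.
# Even with 100 false positives, specificity remains ~0.99, yet fewer
# than a third of the detections are actually spindles.
sens, spec, prec, f1, mcc = metrics(tp=40, fp=100, fn=10, tn=9900)
print(f"sens={sens:.2f} spec={spec:.2f} prec={prec:.2f} "
      f"f1={f1:.2f} mcc={mcc:.2f}")
```

This is exactly why the abstract argues for reporting precision plus a comprehensive statistic (MCC, F1 or Cohen's κ) rather than sensitivity and specificity alone.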

  8. An Open Hardware seismic data recorder - a solid basis for citizen science

    Science.gov (United States)

    Mertl, Stefan

    2015-04-01

"Ruwai" is a 24-bit Open Hardware seismic data recorder. It is built up of four stackable printed circuit boards fitting the Arduino Mega 2560 microcontroller prototyping platform. An interface to the BeagleBone Black single-board computer enables extensive data storage, processing and networking capabilities. The four printed circuit boards provide a uBlox LEA-6T GPS module and real-time clock (GPS Timing shield), a Texas Instruments ADS1274 24-bit analog-to-digital converter (ADC main shield), an analog input section with a Texas Instruments PGA281 programmable gain amplifier and an analog anti-aliasing filter (ADC analog interface pga), and the power conditioning based on 9-36 V DC input (power supply shield). The Arduino Mega 2560 is used for controlling the hardware components, timestamping sampled data using the GPS timing information and transmitting the data to the BeagleBone Black single-board computer. The BeagleBone Black provides local data storage, wireless mesh networking using the optimized link state routing daemon and differential GNSS positioning using the RTKLIB software. The complete hardware and software are published under free-software or open-hardware licenses, and only free software (e.g. KiCad) was used for the development, to facilitate the reusability of the design and increase the sustainability of the project. "Ruwai" was developed within the framework of the "Community Environmental Observation Network (CEON)" (http://www.mertl-research.at/ceon/), which was supported by the Internet Foundation Austria (IPA) within the NetIdee 2013 call.

  9. Grid accounting service: state and future development

    International Nuclear Information System (INIS)

    Levshina, T; Sehgal, C; Bockelman, B; Weitzel, D; Guru, A

    2014-01-01

    During the last decade, large-scale federated distributed infrastructures have been continually developed and expanded. One of the crucial components of a cyber-infrastructure is an accounting service that collects data related to resource utilization and identity of users using resources. The accounting service is important for verifying pledged resource allocation per particular groups and users, providing reports for funding agencies and resource providers, and understanding hardware provisioning requirements. It can also be used for end-to-end troubleshooting as well as billing purposes. In this work we describe Gratia, a federated accounting service jointly developed at Fermilab and Holland Computing Center at University of Nebraska-Lincoln. The Open Science Grid, Fermilab, HCC, and several other institutions have used Gratia in production for several years. The current development activities include expanding Virtual Machines provisioning information, XSEDE allocation usage accounting, and Campus Grids resource utilization. We also identify the direction of future work: improvement and expansion of Cloud accounting, persistent and elastic storage space allocation, and the incorporation of WAN and LAN network metrics.
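The core job of an accounting service like the one described, turning raw per-job usage records into per-group totals for allocation verification and reporting, can be sketched in a few lines. The record fields below are illustrative, not Gratia's actual schema.

```python
from collections import defaultdict

# Illustrative usage records; a real accounting record carries many more
# fields (site, job id, certificate DN, CPU vs. wall time, ...).
records = [
    {"vo": "cms",   "user": "alice", "wall_hours": 12.0},
    {"vo": "cms",   "user": "bob",   "wall_hours": 3.5},
    {"vo": "atlas", "user": "carol", "wall_hours": 7.25},
]

# Aggregate wall-clock usage per virtual organization, the granularity at
# which pledged allocations are typically verified and reported.
usage = defaultdict(float)
for r in records:
    usage[r["vo"]] += r["wall_hours"]

print(dict(usage))  # -> {'cms': 15.5, 'atlas': 7.25}
```

The same pass over identity fields (user, certificate) supports the per-user reporting, troubleshooting and billing uses the abstract lists.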

  10. Grid accounting service: state and future development

    Science.gov (United States)

    Levshina, T.; Sehgal, C.; Bockelman, B.; Weitzel, D.; Guru, A.

    2014-06-01

    During the last decade, large-scale federated distributed infrastructures have been continually developed and expanded. One of the crucial components of a cyber-infrastructure is an accounting service that collects data on resource utilization and on the identity of the users consuming those resources. The accounting service is important for verifying pledged resource allocations for particular groups and users, providing reports for funding agencies and resource providers, and understanding hardware provisioning requirements. It can also be used for end-to-end troubleshooting as well as for billing purposes. In this work we describe Gratia, a federated accounting service jointly developed at Fermilab and at the Holland Computing Center (HCC) at the University of Nebraska-Lincoln. The Open Science Grid, Fermilab, HCC, and several other institutions have used Gratia in production for several years. Current development activities include expanding Virtual Machine provisioning information, XSEDE allocation usage accounting, and Campus Grids resource utilization. We also identify the directions of future work: improvement and expansion of Cloud accounting, persistent and elastic storage space allocation, and the incorporation of WAN and LAN network metrics.
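The core aggregation task of an accounting service like Gratia can be illustrated with a small sketch. The record fields (`user`, `vo`, `wall_hours`) are invented for illustration and are not Gratia's actual schema:

```python
from collections import defaultdict

# Hypothetical usage records, loosely modeled on the kind of data an
# accounting service collects from grid sites (illustrative field names).
records = [
    {"user": "alice", "vo": "cms",   "wall_hours": 12.0},
    {"user": "bob",   "vo": "atlas", "wall_hours": 7.5},
    {"user": "alice", "vo": "cms",   "wall_hours": 3.0},
]

def summarize_by_vo(records):
    """Sum wall-clock hours per virtual organization (VO)."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec["vo"]] += rec["wall_hours"]
    return dict(totals)

print(summarize_by_vo(records))  # {'cms': 15.0, 'atlas': 7.5}
```

A production service must additionally handle record deduplication, authentication of reporting probes, and persistent storage, none of which this sketch attempts.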

  11. Opening up to Big Data: Computer-Assisted Analysis of Textual Data in Social Sciences

    Directory of Open Access Journals (Sweden)

    Gregor Wiedemann

    2013-05-01

    Two developments in computational text analysis may change the way qualitative data analysis in the social sciences is performed: 1. the amount of digital text worth investigating is growing rapidly, and 2. the improvement of algorithmic information extraction approaches, also called text mining, allows for further bridging the gap between qualitative and quantitative text analysis. The key factor here is the inclusion of context into computational linguistic models, which extends conventional computational content analysis towards the extraction of meaning. To clarify the methodological differences between various computer-assisted text analysis approaches, the article suggests a typology from the perspective of a qualitative researcher. This typology shows compatibilities between manual qualitative data analysis methods and computational, more quantitative approaches for large-scale mixed-method text analysis designs. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1302231
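The idea of moving beyond simple word counts toward context can be illustrated with a minimal sketch: counting which words co-occur within the same sentence. This is a generic illustration of context-sensitive counting, not the specific methods surveyed in the article:

```python
from collections import Counter
from itertools import combinations

# Toy corpus; real text mining would operate on tokenized, normalized text.
sentences = [
    "open access enables data sharing",
    "qualitative analysis meets quantitative analysis",
    "open data supports qualitative research",
]

def cooccurrences(sentences):
    """Count unordered word pairs that appear in the same sentence."""
    pairs = Counter()
    for s in sentences:
        words = sorted(set(s.split()))
        pairs.update(combinations(words, 2))
    return pairs

pairs = cooccurrences(sentences)
print(pairs[("data", "open")])  # 2: the pair co-occurs in two sentences
```

Co-occurrence counts like these are the simplest building block of the contextual models (topic models, word embeddings) that text mining uses to approximate meaning.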

  12. Clinical research data sharing: what an open science world means for researchers involved in evidence synthesis.

    Science.gov (United States)

    Ross, Joseph S

    2016-09-20

    The International Committee of Medical Journal Editors (ICMJE) recently announced a bold step forward to require data generated by interventional clinical trials that are published in its member journals to be responsibly shared with external investigators. The movement toward a clinical research culture that supports data sharing has important implications for the design, conduct, and reporting of systematic reviews and meta-analyses. While data sharing is likely to enhance the science of evidence synthesis, facilitating the identification and inclusion of all relevant research, it will also pose key challenges, such as requiring broader search strategies and more thorough scrutiny of identified research. Furthermore, the adoption of data sharing initiatives by the clinical research community should challenge the community of researchers involved in evidence synthesis to follow suit, including the widespread adoption of systematic review registration, results reporting, and data sharing, to promote transparency and enhance the integrity of the research process.

  13. Computer Sciences Applied to Management at Open University of Catalonia: Development of Competences of Teamworks

    Science.gov (United States)

    Pisa, Carlos Cabañero; López, Enric Serradell

    Teamwork is considered one of the most important professional skills in today's business environment. More specifically, collaborative work between professionals and information technology managers from various functional areas is a strategic key to business competitiveness. Several university-level programs focus on developing these skills. This article presents the case of the course Computer Science Applied to Management (hereafter CSAM), which has been designed with the objective of developing the ability to work cooperatively in interdisciplinary teams. Its design and development address the key elements of efficiency identified in the literature, most notably the establishment of shared objectives and a feedback system, and the management of the team's harmony, its level of autonomy, independence, diversity, and level of supervision. The final result is a course in which interdisciplinary teams, working on a virtual platform, solve a problem raised by a case study.

  14. Open3DQSAR

    DEFF Research Database (Denmark)

    Tosco, Paolo; Balle, Thomas

    2011-01-01

    Open3DQSAR is a freely available open-source program aimed at chemometric analysis of molecular interaction fields (MIFs). MIFs can be imported from different sources (GRID, CoMFA/CoMSIA, quantum-mechanical electrostatic potential or electron density grids) or generated by Open3DQSAR itself. Much focus has been put on automation through the implementation of a scriptable interface, as well as on high computational performance achieved by algorithm parallelization. Flexibility and interoperability with existing molecular modeling software make Open3DQSAR a powerful tool in pharmacophore assessment...

  15. Conference on wind energy and grid integration

    International Nuclear Information System (INIS)

    Laffaille, Didier; Boemer, Jens; Fraisse, Jean-Luc; Mignon, Herve; Gonot, Jean-Pierre; Rohrig, Kurt; Lange, Matthias; Bagusche, Daniel; Wagner, Stefan; Schiel, Johannes

    2008-01-01

    The French-German office for Renewable energies (OFAEnR) organised a conference on the grid integration of wind farms. In the framework of this French-German exchange of experience, more than 80 participants exchanged views on the evolutions of tariffs and licensing procedures, and on grid capacity improvements and production forecasts. This document brings together the available presentations (slides) made during this event: 1 - The necessary evolution of billing and procedures for wind turbines connection to the grid in France (Didier Laffaille); 2 - Improvement of wind turbines integration to the grid in the framework of the EEG 2009 law (Jens Boemer); 3 - Decentralized power generation on the French power grids - 15, 20 kV and low voltage (Jean-Luc Fraisse); 4 - GOTTESWIND? Solution for the future: towards a grid evolution (Herve Mignon); 5 - Production forecasts in Germany - State-of-the-art and challenges for the grid exploitation (Kurt Rohrig); 6 - High-voltage lines capacity evaluation in meteorological situations with high wind energy production (Matthias Lange); 7 - The IPES project for the integration of wind energy production in the exploitation of the French power system (Jean-Pierre Gonot); 8 - Experience feedback from a wind turbine manufacturer in France and in Germany (Daniel Bagusche); 9 - Solutions for grid security improvement and capacity enhancement: cooperation between grid and power plant operators (Stefan Wagner); 10 - Open questions on wind energy integration to French and German grids (Johannes Schiel)

  16. CILogon-HA. Higher Assurance Federated Identities for DOE Science

    Energy Technology Data Exchange (ETDEWEB)

    Basney, James [Univ. of Illinois, Urbana-Champaign, IL (United States)

    2015-08-01

    The CILogon-HA project extended the existing open source CILogon service (initially developed with funding from the National Science Foundation) to provide credentials at multiple levels of assurance to users of DOE facilities for collaborative science. CILogon translates mechanism and policy across higher education and grid trust federations, bridging from the InCommon identity federation (which federates university and DOE lab identities) to the Interoperable Global Trust Federation (which defines standards across the Worldwide LHC Computing Grid, the Open Science Grid, and other cyberinfrastructure). The CILogon-HA project expanded the CILogon service to support over 160 identity providers (including 6 DOE facilities) and 3 internationally accredited certification authorities. To provide continuity of operations upon the end of the CILogon-HA project period, project staff transitioned the CILogon service to operation by XSEDE.

  17. Protecting the pipeline of science: openness, scientific methods and the lessons from ticagrelor and the PLATO trial.

    Science.gov (United States)

    Coats, Andrew J Stewart; Nijjer, Sukhjinder S; Francis, Darrel P

    2014-10-20

    Ticagrelor, a potent antiplatelet, has been shown to be beneficial in patients with acute coronary syndromes in a randomised controlled trial published in a highly ranked peer reviewed journal. Accordingly it has entered guidelines and has been approved for clinical use by authorities. However, there remains a controversy regarding aspects of the PLATO trial, which are not immediately apparent from the peer-reviewed publications. A number of publications have sought to highlight potential discrepancies, using data available in publicly published documents from the US Food and Drug Administration (FDA) leading to disagreement regarding the value of open science and data sharing. We reflect upon potential sources of bias present in even rigorously performed randomised controlled trials, on whether peer review can establish the presence of bias and the need to constantly challenge and question even accepted data. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  18. The Hyper-Commons: how open science prizes can expand and level the medical research playing field.

    Science.gov (United States)

    Hynek, Paul

    2008-12-01

    The largest industry in America is increasingly incapable of serving its customers. Over-fencing of the information commons has led to unaffordable medicine, for want of which millions of Americans and people around the world go without lifesaving treatments. Eliminating patent distribution exclusivity altogether, however, is not feasible, given the entrenched nature of the health-care industry. This paper proposes a program of voluntary Open Science Prizes that would draw large numbers of new players, who would in turn produce much new medical innovation, provide academic priority recognition, and develop a growing body of patent-beating prior art that would serve as public domain firewalls on a new supranational Hyper-Commons.

  19. Overview: Routes to Open Access

    OpenAIRE

    Tullney, Marco; van Wezenbeek, Wilma

    2017-01-01

    Slides of an overview presentation given at a CESAER workshop on Open Access, February 2nd, 2017, in Brussels. The slides cover the major routes to more open access discussed in the CESAER Task Force Open Science: (national) open access strategies; open access mandates; open access incentives; open access awareness; open access publishing; open access infrastructure.

  20. Middleware for the next generation Grid infrastructure

    CERN Document Server

    Laure, E; Prelz, F; Beco, S; Fisher, S; Livny, M; Guy, L; Barroso, M; Buncic, P; Kunszt, Peter Z; Di Meglio, A; Aimar, A; Edlund, A; Groep, D; Pacini, F; Sgaravatto, M; Mulmo, O

    2005-01-01

    The aim of the EGEE (Enabling Grids for E-Science in Europe) project is to create a reliable and dependable European Grid infrastructure for e-Science. The objective of the EGEE Middleware Re-engineering and Integration Research Activity is to provide robust middleware components, deployable on several platforms and operating systems, corresponding to the core Grid services for resource access, data management, information collection, authentication & authorization, resource matchmaking and brokering, and monitoring and accounting. For achieving this objective, we developed an architecture and design of the next generation Grid middleware leveraging experiences and existing components essentially from AliEn, EDG, and VDT. The architecture follows the service breakdown developed by the LCG ARDA group. Our strategy is to do as little original development as possible but rather re-engineer and harden existing Grid services. The evolution of these middleware components towards a Service Oriented Architecture ...

  1. Vote for the GridCafé!

    CERN Multimedia

    2004-01-01

    CERN's GridCafé website (http://www.gridcafe.org) has been nominated for the 8th Annual Webby Awards, together with four other finalists in the Technical Achievement category. The Webby Awards have been hailed as the "online Oscars" by Time Magazine, and are the leading international honours for websites, so this nomination represents a significant achievement. The winner in this category last year was Google. The GridCafé website, which was launched at Telecom '03 and forms part of the Microcosm exhibit on computing, introduces Grid technology to the general public, and provides information on all major Grid projects around the world, focusing in particular on the pioneering Grid developments being carried out by CERN and its many international partners for the Large Hadron Collider project. Being nominated for a Webby Award represents a great opportunity to draw positive media attention to Grid technology, to CERN and to science in general. Last year's nominees were covered...

  2. Toward a Grid Work flow Formal Composition

    International Nuclear Information System (INIS)

    Hlaoui, Y. B.; BenAyed, L. J.

    2007-01-01

    This paper presents a new approach for the composition of grid workflow models. The approach proposes an abstract syntax for UML Activity Diagrams (UML-AD) and a formal foundation for grid workflow composition in the form of a workflow algebra based on UML-AD. This composition fulfils the need for collaborative model development, particularly the specification and the reduction of the complexity of grid workflow model verification. This complexity has grown with the scale of grid workflow applications, such as science and e-business applications, since large amounts of computational resources are required and multiple parties may be involved in the development process and in the use of grid workflows. Furthermore, the proposed algebra allows the definition of workflow views, which are useful to limit access to predefined users in order to ensure the security of grid workflow applications. (Author)
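The general notion of a workflow algebra with composition operators can be sketched as follows; the `Task` class and the `Seq`/`Par` operators are invented for illustration and are not the algebra defined in the paper:

```python
# A toy workflow algebra: workflows are built by composing tasks with
# sequence (Seq) and parallel (Par) operators, then executed.
class Task:
    def __init__(self, name):
        self.name = name
    def run(self, log):
        log.append(self.name)

class Seq:
    """Run sub-workflows one after another."""
    def __init__(self, *parts):
        self.parts = parts
    def run(self, log):
        for p in self.parts:
            p.run(log)

class Par:
    """Parallel branches; sequentialized here for simplicity --
    a real engine would dispatch branches concurrently."""
    def __init__(self, *parts):
        self.parts = parts
    def run(self, log):
        for p in self.parts:
            p.run(log)

wf = Seq(Task("stage-in"), Par(Task("analyze-a"), Task("analyze-b")), Task("stage-out"))
log = []
wf.run(log)
print(log)  # ['stage-in', 'analyze-a', 'analyze-b', 'stage-out']
```

Because composed workflows are themselves values, larger models can be built from verified sub-models, which is the kind of compositional reasoning the paper's algebra aims to support formally.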

  3. First Thuesday - CERN, The Grid gets real

    CERN Multimedia

    Robertson, Leslie

    2003-01-01

    A few years ago, "the Grid" was just a vision dreamt up by some computer scientists who wanted to share processor power and data storage capacity between computers around the world - in much the same way as today's Web shares information seamlessly between millions of computers. Today, Grid technology is a huge enterprise, involving hundreds of software engineers, and generating exciting opportunities for industry. "Computing on demand", "utility computing", "web services", and "virtualisation" are just a few of the buzzwords in the IT industry today that are intimately connected to the development of Grid technology. For this third First Tuesday @CERN, the panel will survey some of the latest major breakthroughs in building international computer Grids for science. It will also provide a snapshot of Grid-related industrial activities, with contributions from both major players in the IT sector as well as emerging Grid technology start-ups.

  4. 75 FR 7526 - Consumer Interface With the Smart Grid

    Science.gov (United States)

    2010-02-19

    ... OFFICE OF SCIENCE AND TECHNOLOGY POLICY Consumer Interface With the Smart Grid AGENCY: Office of... realize these benefits. Demand-side Smart Grid technologies include ``smart meters'' (which provide two... information exchange between the home and the Smart Grid. Section 1305 of the Energy Independence and Security...

  5. From testbed to reality grid computing steps up a gear

    CERN Multimedia

    2004-01-01

    "UK plans for Grid computing changed gear this week. The pioneering European DataGrid (EDG) project came to a successful conclusion at the end of March, and on 1 April a new project, known as Enabling Grids for E-Science in Europe (EGEE), begins" (1 page)

  6. Trends in Research and Publication: Science 2.0 and Open Access

    Directory of Open Access Journals (Sweden)

    Geir Hovland

    2009-07-01

    This paper considers current trends in academic research and publication, in particular as seen from the control community. The introduction of Web 2.0 applications for scientists and engineers is currently changing the way research is being conducted. In the near future, participants in the research community will be able to share ideas, data and results like never before. They will also be able to manage the rapidly increasing amount of scientific information much more effectively than today through collaborative efforts enabled by the new Internet tools. However, an important premise for such a development is the availability of research material. Many research results are currently shielded behind expensive subscription schemes that impede the sharing of information. At the same time, an increasing amount of research is being published through open access channels with unrestricted availability. Interestingly, recent studies show that such policies contribute to an increased number of citations compared to the pay-based alternatives. In sum, the parallel development of new tools for research collaboration and an increased access to research material may fundamentally transform the way research is going to be conducted in the future.

  7. HashDist: Reproducible, Relocatable, Customizable, Cross-Platform Software Stacks for Open Hydrological Science

    Science.gov (United States)

    Ahmadia, A. J.; Kees, C. E.

    2014-12-01

    Developing scientific software is a continuous balance between not reinventing the wheel and getting fragile codes to interoperate with one another. Binary software distributions such as Anaconda provide a robust starting point for many scientific software packages, but this solution alone is insufficient for many scientific software developers. HashDist provides a critical component of the development workflow, enabling highly customizable, source-driven, and reproducible builds for scientific software stacks, available from both the IPython Notebook and the command line. To address these issues, the Coastal and Hydraulics Laboratory at the US Army Engineer Research and Development Center has funded the development of HashDist in collaboration with Simula Research Laboratories and the University of Texas at Austin. HashDist is motivated by a functional approach to package build management, and features intelligent caching of sources and builds, parametrized build specifications, and the ability to interoperate with system compilers and packages. HashDist enables the easy specification of "software stacks", which allow both the novice user to install a default environment and the advanced user to configure every aspect of their build in a modular fashion. As an advanced feature, HashDist builds can be made relocatable, allowing the easy redistribution of binaries on all three major operating systems as well as cloud, and supercomputing platforms. As a final benefit, all HashDist builds are reproducible, with a build hash specifying exactly how each component of the software stack was installed. This talk discusses the role of HashDist in the hydrological sciences, including its use by the Coastal and Hydraulics Laboratory in the development and deployment of the Proteus Toolkit as well as the Rapid Operational Access and Maneuver Support project. We demonstrate HashDist in action, and show how it can effectively support development, deployment, teaching, and
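The core idea behind a reproducible build hash can be sketched compactly: hash a canonicalized build specification so that identical specs always yield the same identifier. The spec fields below are illustrative and do not reflect HashDist's actual specification format:

```python
import hashlib
import json

def build_hash(spec):
    """Return a short, deterministic identifier for a build specification.

    Canonicalizing with sorted keys ensures that logically identical specs
    hash identically regardless of key order.
    """
    canonical = json.dumps(spec, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:12]

spec = {"name": "hdf5", "version": "1.8.13", "deps": ["zlib"], "flags": ["--shared"]}
h1 = build_hash(spec)
# Same spec with keys in a different order hashes to the same identifier.
h2 = build_hash({"version": "1.8.13", "flags": ["--shared"], "deps": ["zlib"], "name": "hdf5"})
print(h1 == h2)  # True
```

A build hash of this kind is what lets a system cache builds and state exactly how each component of a software stack was produced.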

  8. Open source hardware solutions for low-cost, do-it-yourself environmental monitoring, citizen science, and STEM education

    Science.gov (United States)

    Hicks, S. D.; Aufdenkampe, A. K.; Horsburgh, J. S.; Arscott, D. B.; Muenz, T.; Bressler, D. W.

    2016-12-01

    The explosion in DIY open-source hardware and software has resulted in the development of affordable and accessible technologies, like drones and weather stations, that can greatly assist the general public in monitoring environmental health and its degradation. It is widely recognized that education and support of audiences in pursuit of STEM literacy and the application of emerging technologies is a challenge for the future of citizen science and for preparing high school graduates to be actively engaged in environmental stewardship. It is also clear that detecting environmental change/degradation over time and space will be greatly enhanced with expanded use of networked, remote monitoring technologies by watershed organizations and citizen scientists if data collection and reporting are properly carried out and curated. However, there are few focused efforts to link citizen scientists and school programs with these emerging tools. We have started a multi-year program to develop hardware and teaching materials for training students and citizen scientists about the use of open source hardware in environmental monitoring. Scientists and educators around the world have started building their own dataloggers and devices using a variety of boards based on open source electronics. This new hardware is now providing researchers with an inexpensive alternative to commercial data logging and transmission hardware. We will present a variety of hardware solutions using the Arduino-compatible EnviroDIY Mayfly board (http://envirodiy.org/mayfly) that can be used to build and deploy a rugged environmental monitoring station using a wide variety of sensors and options, giving the users a fully customizable device for making measurements almost anywhere. A database and visualization system is being developed that will allow the users to view and manage the data their devices are collecting. We will also present our plan for developing curricula and leading workshops to various

  9. Open evaluation (OE): A vision for entirely transparent post-publication peer review and rating for science

    Directory of Open Access Journals (Sweden)

    Nikolaus Kriegeskorte

    2012-10-01

    The two major functions of a scientific publishing system are to provide access to and evaluation of scientific papers. While open access (OA) is becoming a reality, open evaluation (OE), the other side of the coin, has received less attention. Evaluation steers the attention of the scientific community and thus the very course of science. It also influences the use of scientific findings in public policy. The current system of scientific publishing provides only journal prestige as an indication of the quality of new papers and relies on a non-transparent and noisy pre-publication peer review process, which delays publication by many months on average. Here I propose an OE system, in which papers are evaluated post-publication in an ongoing fashion by means of open peer review and rating. Through signed ratings and reviews, scientists steer the attention of their field and build their reputation. Reviewers are motivated to be objective, because low-quality or self-serving signed evaluations will negatively impact their reputation. A core feature of this proposal is a division of powers between the accumulation of evaluative evidence and the analysis of this evidence by paper evaluation functions (PEFs). PEFs can be freely defined by individuals or groups (e.g. scientific societies) and provide a plurality of perspectives on the scientific literature. Simple PEFs will use averages of ratings, weighting reviewers (e.g. by H-factor) and rating scales (e.g. by relevance to a decision process) in different ways. Complex PEFs will use advanced statistical techniques to infer the quality of a paper. Papers with initially promising ratings will be more deeply evaluated. The continual refinement of PEFs in response to attempts by individuals to influence evaluations in their own favor will make the system ungameable. OA and OE together have the power to revolutionize scientific publishing and usher in a new culture of transparency, constructive criticism, and
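A simple PEF of the kind described above, a weighted average of signed ratings with each reviewer weighted by an importance score such as an H-factor, might look like this (the numbers are illustrative):

```python
def simple_pef(ratings):
    """A minimal paper evaluation function (PEF).

    ratings: list of (rating, reviewer_weight) pairs, e.g. a 0-10 rating
    weighted by the reviewer's H-factor. Returns the weighted mean,
    or None if there are no ratings.
    """
    total_weight = sum(w for _, w in ratings)
    if total_weight == 0:
        return None
    return sum(r * w for r, w in ratings) / total_weight

# Three signed ratings from reviewers with H-factors 20, 10, and 30.
ratings = [(8.0, 20), (6.0, 10), (9.0, 30)]
print(round(simple_pef(ratings), 2))  # 8.17
```

More complex PEFs, as the abstract notes, would replace this weighted mean with statistical inference over the accumulated evaluative evidence; the point of the architecture is that anyone can define such a function over the same open data.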

  10. Earth-Base: A Free And Open Source, RESTful Earth Sciences Platform

    Science.gov (United States)

    Kishor, P.; Heim, N. A.; Peters, S. E.; McClennen, M.

    2012-12-01

    This presentation describes the motivation, concept, and architecture behind Earth-Base, a web-based, RESTful data-management, analysis and visualization platform for earth sciences data. Traditionally, web applications have been built to access data directly from a database using a scripting language. While such applications are great at bringing results to a wide audience, they are limited in scope to the imagination and capabilities of the application developer. Earth-Base decouples the data store from the web application by introducing an intermediate "data application" tier. The data application's job is to query the data store using self-documented, RESTful URIs and send the results back formatted as JavaScript Object Notation (JSON). Decoupling the data store from the application allows virtually limitless flexibility in developing applications, both web-based for human consumption and programmatic for machine consumption. It also allows outside developers to use the data in their own applications, potentially creating applications that the original data creator and app developer may not even have thought of. Standardized specifications for URI-based querying and JSON-formatted results make querying and developing applications easy. URI-based querying also allows utilizing distributed datasets easily. Companion mechanisms for querying data snapshots (aka time travel), usage tracking and license management, and verification of semantic equivalence of data are also described. The latter promotes the "What You Expect Is What You Get" (WYEIWYG) principle, which can aid in data citation and verification.
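The "data application" tier described above can be sketched as a function that parses a RESTful URI and returns JSON-formatted results. The endpoint, query parameter, and field names below are invented for illustration and are not Earth-Base's actual API:

```python
import json
from urllib.parse import urlparse, parse_qs

def handle(uri, table):
    """Answer a URI-based query against an in-memory table with JSON."""
    params = parse_qs(urlparse(uri).query)
    min_age = float(params.get("min_age", ["0"])[0])
    rows = [r for r in table if r["age_ma"] >= min_age]
    return json.dumps({"count": len(rows), "records": rows})

# Hypothetical occurrence records (ages in millions of years).
fossils = [
    {"taxon": "Tyrannosaurus", "age_ma": 68.0},
    {"taxon": "Smilodon", "age_ma": 2.5},
]

resp = handle("/api/v1/occurrences?min_age=10", fossils)
print(json.loads(resp)["count"])  # 1
```

Because the response is plain JSON behind a plain URI, any outside developer, human- or machine-facing, can consume it without touching the underlying data store, which is the decoupling the presentation argues for.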

  11. Citizen Science and Open Data: a Model for Invasive Alien Plant Species in Kenya's Northern Rangelands

    Science.gov (United States)

    Amirazodi, S.; Griffin, R.; Flores Cordova, A. I.; Ouko, E.; Omondi, S.; Mugo, R. M.; Farah, H.; Flores Cordova, A. I.; Adams, E. C.

    2017-12-01

    Invasive species in African savannas pose a great threat to native biodiversity and change ecosystem functioning. In the forest sector, for instance, Acacia species are important sources of fuel-wood, yet at the same time they have increased the strain on water resources and shrunk the forage spaces available to both livestock and wildlife. In recently infested regions, invasive species can progress through the stages of introduction, establishment and dispersal to a full range. Currently there is much worldwide interest in predicting distributions of invasive species, and several organizations are faced with questions of whether and how to tackle such environmental challenges, or how to interpret predictions from the science community. Conservation practitioners require mapped estimates of where species could persist in a given region, and this is associated with information about the biotope, i.e. the geographic location of the species' niche. The process of collecting species distribution data for identifying the potential distribution of invasive species in the invaded ranges has become a challenge both in terms of resource and time allocation. This study highlights innovative approaches to crowdsourcing validation data for mapping and modelling invasive species (Acacia reficiens and Cactus) through involvement of the local communities. The general approach was to model the distribution of A. reficiens and Cactus (Opuntia spp.) using occurrence records from the native range, then project the model into new regions to assess susceptibility to invasion using climatic and topographic environmental variables. The models performed better than random prediction (P 0.75.

  12. FIFE-Jobsub: a grid submission system for intensity frontier experiments at Fermilab

    International Nuclear Information System (INIS)

    Box, Dennis

    2014-01-01

    The Fermilab Intensity Frontier Experiments use an integrated submission system known as FIFE-jobsub, part of the FIFE (Fabric for Frontier Experiments) initiative, to submit batch jobs to the Open Science Grid. FIFE-jobsub eases the burden on experimenters by integrating data transfer and site selection details in an easy to use and well-documented format. FIFE-jobsub automates tedious details of maintaining grid proxies for the lifetime of the grid job. Data transfer is handled using the Intensity Frontier Data Handling Client (IFDHC) [1] tool suite, which facilitates selecting the appropriate data transfer method from many possibilities while protecting shared resources from overload. Chaining of job dependencies into Directed Acyclic Graphs (Condor DAGS) is well supported and made easier through the use of input flags and parameters.

  13. FIFE-Jobsub: a grid submission system for intensity frontier experiments at Fermilab

    Science.gov (United States)

    Box, Dennis

    2014-06-01

    The Fermilab Intensity Frontier Experiments use an integrated submission system known as FIFE-jobsub, part of the FIFE (Fabric for Frontier Experiments) initiative, to submit batch jobs to the Open Science Grid. FIFE-jobsub eases the burden on experimenters by integrating data transfer and site selection details in an easy to use and well-documented format. FIFE-jobsub automates tedious details of maintaining grid proxies for the lifetime of the grid job. Data transfer is handled using the Intensity Frontier Data Handling Client (IFDHC) [1] tool suite, which facilitates selecting the appropriate data transfer method from many possibilities while protecting shared resources from overload. Chaining of job dependencies into Directed Acyclic Graphs (Condor DAGS) is well supported and made easier through the use of input flags and parameters.
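The chaining of job dependencies into Condor DAGs mentioned above can be illustrated by generating a minimal DAGMan-style description from a parent-to-children map. The job and submit-file names are invented for illustration, and a real FIFE-jobsub submission would involve much more (proxies, data handling, site selection):

```python
def make_dag(jobs, deps):
    """Emit a minimal HTCondor DAGMan description.

    jobs: ordered list of job names; each is assumed to have a <name>.sub
          submit file. deps: map of parent job -> list of child jobs.
    """
    lines = [f"JOB {name} {name}.sub" for name in jobs]
    for parent, children in deps.items():
        lines.append(f"PARENT {parent} CHILD {' '.join(children)}")
    return "\n".join(lines)

jobs = ["fetch", "process", "merge"]
deps = {"fetch": ["process"], "process": ["merge"]}
print(make_dag(jobs, deps))
```

DAGMan then guarantees that `process` only starts after `fetch` succeeds, which is the dependency-chaining behavior the abstract describes being exposed through input flags and parameters.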

  14. The dynamic management system for grid resources information of IHEP

    International Nuclear Information System (INIS)

    Gu Ming; Sun Gongxing; Zhang Weiyi

    2003-01-01

    The Grid information system is an essential base for building a Grid computing environment: it collects timely information about each resource in a Grid and provides an overall view of all resources to the other components of a Grid computing system. Grid technology can strongly support the computing needs of HEP (High Energy Physics), with its big-science and multi-organization features. In this article, the architecture and implementation of a dynamic management system are described, based on the Grid and LDAP (Lightweight Directory Access Protocol), including a Web-based design for collecting, querying and modifying resource information. (authors)

  15. Learning At The Boundaries In An “Open Regional Innovation System”: A Focus On Firms’ Innovation Strategies In The Emilia Romagna Life Science Industry

    OpenAIRE

    Belussi, Fiorenza; Sedita, Silvia Rita; Sammarra, Alessia

    2010-01-01

    The paper investigates the existence of an Open Regional Innovation System (ORIS model). This model is characterised by the firms' adoption of an open innovation strategy, which overcomes not only the boundaries of the firms but also the boundaries of the region. Using data collected from a sample of life science firms, our research provides evidence that the Emilia Romagna RIS has evolved towards an ORIS model, where firms' innovation search strategy, despite being still embedded in local ...

  16. ScienceCentral: open access full-text archive of scientific journals based on Journal Article Tag Suite regardless of their languages.

    Science.gov (United States)

    Huh, Sun

    2013-01-01

    ScienceCentral, a free or open-access, full-text archive of scientific journal literature at the Korean Federation of Science and Technology Societies, was under test in September 2013. Since it is a Journal Article Tag Suite-based full-text database, extensible markup language files in all languages can be presented, according to Unicode Transformation Format 8-bit encoding. It is comparable to PubMed Central; however, there are two distinct differences. First, its scope comprises all science fields; second, it accepts journals in all languages. Launching ScienceCentral is the first step toward bringing free-access or open-access academic scientific journals of all languages to the world, including scientific journals from Croatia.

  17. JGrass-NewAge hydrological system: an open-source platform for the replicability of science.

    Science.gov (United States)

    Bancheri, Marialaura; Serafin, Francesco; Formetta, Giuseppe; Rigon, Riccardo; David, Olaf

    2017-04-01

    JGrass-NewAge is an open-source, semi-distributed hydrological modelling system. It is based on the Object Modeling System framework (OMS version 3), the JGrasstools and the Geotools. OMS3 allows the creation of independent software packages that can be connected at run-time into a working modelling solution. These components are available as libraries/dependencies or as repositories to fork in order to add further features. Different tools are adopted to ease the integration, interoperability and use of each package. Most of the components are Gradle-integrated, since Gradle represents the state of the art in build systems, especially for Java projects. Continuous integration is a further layer between the local source code (client side) and the remote repository (server side) and ensures the building and testing of the source code at each commit. Finally, the use of Zenodo makes the code hosted on GitHub unique, citable and traceable, with a defined DOI. Following these standards, each part of the hydrological cycle is implemented in JGrass-NewAge as a component that can be selected, adopted, and connected to obtain a user-"customized" hydrological model. A variety of modelling solutions are possible, allowing a complete hydrological analysis. Moreover, thanks to the JGrasstools and the Geotools, visualization of the data and of the results in a selected GIS is possible. After the geomorphological analysis of the watershed, the spatial interpolation of the meteorological inputs can be performed using both deterministic (IDW) and geostatistical (Kriging) algorithms. For the radiation balance, the shortwave and longwave radiation can be estimated; these are, in turn, inputs for the simulation of evapotranspiration, according to the Priestley-Taylor and Penman-Monteith formulas. Three degree-day models are implemented for snow melting and SWE. The runoff production can be simulated using two different components, "Adige" and "Embedded Reservoirs

  18. Grid Integration Research | Wind | NREL

    Science.gov (United States)

    NREL researchers study the grid integration of wind energy, developing capabilities that help electric power system operators more efficiently manage wind grid system integration. (Page photos: three wind turbines with transmission lines in the background.)

  19. Interoperability and HealthGRID.

    Science.gov (United States)

    Bescos, C; Schmitt, D; Kass, J; García-Barbero, M; Kantchev, P

    2005-01-01

    GRID technology, with initiatives like the GGF, has the potential to allow both competition and interoperability not only among applications and toolkits, but also among implementations of key services. The pyramid of eHealth interoperability should be built from standards in communication, data security, storage and processing, up to policy initiatives, including organizational protocols, financing procedures, and the legal framework. The open challenges for GRID use in clinical fields illustrate the potential of combining grid technologies with medical routine into a wider interoperable framework. The Telemedicine Alliance is a consortium (ESA, WHO and ITU), initiated in 2002, building a vision for the provision of eHealth to European citizens by 2010. After a survey with more than 50 expert interviews, interoperability was identified as the main showstopper to eHealth implementation. Several groups and organizations are already contributing to standardization. TM-Alliance is supporting the "e-Health Standardization Coordination Group" (eHSCG). Now, in the design and development phase of GRID technology in health, is the right moment to act with the aim of achieving an interoperable and open framework. The health area should benefit from the initiatives started at the GGF in terms of global architecture and service definitions, as well as from the security and other web-services applications developed under the Internet umbrella. There is a risk that important existing results of the standardization efforts in this area will not be taken up simply because they are not always known.

  20. Benchmarking Swiss electricity grids

    International Nuclear Information System (INIS)

    Walti, N.O.; Weber, Ch.

    2001-01-01

    This extensive article describes a pilot benchmarking project, initiated by the Swiss Association of Electricity Enterprises, that assessed 37 Swiss utilities. The data collected from these utilities on a voluntary basis included data on technical infrastructure, investments and operating costs. These various factors are listed and discussed in detail. The assessment methods and rating mechanisms that provided the benchmarks are discussed, and the results of the pilot study are presented; these are to form the basis of benchmarking procedures for the grid regulation authorities under Switzerland's planned electricity market law. Examples of the practical use of the benchmarking methods are given, and cost-efficiency questions still open in the area of investment and operating costs are listed. Prefaces by the Swiss Association of Electricity Enterprises and the Swiss Federal Office of Energy complete the article

  1. International outreach for promoting open geoscience content in Finnish university libraries - libraries as the advocates of citizen science awareness on emerging open geospatial data repositories in Finnish society

    Science.gov (United States)

    Rousi, A. M.; Branch, B. D.; Kong, N.; Fosmire, M.

    2013-12-01

    In its Finnish National Spatial Strategy 2010-2015, Finland's Ministry of Agriculture and Forestry delineated, among other things, that spatial data skills should support citizens' everyday activities and facilitate decision-making and the participation of citizens. Studies also predict that open data, particularly open spatial data, would create, when their potential is fully realized, a 15% increase in the turnover of Finnish private-sector companies. Finnish libraries have a long tradition of serving at the heart of the Finnish information society. However, regarding the emerging possibilities of educating their users on open spatial data, very few initiatives have been made. The National Survey of Finland opened its data in 2012. Finnish technology university libraries, such as Aalto University Library, are open environments for all citizens and seem well suited to being the first thriving entities in educating citizens on open geospatial data. There are, however, many obstacles to overcome, such as a lack of knowledge about policies, a lack of understanding of geospatial data services, and insufficient know-how of GIS software among the personnel. This framework examines the benefits derived from an international collaboration between Purdue University Libraries and Aalto University Library to create local strategies for implementing open spatial data education initiatives in Aalto University Library's context. The results of this international collaboration are explicated for the benefit of the field as a whole.

  2. Synergisms between smart metering and smart grid; Synergien zwischen Smart Metering und Smart Grid

    Energy Technology Data Exchange (ETDEWEB)

    Maas, Peter [IDS GmbH, Ettlingen (Germany)

    2010-04-15

    With the implementation of a smart metering solution, it is possible not only to acquire consumption data for billing but also to acquire relevant distribution-grid data for grid operation. There is still a wide gap between the actual condition and the target condition. Synergies result from the use of a common infrastructure that takes account of both the requirements of smart metering and those of grid operation. An open architecture also enables the future integration of further applications from the fields of smart grid and smart home. (orig.)

  3. From Open Data to Science-Based Services for Disaster Risk Management: the Experience of the GEO Geohazards Supersite Network

    Science.gov (United States)

    Salvi, S.; Poland, M. P.; Sigmundsson, F.; Puglisi, G.; Borgstrom, S.; Ergintav, S.; Vogfjord, K. S.; Fournier, N.; Hamling, I. J.; Mothes, P. A.; Savvaidis, A.; Wicks, C. W., Jr.

    2017-12-01

    In 2010, the Geohazard Supersites and Natural Laboratories initiative (GSNL) established, in the framework of GEO, the concept of a global partnership among the geophysical scientific community, space agencies, and in-situ data providers, with the aim of promoting scientific advancements in the understanding of seismic and volcanic phenomena. This goal is achieved through open sharing of large volumes of remote sensing and in-situ data from specific volcanic or seismic areas of particularly high risk or scientific interest (the Supersites) as proposed by the scientific community. Data provision to the Supersites is coordinated by local research and monitoring institutions, which deploy and manage geophysical monitoring networks and have an institutional mandate for the provision of scientific data and services to the national government and other regional users. Starting in 2015, following the changes in GEO and the call for action given by the Sendai Framework 2015-2030, the GSNL initiative has promoted the rapid uptake of newly developed scientific information for maximum societal benefit in Disaster Risk Management (DRM). While the procedures by which the scientific products are provided to the local decision makers depend on the different national operational frameworks and are largely independent of the Supersite existence, the quality of the scientific information, and thus its actual benefit for DRM, is considerably enhanced at each Supersite. This growth in scientific understanding of specific volcanic and seismic areas is not only due to wider accessibility of data, but also to the increased collaboration and sharing of resources and capacities that occurs inside the Supersite scientific community. For maximum effectiveness, the GSNL initiative supports an Open Science approach, where different collaboration and communication approaches and technological solutions are developed, tested, and shared, thereby helping to sustain the scientific investigation

  4. Parallel grid population

    Science.gov (United States)

    Wald, Ingo; Ize, Santiago

    2015-07-28

    Parallel population of a grid with a plurality of objects using a plurality of processors. One example embodiment is a method for parallel population of a grid with a plurality of objects using a plurality of processors. The method includes a first act of dividing a grid into n distinct grid portions, where n is the number of processors available for populating the grid. The method also includes acts of dividing a plurality of objects into n distinct sets of objects, assigning a distinct set of objects to each processor such that each processor determines by which distinct grid portion(s) each object in its distinct set of objects is at least partially bounded, and assigning a distinct grid portion to each processor such that each processor populates its distinct grid portion with any objects that were previously determined to be at least partially bounded by its distinct grid portion.
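    The two-phase scheme above can be sketched as follows (a minimal sequential Python simulation of the per-processor work, with hypothetical names and a 1-D grid; an actual implementation would distribute each loop across the n processors):

    ```python
    # Sketch of the two-phase grid-population scheme: partition the grid
    # into n portions, determine which portions each object overlaps, then
    # let each "processor" populate only the portion it owns.

    def populate_grid(objects, grid_min, grid_max, n):
        """Partition a 1-D grid [grid_min, grid_max) into n portions and
        populate each portion with the interval objects overlapping it."""
        width = (grid_max - grid_min) / n
        # Phase 1: for each object, record the grid portions it at least
        # partially overlaps (done per distinct object set in parallel).
        overlaps = [[] for _ in range(n)]   # portion index -> objects
        for lo, hi in objects:
            first = max(0, int((lo - grid_min) // width))
            last = min(n - 1, int((hi - grid_min) // width))
            for p in range(first, last + 1):
                overlaps[p].append((lo, hi))
        # Phase 2: each "processor" populates its own distinct portion,
        # so no locking is needed on the portion it owns.
        return overlaps

    cells = populate_grid([(0.5, 1.5), (2.2, 2.8)], 0.0, 4.0, 4)
    # object (0.5, 1.5) spans portions 0 and 1; (2.2, 2.8) lies in portion 2
    ```

    The key point of the patent's scheme is that phase 2 is contention-free: each processor writes only to its own grid portion.
    
    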

  5. Utilizing Public Access Data and Open Source Statistical Programs to Teach Climate Science to Interdisciplinary Undergraduate Students

    Science.gov (United States)

    Collins, L.

    2014-12-01

    Students in the Environmental Studies major at the University of Southern California fulfill their curriculum requirements by taking a broad range of courses in the social and natural sciences. Climate change is often taught in 1-2 lectures in these courses, with limited examination of this complex topic. Several upper-division elective courses focus on the science, policy, and social impacts of climate change. In an upper-division course focused on the scientific tools used to determine paleoclimate and predict future climate, I have developed a project where students download, manipulate, and analyze data from the National Climatic Data Center. Students are required to download 100 or more years of daily temperature records and use the statistical program R to analyze that data, calculating daily, monthly, and yearly temperature averages along with changes in the number of extreme hot or cold days (≥90°F and ≤30°F, respectively). In parallel, they examine population growth, city expansion, and changes in transportation, looking for correlations between the social data and trends observed in the temperature data. Students examine trends over time to determine correlations to the urban heat island effect. This project exposes students to "real" data, giving them the tools necessary to critically analyze scientific studies without being experts in the field. Utilizing existing public online databases provides almost unlimited free data. Open-source statistical programs provide a cost-free platform for examining the data, although some in-class time is required to help students navigate initial data importation and analysis. Results presented will highlight data compiled over three years of course projects.
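    The kind of analysis described (the course uses R; shown here as a Python sketch) reduces to grouping daily records by year and counting extreme days. The records below are made up for illustration, assuming rows of (ISO date, temperature in °F):

    ```python
    # Sketch: yearly averages and extreme-day counts from daily temperature
    # records, mirroring the course exercise described above.
    from collections import defaultdict

    def summarize(records, hot=90.0, cold=30.0):
        """Group (date, temp_F) rows by year; return per-year mean
        temperature and counts of days >= hot and <= cold."""
        by_year = defaultdict(list)
        for date, temp in records:
            by_year[date[:4]].append(temp)   # 'YYYY-MM-DD' -> 'YYYY'
        return {
            year: {
                "mean": sum(t) / len(t),
                "hot_days": sum(v >= hot for v in t),
                "cold_days": sum(v <= cold for v in t),
            }
            for year, t in by_year.items()
        }

    stats = summarize([("1950-01-01", 28.0), ("1950-07-01", 95.0),
                       ("2020-01-01", 41.0), ("2020-07-01", 98.0)])
    ```

    With a full century of daily records, comparing these per-year summaries across decades surfaces the warming and urban-heat-island trends the project targets.
    
    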

  6. A pressure drop model for PWR grids

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Dong Seok; In, Wang Ki; Bang, Je Geon; Jung, Youn Ho; Chun, Tae Hyun [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1999-12-31

    A pressure drop model for PWR grids with and without mixing devices is proposed for single-phase flow, based on a fluid-mechanistic approach. The total pressure loss is expressed additively in terms of form and frictional losses. General friction-factor correlations and form drag coefficients available in the open literature are used in the model. As a result, the model shows better predictions than existing ones for non-mixing grids, and reasonable agreement with the available experimental data for mixing grids. It is therefore concluded that the proposed pressure drop model can provide a sufficiently good approximation for grid optimization and design calculations in advanced grid development. 7 refs., 3 figs., 3 tabs. (Author)
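    The additive decomposition can be sketched generically: total loss equals a form loss plus a frictional loss, each scaled by the dynamic pressure. The coefficient and geometry values below are illustrative only, not the correlations used in the paper:

    ```python
    # Generic additive pressure-loss sketch in the spirit of the model:
    #   dP_total = (K_form + f * L / D_h) * rho * v^2 / 2
    # All numeric inputs below are made-up illustrative values.

    def pressure_drop(rho, v, k_form, f, length, d_h):
        """Return (total, form, frictional) pressure losses in Pa for
        single-phase flow through a grid span of length L."""
        q = 0.5 * rho * v * v              # dynamic pressure [Pa]
        dp_form = k_form * q               # form (drag) loss
        dp_fric = f * (length / d_h) * q   # frictional loss
        return dp_form + dp_fric, dp_form, dp_fric

    total, form, fric = pressure_drop(rho=740.0, v=5.0, k_form=0.9,
                                      f=0.02, length=0.04, d_h=0.012)
    ```

    In such models the form coefficient and friction factor come from published correlations (e.g. as functions of Reynolds number and blockage ratio); here they are plain constants for clarity.
    
    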

  7. A pressure drop model for PWR grids

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Dong Seok; In, Wang Ki; Bang, Je Geon; Jung, Youn Ho; Chun, Tae Hyun [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1998-12-31

    A pressure drop model for PWR grids with and without mixing devices is proposed for single-phase flow, based on a fluid-mechanistic approach. The total pressure loss is expressed additively in terms of form and frictional losses. General friction-factor correlations and form drag coefficients available in the open literature are used in the model. As a result, the model shows better predictions than existing ones for non-mixing grids, and reasonable agreement with the available experimental data for mixing grids. It is therefore concluded that the proposed pressure drop model can provide a sufficiently good approximation for grid optimization and design calculations in advanced grid development. 7 refs., 3 figs., 3 tabs. (Author)

  8. Smart grid security

    CERN Document Server

    Goel, Sanjay; Papakonstantinou, Vagelis; Kloza, Dariusz

    2015-01-01

    This book on smart grid security is meant for a broad audience from managers to technical experts. It highlights security challenges that are faced in the smart grid as we widely deploy it across the landscape. It starts with a brief overview of the smart grid and then discusses some of the reported attacks on the grid. It covers network threats, cyber physical threats, smart metering threats, as well as privacy issues in the smart grid. Along with the threats the book discusses the means to improve smart grid security and the standards that are emerging in the field. The second part of the b

  9. Wireless communications networks for the smart grid

    CERN Document Server

    Ho, Quang-Dung; Rajalingham, Gowdemy; Le-Ngoc, Tho

    2014-01-01

    This brief presents a comprehensive review of the network architecture and communication technologies of the smart grid communication network (SGCN). It then studies the strengths, weaknesses and applications of two promising wireless mesh routing protocols that could be used to implement the SGCN. Packet transmission reliability, latency and robustness of these two protocols are evaluated and compared by simulations in various practical SGCN scenarios. Finally, technical challenges and open research opportunities of the SGCN are addressed. Wireless Communications Networks for Smart Grid provi

  10. An Open Science Peer Review Oath [v2; ref status: indexed, http://f1000r.es/4wf]

    Directory of Open Access Journals (Sweden)

    Jelena Aleksic

    2015-01-01

    One of the foundations of the scientific method is to be able to reproduce experiments and corroborate the results of research that has been done before. However, with the increasing complexities of new technologies and techniques, coupled with the specialisation of experiments, reproducing research findings has become a growing challenge. Clearly, scientific methods must be conveyed succinctly, and with clarity and rigour, in order for research to be reproducible. Here, we propose steps to help increase the transparency of the scientific method and the reproducibility of research results: specifically, we introduce a peer-review oath and accompanying manifesto. These have been designed to offer guidelines to enable reviewers (with minimum friction or bias) to follow and apply open science principles, and support the ideas of transparency, reproducibility and ultimately greater societal impact. Introducing the oath and manifesto at the stage of peer review will help to check that the research being published includes everything that other researchers would need to successfully repeat the work. Peer review is the lynchpin of the publishing system: encouraging the community to consciously (and conscientiously) uphold these principles should help to improve published papers, increase confidence in the reproducibility of the work and, ultimately, provide strategic benefits to authors and their institutions.

  11. PhLeGrA: Graph Analytics in Pharmacology over the Web of Life Sciences Linked Open Data.

    Science.gov (United States)

    Kamdar, Maulik R; Musen, Mark A

    2017-04-01

    Integrated approaches for pharmacology are required for the mechanism-based predictions of adverse drug reactions that manifest due to concomitant intake of multiple drugs. These approaches require the integration and analysis of biomedical data and knowledge from multiple, heterogeneous sources with varying schemas, entity notations, and formats. To tackle these integrative challenges, the Semantic Web community has published and linked several datasets in the Life Sciences Linked Open Data (LSLOD) cloud using established W3C standards. We present the PhLeGrA platform for Linked Graph Analytics in Pharmacology in this paper. Through query federation, we integrate four sources from the LSLOD cloud and extract a drug-reaction network, composed of distinct entities. We represent this graph as a hidden conditional random field (HCRF), a discriminative latent variable model that is used for structured output predictions. We calculate the underlying probability distributions in the drug-reaction HCRF using the datasets from the U.S. Food and Drug Administration's Adverse Event Reporting System. We predict the occurrence of 146 adverse reactions due to multiple drug intake with an AUROC statistic greater than 0.75. The PhLeGrA platform can be extended to incorporate other sources published using Semantic Web technologies, as well as to discover other types of pharmacological associations.

  12. Grid generation methods

    CERN Document Server

    Liseikin, Vladimir D

    2010-01-01

    This book is an introduction to structured and unstructured grid methods in scientific computing, addressing graduate students, scientists as well as practitioners. Basic local and integral grid quality measures are formulated and new approaches to mesh generation are reviewed. In addition to the content of the successful first edition, a more detailed and practice oriented description of monitor metrics in Beltrami and diffusion equations is given for generating adaptive numerical grids. Also, new techniques developed by the author are presented, in particular a technique based on the inverted form of Beltrami’s partial differential equations with respect to control metrics. This technique allows the generation of adaptive grids for a wide variety of computational physics problems, including grid clustering to given function values and gradients, grid alignment with given vector fields, and combinations thereof. Applications of geometric methods to the analysis of numerical grid behavior as well as grid ge...

  13. Proposal for grid computing for nuclear applications

    International Nuclear Information System (INIS)

    Faridah Mohamad Idris; Wan Ahmad Tajuddin Wan Abdullah; Zainol Abidin Ibrahim; Zukhaimira Zolkapli

    2013-01-01

    Full-text: The use of computer clusters for the computational sciences, including computational physics, is vital, as it provides the computing power to crunch big numbers at a faster rate. In compute-intensive applications that require high resolution, such as Monte Carlo simulation, the use of computer clusters in a grid form, which supplies computational power to any node within the grid that needs it, has now become a necessity. In this paper, we describe how clusters running a specific application can use resources within the grid to run the application and speed up the computing process. (author)

  14. The Role of Citizen Science in Risk Mitigation and Disaster Response: A Case Study of 2015 Nepalese Earthquake Using OpenStreetMap

    Science.gov (United States)

    Rieger, C.; Byrne, J. M.

    2015-12-01

    Citizen science includes networks of ordinary people acting as sensors, observing and recording information for science. OpenStreetMap is one such sensor network, which empowers citizens to collaboratively produce a global picture from free geographic information. The success of this open-source software is extended by the development of freely used open databases for the user community. Participating citizens do not require a high level of skill. Final results are processed by professionals following quality assurance protocols before map information is released. OpenStreetMap is not only the cheapest source of timely maps in many cases but also often the only source. This is particularly true in developing countries. Emergency responses to the recent earthquake in Nepal illustrate the value of rapidly updated geographical information. This includes emergency management, damage assessment, post-disaster response, and future risk mitigation. Local disaster conditions (landslides, road closings, bridge failures, etc.) were documented for local aid workers by citizen scientists working remotely. Satellites and drones provided digital imagery of the disaster zone, and OpenStreetMap participants shared the data from locations around the globe. For the Nepal earthquake, OpenStreetMap provided a team of volunteers on the ground through its Humanitarian OpenStreetMap Team (HOT), which contributed data to the disaster response through smartphones and laptops. This, combined with global citizen science efforts, provided immediate, geographically useful maps to assist aid workers, including the Red Cross and the Canadian DART Team, and the Nepalese government. As of August 2014, almost 1.7 million users had provided over 2.5 billion edits to the OpenStreetMap map database. Due to the increased usage of smartphones and GPS-enabled devices, and the growing participation in citizen science projects, data gathering is proving an effective way to contribute as a global citizen. This paper

  15. Chimera Grid Tools

    Science.gov (United States)

    Chan, William M.; Rogers, Stuart E.; Nash, Steven M.; Buning, Pieter G.; Meakin, Robert

    2005-01-01

    Chimera Grid Tools (CGT) is a software package for performing computational fluid dynamics (CFD) analysis utilizing the Chimera-overset-grid method. For modeling flows with viscosity about geometrically complex bodies in relative motion, the Chimera-overset-grid method is among the most computationally cost-effective methods for obtaining accurate aerodynamic results. CGT contains a large collection of tools for generating overset grids, preparing inputs for computer programs that solve equations of flow on the grids, and post-processing of flow-solution data. The tools in CGT include grid editing tools, surface-grid-generation tools, volume-grid-generation tools, utility scripts, configuration scripts, and tools for post-processing (including generation of animated images of flows and calculating forces and moments exerted on affected bodies). One of the tools, denoted OVERGRID, is a graphical user interface (GUI) that serves to visualize the grids and flow solutions and provides central access to many other tools. The GUI facilitates the generation of grids for a new flow-field configuration. Scripts that follow the grid generation process can then be constructed to mostly automate grid generation for similar configurations. CGT is designed for use in conjunction with a computer-aided-design program that provides the geometry description of the bodies, and a flow-solver program.

  16. Bayesian grid matching

    DEFF Research Database (Denmark)

    Hartelius, Karsten; Carstensen, Jens Michael

    2003-01-01

    A method for locating distorted grid structures in images is presented. The method is based on the theories of template matching and Bayesian image restoration. The grid is modeled as a deformable template. Prior knowledge of the grid is described through a Markov random field (MRF) model which r...

  17. Smart grid in China

    DEFF Research Database (Denmark)

    Sommer, Simon; Ma, Zheng; Jørgensen, Bo Nørregaard

    2015-01-01

    China is planning to transform its traditional power grid in favour of a smart grid, since it allows a more economically efficient and a more environmentally friendly transmission and distribution of electricity. Thus, a nationwide smart grid is likely to save tremendous amounts of resources...

  18. Comparison of tentative radiographic working length with and without grid versus electronic apex locator

    Directory of Open Access Journals (Sweden)

    Tanikonda Rambabu

    2018-01-01

    The apical termination of obturation is the most important factor influencing the success of root canal treatment (RCT). Working length (WL) is the key element in achieving this. Aim: The aim of this study is to compare and evaluate the preoperatively estimated WL obtained with a conventional radiograph and with a grid radiograph, with reference to an electronic apex locator (EAL), in single-rooted teeth. Settings and Design: Thirty permanent anterior teeth with complete root formation indicated for RCT were included in this study. Materials and Methods: A conventional radiograph (Group 1) and a conventional radiograph with an external grid (Group 2) were made before access opening. The WL with the EAL (Group 3) was determined after access opening. Statistical Analysis: The Statistical Package for the Social Sciences (SPSS) version 16.0 (SPSS Inc., Chicago, IL, USA) was used to compare the WLs of the three groups, and statistical significance was set at P ≤ 0.05. ANOVA and post hoc tests were used for intergroup comparison, and Pearson correlation values were obtained. Results and Conclusion: The results of the study showed a higher correlation between the grid WL and the apex locator WL than between the conventional WL and the apex locator WL. Preoperative measurement with a radiographic grid, along with the apex locator, is a better measuring tool than the conventional radiographic WL in single-rooted teeth.
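    The Pearson agreement check reported in such studies can be sketched as follows (Python, with made-up working-length values in millimetres; not the study's data):

    ```python
    # Pearson correlation between two sets of working-length measurements:
    # a value close to 1 indicates strong agreement between the methods.
    import math

    def pearson(x, y):
        """Pearson correlation coefficient of two equal-length sequences."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    eal  = [21.0, 22.5, 20.0, 23.5]   # hypothetical apex-locator WLs (mm)
    grid = [21.1, 22.4, 20.2, 23.4]   # hypothetical grid-radiograph WLs (mm)
    r = pearson(eal, grid)            # close to 1 => strong agreement
    ```

    The study's conclusion corresponds to the grid-vs-EAL coefficient exceeding the conventional-vs-EAL coefficient computed the same way.
    
    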

  19. Building a multi-scaled geospatial temporal ecology database from disparate data sources: Fostering open science through data reuse

    Science.gov (United States)

    Soranno, Patricia A.; Bissell, E.G.; Cheruvelil, Kendra S.; Christel, Samuel T.; Collins, Sarah M.; Fergus, C. Emi; Filstrup, Christopher T.; Lapierre, Jean-Francois; Lottig, Noah R.; Oliver, Samantha K.; Scott, Caren E.; Smith, Nicole J.; Stopyak, Scott; Yuan, Shuai; Bremigan, Mary Tate; Downing, John A.; Gries, Corinna; Henry, Emily N.; Skaff, Nick K.; Stanley, Emily H.; Stow, Craig A.; Tan, Pang-Ning; Wagner, Tyler; Webster, Katherine E.

    2015-01-01

    Although there are considerable site-based data for individual or groups of ecosystems, these datasets are widely scattered, have different data formats and conventions, and often have limited accessibility. At the broader scale, national datasets exist for a large number of geospatial features of land, water, and air that are needed to fully understand variation among these ecosystems. However, such datasets originate from different sources and have different spatial and temporal resolutions. By taking an open-science perspective and by combining site-based ecosystem datasets and national geospatial datasets, science gains the ability to ask important research questions related to grand environmental challenges that operate at broad scales. Documentation of such complicated database integration efforts, through peer-reviewed papers, is recommended to foster reproducibility and future use of the integrated database. Here, we describe the major steps, challenges, and considerations in building an integrated database of lake ecosystems, called LAGOS (LAke multi-scaled GeOSpatial and temporal database), that was developed at the sub-continental study extent of 17 US states (1,800,000 km²). LAGOS includes two modules: LAGOSGEO, with geospatial data on every lake with surface area larger than 4 ha in the study extent (~50,000 lakes), including climate, atmospheric deposition, land use/cover, hydrology, geology, and topography measured across a range of spatial and temporal extents; and LAGOSLIMNO, with lake water quality data compiled from ~100 individual datasets for a subset of lakes in the study extent (~10,000 lakes). Procedures for the integration of datasets included: creating a flexible database design; authoring and integrating metadata; documenting data provenance; quantifying spatial measures of geographic data; quality-controlling integrated and derived data; and extensively documenting the database. Our procedures make a large, complex, and integrated

  20. Building a multi-scaled geospatial temporal ecology database from disparate data sources: fostering open science and data reuse.

    Science.gov (United States)

    Soranno, Patricia A; Bissell, Edward G; Cheruvelil, Kendra S; Christel, Samuel T; Collins, Sarah M; Fergus, C Emi; Filstrup, Christopher T; Lapierre, Jean-Francois; Lottig, Noah R; Oliver, Samantha K; Scott, Caren E; Smith, Nicole J; Stopyak, Scott; Yuan, Shuai; Bremigan, Mary Tate; Downing, John A; Gries, Corinna; Henry, Emily N; Skaff, Nick K; Stanley, Emily H; Stow, Craig A; Tan, Pang-Ning; Wagner, Tyler; Webster, Katherine E

    2015-01-01

    Although there are considerable site-based data for individual or groups of ecosystems, these datasets are widely scattered, have different data formats and conventions, and often have limited accessibility. At the broader scale, national datasets exist for a large number of geospatial features of land, water, and air that are needed to fully understand variation among these ecosystems. However, such datasets originate from different sources and have different spatial and temporal resolutions. By taking an open-science perspective and by combining site-based ecosystem datasets and national geospatial datasets, science gains the ability to ask important research questions related to grand environmental challenges that operate at broad scales. Documentation of such complicated database integration efforts, through peer-reviewed papers, is recommended to foster reproducibility and future use of the integrated database. Here, we describe the major steps, challenges, and considerations in building an integrated database of lake ecosystems, called LAGOS (LAke multi-scaled GeOSpatial and temporal database), that was developed at the sub-continental study extent of 17 US states (1,800,000 km2). LAGOS includes two modules: LAGOSGEO, with geospatial data on every lake with surface area larger than 4 ha in the study extent (~50,000 lakes), including climate, atmospheric deposition, land use/cover, hydrology, geology, and topography measured across a range of spatial and temporal extents; and LAGOSLIMNO, with lake water quality data compiled from ~100 individual datasets for a subset of lakes in the study extent (~10,000 lakes). Procedures for the integration of datasets included: creating a flexible database design; authoring and integrating metadata; documenting data provenance; quantifying spatial measures of geographic data; quality-controlling integrated and derived data; and extensively documenting the database. Our procedures make a large, complex, and integrated
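
    The integration described above (a flexible design linking an all-lakes geospatial module to a sampled-lakes water-quality module) can be sketched with a toy relational example. This is a hypothetical illustration using sqlite3; the table names, columns, and `lake_id` key are invented for the example and are not the actual LAGOS schema.

```python
import sqlite3

# Hypothetical sketch of LAGOS's two-module design: a GEO table covering
# every lake and a LIMNO table covering only the sampled subset, joined
# on a common lake identifier. Names and values are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE lagos_geo (lake_id INTEGER PRIMARY KEY, area_ha REAL, state TEXT)")
cur.execute("CREATE TABLE lagos_limno (lake_id INTEGER, sample_date TEXT, tp_ugL REAL)")

# GEO holds all lakes >= 4 ha; LIMNO holds water-quality samples for a subset.
cur.executemany("INSERT INTO lagos_geo VALUES (?, ?, ?)",
                [(1, 12.5, "MI"), (2, 150.0, "WI"), (3, 7.2, "NY")])
cur.executemany("INSERT INTO lagos_limno VALUES (?, ?, ?)",
                [(1, "2010-07-15", 20.0), (1, "2011-07-20", 24.0), (3, "2012-08-01", 55.0)])

# A LEFT JOIN keeps unsampled lakes, documenting coverage gaps explicitly.
rows = cur.execute("""
    SELECT g.lake_id, g.state, COUNT(l.sample_date) AS n_samples
    FROM lagos_geo g LEFT JOIN lagos_limno l ON g.lake_id = l.lake_id
    GROUP BY g.lake_id ORDER BY g.lake_id
""").fetchall()
print(rows)  # lake 2 has geospatial data but no water-quality samples
```

    The design choice illustrated here mirrors the paper's point: the geospatial module is complete over the study extent, while the water-quality module is sparse, so the join itself records where data are missing.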

  1. Using a Massive Open Online Course (MOOC) for Earth Science Education: Who Did We Teach and What Did We Learn?

    Science.gov (United States)

    Gold, Anne; Gordon, Eric

    2016-04-01

    Over the last decade, Massive Open Online Courses (MOOCs) have rapidly gained traction as a way to provide virtually anyone with an internet connection free access to a broad variety of high-quality college-level courses. That means Earth science instructors can now teach courses that reach tens of thousands of students--an incredible opportunity, but one that also poses many novel challenges. In April 2015, we used the Coursera platform to run a MOOC entitled "Water in the Western United States," to deliver a survey course of broad interest and partly as a venue to make research efforts accessible to a wide audience. Leveraging a previous online course run on a smaller MOOC platform (Canvas), we created a course largely based on short expert video lectures tied together by various types of assessments. Over a dozen experts provided short lectures offering a survey course that touches on the social, legal, natural, and societal aspects of the topic. This style of MOOC, in which the content is not delivered by one expert but by many, helped us showcase the breadth of available expertise both at the University of Colorado and elsewhere. In this presentation we will discuss the challenges that arose from planning a MOOC with no information about the characteristics of the student body, teaching thousands of unidentified students, and understanding the nature of online learning in an increasingly mobile-dominated world. We will also discuss the opportunities a MOOC offers for changes in undergraduate education, sharing across campuses or even across levels, and promoting flipped classroom-style learning. Finally, we will describe the general characteristics of our MOOC student body and describe lessons learned from our experience while aiming to place the MOOC experience into a larger conversation about the future of education at multiple levels.

  2. Just Roll with It? Rolling Volumes vs. Discrete Issues in Open Access Library and Information Science Journals

    Directory of Open Access Journals (Sweden)

    Jill Cirasella

    2013-08-01

    INTRODUCTION Articles in open access (OA) journals can be published on a rolling basis, as they become ready, or in complete, discrete issues. This study examines the prevalence of and reasons for rolling volumes vs. discrete issues among scholarly OA library and information science (LIS) journals based in the United States. METHODS A survey was distributed to journal editors, asking them about their publication model and their reasons for and satisfaction with that model. RESULTS Of the 21 responding journals, 12 publish in discrete issues, eight publish in rolling volumes, and one publishes in rolling volumes with an occasional special issue. Almost all editors, regardless of model, cited ease of workflow as a justification for their chosen publication model, suggesting that there is no single best workflow for all journals. However, while all rolling-volume editors reported being satisfied with their model, satisfaction was less universal among discrete-issue editors. DISCUSSION The unexpectedly high number of rolling-volume journals suggests that LIS journal editors are making forward-looking choices about publication models even though the topic has not been much addressed in the library literature. Further research is warranted; possibilities include expanding the study’s geographic scope, broadening the study to other disciplines, and investigating publication model trends across the entire scholarly OA universe. CONCLUSION Both because satisfaction is high among editors of rolling-volume journals and because readers and authors appreciate quick publication times, the rolling-volume model will likely become even more prevalent in coming years.

  3. The Open Science Peer Review Oath [v1; ref status: indexed, http://f1000r.es/4ou]

    Directory of Open Access Journals (Sweden)

    Jelena Aleksic

    2014-11-01

    One of the foundations of the scientific method is to be able to reproduce experiments and corroborate the results of research that has been done before. However, with the increasing complexities of new technologies and techniques, coupled with the specialisation of experiments, reproducing research findings has become a growing challenge. Clearly, scientific methods must be conveyed succinctly, and with clarity and rigour, in order for research to be reproducible. Here, we propose steps to help increase the transparency of the scientific method and the reproducibility of research results: specifically, we introduce a peer-review oath and accompanying manifesto. These have been designed to offer guidelines to enable reviewers (with the minimum friction or bias) to follow and apply open science principles, and support the ideas of transparency, reproducibility and ultimately greater societal impact. Introducing the oath and manifesto at the stage of peer review will help to check that the research being published includes everything that other researchers would need to successfully repeat the work. Peer review is the lynchpin of the publishing system: encouraging the community to consciously (and conscientiously) uphold these principles should help to improve published papers, increase confidence in the reproducibility of the work and, ultimately, provide strategic benefits to authors and their institutions. Future incarnations of the various national Research Excellence Frameworks (REFs) will evolve away from simple citations towards measurable societal value and impact. The proposed manifesto aspires to facilitate this goal by making transparency, reproducibility and citizen-scientist engagement (with the knowledge-creation and dissemination processes) the default parameters for performing sound research.

  4. Smart grid technologies in local electric grids

    Science.gov (United States)

    Lezhniuk, Petro D.; Pijarski, Paweł; Buslavets, Olga A.

    2017-08-01

    The research is devoted to the creation of favorable conditions for the integration of renewable sources of energy into electric grids, which were designed to be supplied from centralized generation at large electric power stations. The development of distributed generation in electric grids influences the conditions of their operation, and a conflict of interests arises. The possibility of optimal functioning of electric grids and renewable sources of energy is demonstrated, where the complex optimality criterion combines the balance reliability of electric energy in the local electric system with minimum losses of electric energy in it. A multilevel automated system for power flow control in electric grids by means of changing the distributed generation of power is developed. Optimization of power flows is performed by local systems of automatic control of small hydropower stations and, where possible, solar power plants.
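
    The idea of optimizing power flows by adjusting distributed generation can be illustrated with a toy search. This is a hedged sketch, not the authors' control system: the single-feeder loss model, the function names, and all numbers are invented for illustration.

```python
# Illustrative sketch: choose the output of a small distributed generator
# (e.g. a small hydropower station) so that feeder losses are minimized
# while local demand stays covered. All values are hypothetical.
def feeder_losses(p_import_kw, r_pu=1e-4):
    # Losses grow roughly with the square of power drawn through the feeder.
    return r_pu * p_import_kw**2

def best_dg_output(demand_kw, dg_max_kw, step_kw=1.0):
    """Brute-force search over feasible DG setpoints (a stand-in for the
    multilevel automated control described in the abstract)."""
    best_p, best_loss = 0.0, float("inf")
    p = 0.0
    while p <= dg_max_kw:
        # Power imported from the main grid covers whatever DG does not.
        p_import = demand_kw - p
        loss = feeder_losses(p_import)
        if loss < best_loss:
            best_p, best_loss = p, loss
        p += step_kw
    return best_p, best_loss

p_opt, loss_opt = best_dg_output(demand_kw=500.0, dg_max_kw=800.0)
print(p_opt, loss_opt)  # local generation matching demand drives import losses to zero
```

    In this simplified model the optimum is simply local generation matching local demand; the real criterion in the abstract additionally weighs balance reliability, which this sketch omits.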

  5. Mapping of grid faults and grid codes

    DEFF Research Database (Denmark)

    Iov, Florin; Hansen, A.D.; Sørensen, P.

    The present report is a part of the research project "Grid fault and design basis for wind turbine" supported by Energinet.dk through the grant PSO F&U 6319. The objective of this project is to investigate into the consequences of the new grid connection requirements for the fatigue and extreme loads of wind turbines. The goal is also to clarify and define possible new directions in the certification process of power plant wind turbines, namely wind turbines which participate actively in the stabilisation of power systems. Practical experience shows that there is a need ... challenges for the design of both the electrical system and the mechanical structure of wind turbines. An overview over the frequency of grid faults and the grid connection requirements in different relevant countries is done in this report. The most relevant study cases for the quantification of the loads ...

  6. Mapping of grid faults and grid codes

    DEFF Research Database (Denmark)

    Iov, F.; Hansen, Anca Daniela; Sørensen, Poul Ejnar

    The present report is a part of the research project "Grid fault and design basis for wind turbine" supported by Energinet.dk through the grant PSO F&U 6319. The objective of this project is to investigate into the consequences of the new grid connection requirements for the fatigue and extreme loads of wind turbines. The goal is also to clarify and define possible new directions in the certification process of power plant wind turbines, namely wind turbines which participate actively in the stabilisation of power systems. Practical experience shows that there is a need ... challenges for the design of both the electrical system and the mechanical structure of wind turbines. An overview over the frequency of grid faults and the grid connection requirements in different relevant countries is done in this report. The most relevant study cases for the quantification of the loads ...

  7. Automated tools and techniques for distributed Grid Software: Development of the testbed infrastructure

    OpenAIRE

    Aguado Sanchez, C; Di Meglio, A

    2007-01-01

    Grid technology is becoming more and more important as the new paradigm for sharing computational resources across different organizations in a secure way. The great power of this solution requires the definition of a generic stack of services and protocols, and this is the scope of the different Grid initiatives. As a result of international collaborations for its development, the Open Grid Forum created the Open Grid Services Architecture (OGSA), which aims to define the common set of...

  8. Geospatial Applications on Different Parallel and Distributed Systems in enviroGRIDS Project

    Science.gov (United States)

    Rodila, D.; Bacu, V.; Gorgan, D.

    2012-04-01

    The execution of Earth Science applications and services on parallel and distributed systems has become a necessity, especially due to the large amounts of Geospatial data these applications require and the large geographical areas they cover. The parallelization of these applications addresses important performance issues and can range from task parallelism to data parallelism. Parallel and distributed architectures such as Grid, Cloud, Multicore, etc. offer the necessary functionalities to solve important problems in the Earth Science domain: storing, distribution, management, processing and security of Geospatial data, execution of complex processing through task and data parallelism, etc. A main goal of the FP7-funded project enviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is the development of a Spatial Data Infrastructure targeting this catchment region, but also the development of standardized and specialized tools for storing, analyzing, processing and visualizing the Geospatial data concerning this area. To achieve these objectives, enviroGRIDS deals with the execution of different Earth Science applications, such as hydrological models, Geospatial Web services standardized by the Open Geospatial Consortium (OGC) and others, on parallel and distributed architectures to maximize the obtained performance. This presentation analyses the integration and execution of Geospatial applications on different parallel and distributed architectures and the possibility of choosing among these architectures, based on application characteristics and user requirements, through a specialized component. Versions of the proposed platform have been used in the enviroGRIDS project on different use cases, such as: the execution of Geospatial Web services both on Web and Grid infrastructures [2] and the execution of SWAT hydrological models both on Grid and Multicore architectures [3]. The current
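
    The "specialized component" for choosing among architectures might be pictured as a simple rule-based selector. The following sketch is purely illustrative: the decision rules, thresholds, and names are assumptions for the example, not the actual enviroGRIDS logic.

```python
# Hypothetical sketch of an architecture selector: map coarse application
# characteristics to one of the execution targets named in the abstract.
# Rules and thresholds are invented for illustration.
def choose_architecture(data_gb, tasks, needs_shared_memory, elastic_budget):
    if needs_shared_memory:
        return "multicore"   # tight data parallelism on a single node
    if data_gb > 100 or tasks > 1000:
        return "grid"        # large batch workloads across institutions
    if elastic_budget:
        return "cloud"       # pay-per-use bursts for moderate jobs
    return "multicore"

# e.g. a SWAT-style model run with heavy input data would be routed to the Grid:
print(choose_architecture(data_gb=500, tasks=64,
                          needs_shared_memory=False, elastic_budget=True))
```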

  9. Research for Forming up the Controlling Diagram Utilizing the Connection of a LV Resistor on Voltage Transformer’s Open-Triangle Coil to Reduce over Voltage Caused by Earth Fault in 6 kV Grid of QuangNinh Underground Mines

    Directory of Open Access Journals (Sweden)

    Viet Bun Ho

    2018-01-01

    Single-phase earth faults in MV grids usually cause overvoltages that harm people and electric equipment. If the magnitude of the overvoltage is great enough, many of the grid's eco-technical parameters will be affected. The paper analyzes all possible consequences of overvoltage occurring in the 6 kV grid of QuangNinh underground mines. Based on the analysis, a controlling diagram utilizing the connection of a LV resistor on the voltage transformer's open-triangle coil to reduce overvoltage is recommended. The simulation results of the diagram are used to prove the effectiveness of the solution: the overvoltage magnitude is only in the range of (2.1–2.4)·Uf. Other advantages that the solution brings to the relay system are also pointed out.
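
    The reason a resistor on the open-triangle (open-delta) winding can act on earth faults is that the winding's output is the phasor sum of the three phase voltages, i.e. 3·V0: near zero in a healthy grid, and rising sharply during a single-phase earth fault. A minimal per-unit sketch of that phasor relationship (the damping resistor itself is not modeled; all values are illustrative):

```python
import cmath
import math

def phase_voltages(fault_on_a=False, uf=1.0):
    # Balanced source EMFs, per-unit, 120 degrees apart.
    ea = uf * cmath.exp(0j)
    eb = uf * cmath.exp(-2j * math.pi / 3)
    ec = uf * cmath.exp(2j * math.pi / 3)
    if not fault_on_a:
        return ea, eb, ec
    # Solid earth fault on phase A in an isolated-neutral grid: the neutral
    # is displaced by -Ea, so the faulted phase drops to zero and the
    # healthy phases rise toward line voltage.
    return ea - ea, eb - ea, ec - ea

def open_delta_voltage(va, vb, vc):
    # The open-triangle winding sums the three phase voltages: |3*V0|.
    return abs(va + vb + vc)

print(round(open_delta_voltage(*phase_voltages(False)), 6))  # ~0 in the healthy grid
print(round(open_delta_voltage(*phase_voltages(True)), 6))   # ~3.0*Uf during the fault
```

    This is why loading that winding with a LV resistor damps the zero-sequence circuit during a fault while leaving normal balanced operation essentially unaffected.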

  10. OpenVIVO: Transparency in Scholarship

    Directory of Open Access Journals (Sweden)

    Violeta Ilik

    2018-03-01

    OpenVIVO is a free and open-hosted semantic web platform that anyone can join and that gathers and shares open data about scholarship in the world. OpenVIVO, based on the VIVO open-source platform, provides transparent access to data about the scholarly work of its participants. OpenVIVO demonstrates the use of persistent identifiers, the automatic real-time ingest of scholarly ecosystem metadata, the use of VIVO-ISF and related ontologies, the attribution of work, and the publication and reuse of data—all critical components of presenting, preserving, and tracking scholarship. The system was created by a cross-institutional team over the course of 3 months. The team created and used RDF models for research organizations in the world based on Digital Science GRID data, for academic journals based on data from CrossRef and the US National Library of Medicine, and created a new model for attribution of scholarly work. All models, data, and software are available in open repositories.

  11. GridPP returns to CERN

    CERN Document Server

    Neasan O'Neill

    2011-01-01

    In early September, GridPP, the collaboration that manages the UK’s contribution to the worldwide LHC Computing Grid (wLCG), celebrated a decade of work by holding its twenty-seventh collaboration meeting at CERN.   Officially launched in September 2001, GridPP was one of the original partners in wLCG, funding much of the early work at CERN. Over the last decade GridPP has gone from a mere proposal to almost 30,000 CPUs working for researchers scattered across the globe. Twice a year, GridPP meets to discuss the progress and future plans of the community and this year, for the first time since 2004, decamped to CERN for this biannual meeting on the theme “GridPP in the International Context”. The main meeting was held over 2 days in the IT auditorium and was the perfect opportunity to have contributions from experts based at CERN, alongside those from within GridPP. Opening with a welcome from Frederic Hemmer, Head of the IT Department at CERN, the meeting began with...

  12. Use of the Repertory Grid for collaboration and reflection in a research context

    CSIR Research Space (South Africa)

    Alexander, P

    2010-09-01

    Full Text Available The Repertory Grid (RepGrid) technique has been used extensively in Management Sciences research, including Information Systems research, in order to reveal the personal views of individual research subjects regarding the issue being studied...

  13. Smart grid security

    Energy Technology Data Exchange (ETDEWEB)

    Cuellar, Jorge (ed.) [Siemens AG, Muenchen (Germany). Corporate Technology

    2013-11-01

    The engineering, deployment and security of the future smart grid will be an enormous project requiring the consensus of many stakeholders with different views on the security and privacy requirements, not to mention methods and solutions. The fragmentation of research agendas and proposed approaches or solutions for securing the future smart grid becomes apparent observing the results from different projects, standards, committees, etc., in different countries. The different approaches and views of the papers in this collection also witness this fragmentation. This book contains the following papers: 1. IT Security Architecture Approaches for Smart Metering and Smart Grid. 2. Smart Grid Information Exchange - Securing the Smart Grid from the Ground. 3. A Tool Set for the Evaluation of Security and Reliability in Smart Grids. 4. A Holistic View of Security and Privacy Issues in Smart Grids. 5. Hardware Security for Device Authentication in the Smart Grid. 6. Maintaining Privacy in Data Rich Demand Response Applications. 7. Data Protection in a Cloud-Enabled Smart Grid. 8. Formal Analysis of a Privacy-Preserving Billing Protocol. 9. Privacy in Smart Metering Ecosystems. 10. Energy Rate at Home: Leveraging ZigBee to Enable Smart Grid in Residential Environment.

  14. The R package 'icosa' for coarse resolution global triangular and penta-hexagonal gridding

    Science.gov (United States)

    Kocsis, Adam T.

    2017-04-01

    With the development of the internet and the computational power of personal computers, open source programming environments have become indispensable for science in the past decade. This includes the increase of the GIS capacity of the free R environment, which was originally developed for statistical analyses. The flexibility of R made it a preferred programming tool in a multitude of disciplines in the biological and geological sciences. Many of these subdisciplines operate with incidence (occurrence) data that in many cases must be grained before further analyses can be conducted. This graining is executed mostly by gridding data to cells of a Gaussian grid of various resolutions to increase the density of data in a single unit of the analyses. Despite the ease of its application, this method has obvious shortcomings: well-known systematic biases are induced in cell sizes and shapes that can interfere with the results of statistical procedures, especially if the number of incidence points influences the metrics in question. The 'icosa' package employs a common method to overcome this obstacle by implementing grids with roughly equal cell sizes and shapes that are based on tessellated icosahedra. These grid objects are essentially polyhedra with xyz Cartesian vertex data that are linked to tables of faces and edges. At its current developmental stage, the package uses a single method of tessellation, which balances grid cell size and shape distortions, but its structure allows the implementation of various other types of tessellation algorithms. The resolution of the grids can be set by the number of breakpoints inserted into a segment forming an edge of the original icosahedron. Both the triangular and their inverted penta-hexagonal grids are available for creation with the package. The package also incorporates functions to look up coordinates in the grid very effectively and data containers to link data to the grid structure. The
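
    The tessellation idea can be sketched independently of the package: build the 12 icosahedron vertices, recover its 30 edges and 20 faces from adjacency, subdivide each face barycentrically, and project the points onto the unit sphere. This is an illustrative reimplementation, not the 'icosa' code; for an edge frequency f it reproduces the expected 10f²+2 vertex count.

```python
import itertools
import math

# The 12 icosahedron vertices: cyclic permutations of (0, +/-1, +/-phi).
PHI = (1 + math.sqrt(5)) / 2
VERTS = ([(0.0, a, b * PHI) for a in (-1.0, 1.0) for b in (-1.0, 1.0)]
         + [(a, b * PHI, 0.0) for a in (-1.0, 1.0) for b in (-1.0, 1.0)]
         + [(b * PHI, 0.0, a) for a in (-1.0, 1.0) for b in (-1.0, 1.0)])

def d2(p, q):
    return sum((a - b) ** 2 for a, b in zip(p, q))

# Edges are the vertex pairs at the minimal squared distance (exactly 4 here);
# faces are the triangles whose three sides are all edges.
edges = {frozenset((i, j)) for i, j in itertools.combinations(range(12), 2)
         if abs(d2(VERTS[i], VERTS[j]) - 4) < 1e-9}
faces = [t for t in itertools.combinations(range(12), 3)
         if all(frozenset(p) in edges for p in itertools.combinations(t, 2))]

def normalize(p):
    n = math.sqrt(sum(c * c for c in p))
    return tuple(round(c / n, 9) for c in p)  # rounding dedups shared edge points

def subdivide(freq):
    """Split every face with `freq` segments per edge (the 'breakpoints'
    mentioned in the abstract) and return the unique points on the sphere."""
    pts = set()
    for a, b, c in faces:
        A, B, C = (VERTS[v] for v in (a, b, c))
        for i in range(freq + 1):
            for j in range(freq + 1 - i):
                k = freq - i - j
                p = tuple((i * A[t] + j * B[t] + k * C[t]) / freq for t in range(3))
                pts.add(normalize(p))
    return pts

print(len(edges), len(faces))                # 30 20
print(len(subdivide(1)), len(subdivide(4)))  # 12 162, i.e. 10*f**2 + 2
```

    Taking the dual of such a triangular grid yields the inverted penta-hexagonal grid the abstract mentions: 12 pentagons at the original vertices and hexagons everywhere else.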

  15. Family Open House

    Science.gov (United States)

    Join us for an afternoon of science fun. The Fermilab Family Open House is a party for children of all ages to learn about the world of physics. Check out our YouTube video to learn more! Explore physics concepts with hands-on

  16. National transmission grid study

    Energy Technology Data Exchange (ETDEWEB)

    Abraham, Spencer [USDOE Office of the Secretary of Energy, Washington, DC (United States)

    2003-05-31

    The National Energy Policy Plan directed the U.S. Department of Energy (DOE) to conduct a study to examine the benefits of establishing a national electricity transmission grid and to identify transmission bottlenecks and measures to address them. DOE began by conducting an independent analysis of U.S. electricity markets and identifying transmission system bottlenecks using DOE’s Policy Office Electricity Modeling System (POEMS). DOE’s analysis, presented in Section 2, confirms the central role of the nation’s transmission system in lowering costs to consumers through increased trade. More importantly, DOE’s analysis also confirms the results of previous studies, which show that transmission bottlenecks and related transmission system market practices are adding hundreds of millions of dollars to consumers’ electricity bills each year. A more detailed technical overview of the use of POEMS is provided in Appendix A. DOE led an extensive, open, public input process and heard a wide range of comments and recommendations that have all been considered. More than 150 participants registered for three public workshops held in Detroit, MI (September 24, 2001); Atlanta, GA (September 26, 2001); and Phoenix, AZ (September 28, 2001).

  17. Using CyberShake Workflows to Manage Big Seismic Hazard Data on Large-Scale Open-Science HPC Resources

    Science.gov (United States)

    Callaghan, S.; Maechling, P. J.; Juve, G.; Vahi, K.; Deelman, E.; Jordan, T. H.

    2015-12-01

    migrating from file-based communication to MPI messaging, to greatly reduce the I/O demands and node-hour requirements of CyberShake. We will also present performance metrics from CyberShake Study 15.4, and discuss challenges that producers of Big Data on open-science HPC resources face moving forward.

  18. National power grid simulation capability : need and issues

    Energy Technology Data Exchange (ETDEWEB)

    Petri, Mark C. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2009-06-02

    On December 9 and 10, 2008, the Department of Homeland Security (DHS) Science and Technology Directorate sponsored a national workshop at Argonne National Laboratory to explore the need for a comprehensive modeling and simulation capability for the national electric power grid system. The workshop brought together leading electric power grid experts from federal agencies, the national laboratories, and academia to discuss the current state of power grid science and engineering and to assess if important challenges are being met. The workshop helped delineate gaps between grid needs and current capabilities and identify issues that must be addressed if a solution is to be implemented. This report is a result of the workshop and highlights power grid modeling and simulation needs, the barriers that must be overcome to address them, and the benefits of a national power grid simulation capability.

  19. Building Grid applications using Web Services

    CERN Multimedia

    CERN. Geneva

    2004-01-01

    There has been a lot of discussion within the Grid community about the use of Web Services technologies in building large-scale, loosely-coupled, cross-organisation applications. In this talk we are going to explore the principles that govern Service-Oriented Architectures and the promise of Web Services technologies for integrating applications that span administrative domains. We are going to see how existing Web Services specifications and practices could provide the necessary infrastructure for implementing Grid applications.

    Biography: Dr. Savas Parastatidis is a Principal Research Associate at the School of Computing Science, University of Newcastle upon Tyne, UK. Savas is one of the authors of the "Grid Application Framework based on Web Services Specifications and Practices" document that was influential in the convergence between Grid and Web Services and the move away from OGSI (more information can be found at http://www.neresc.ac.uk/ws-gaf). He has done research on runtime support for distributed-m...

  20. Developing a grid infrastructure in Cuba

    Energy Technology Data Exchange (ETDEWEB)

    Lopez Aldama, D.; Dominguez, M.; Ricardo, H.; Gonzalez, A.; Nolasco, E.; Fernandez, E.; Fernandez, M.; Sanchez, M.; Suarez, F.; Nodarse, F.; Moreno, N.; Aguilera, L.

    2007-07-01

    A grid infrastructure was deployed at the Centro de Gestion de la Informacion y Desarrollo de la Energia (CUBAENERGIA) in the frame of the EELA project and of a national initiative for developing a Cuban Network for Science. A stand-alone model was adopted to overcome connectivity limitations. The e-infrastructure is based on gLite-3.0 middleware and is fully compatible with the EELA infrastructure. Afterwards, the work focused on grid applications. The application GATE was deployed from the very beginning for biomedical users. Further, two applications were deployed on the local grid infrastructure: MOODLE for e-learning and AERMOD for assessment of local dispersion of atmospheric pollutants. Additionally, our local grid infrastructure was made interoperable with a Java-based distributed system for bioinformatics calculations. This experience could be considered a suitable approach for national networks with weak Internet connections. (Author)